CrowdOracles: Can the Crowd Solve the Oracle Problem?

Pastore, Fabrizio; Mariani, Leonardo; Fraser, Gordon
2013

Abstract

Despite recent advances in test generation, fully automatic software testing remains a dream: ultimately, any generated test input depends on a test oracle that determines correctness, and, except for generic properties such as "the program shall not crash", such oracles require human input in one form or another. Crowdsourcing is a technique that has recently become popular for automating computations that cannot be performed by machines, but only by humans: a problem is split into small chunks that are then solved by a crowd of users on the Internet. In this paper we investigate whether crowdsourcing can be exploited to solve the oracle problem: we produce tasks asking users to evaluate CrowdOracles, assertions that reflect the current behavior of the program. If the crowd determines that an assertion does not match the behavior described in the code documentation, then a bug has been found. Our experiments demonstrate that CrowdOracles are a viable solution to automating the oracle problem, yet taming the crowd to get useful results is a difficult task.
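
To make the idea in the abstract concrete, the following is a minimal hypothetical sketch of what a CrowdOracle task reduces to. All names (CrowdOracleSketch, size) and the seeded off-by-one bug are invented for illustration and are not taken from the paper; the actual subjects and tooling used in the study differ.

// Run with: java -ea CrowdOracleSketch  (-ea enables assertions)
public class CrowdOracleSketch {

    /**
     * Documentation shown to the crowd worker:
     * "Returns the number of elements currently stored in the buffer."
     */
    static int size(int[] buffer, int used) {
        return used + 1; // hypothetical off-by-one bug
    }

    public static void main(String[] args) {
        int[] buffer = new int[10];
        int used = 3;
        // A test generator records the CURRENT behavior of the code
        // under test as an assertion, so this assertion passes:
        assert size(buffer, used) == 4;
        // A crowd worker comparing the asserted value (4) against the
        // documentation ("number of elements currently stored", i.e. 3)
        // would flag the assertion as inconsistent, revealing the bug.
        System.out.println("observed size = " + size(buffer, used));
    }
}

The key design point the sketch captures is that the assertion encodes observed behavior, not expected behavior; correctness is judged by humans against the documentation, which is what turns the crowd into an oracle.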
Type: paper
Keywords: crowdsourcing, oracle, testing, test case generation
Language: English
Conference: International Conference on Software Testing, Verification and Validation (ICST), 2013
Proceedings: Proceedings of the International Conference on Software Testing, Verification and Validation (ICST)
ISBN: 978-1-4673-5961-0
Publication year: 2013
Pages: 342-351
Access: open
Citation: Pastore, F., Mariani, L., Fraser, G. (2013). CrowdOracles: Can the Crowd Solve the Oracle Problem? In Proceedings of the International Conference on Software Testing, Verification and Validation (ICST) (pp. 342-351). doi:10.1109/ICST.2013.13
Files in this product:
cameraReady.pdf (open access, Adobe PDF, 240.61 kB)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/42638
Citations
  • Scopus: 81
  • Web of Science (ISI): 52