Taxonomy of Test Oracles
Mudit Agrawal
Course: Introduction to Software Testing, Fall ‘06
Motivation

- Shortcomings of automated tests (as compared to a human ‘eye-ball oracle’)
- Factors influencing test oracles:
  - Environmental (private, global) variables
  - State (initial and present) of the program
  - Dependence on the test case
- What kind of oracle(s) are best suited for which applications?

Contents

- Automated tests – challenges for test oracles
- Generations of automation and oracles
- Challenges for oracles
- Types of oracles
- Conclusion

Automated Tests

- Advantages:
  - No intervention needed after launching tests
  - Automatically sets up and/or records the relevant test environment
  - Evaluates actual against expected results
  - Reports an analysis of pass/fail
- Limitations:
  - Less likely to cover latent defects
  - Doesn’t do anything different each time it runs
  - The oracle is not ‘complete’

Limited by the Test Oracle

- ‘Predicting’ and comparing the results
- Like playing twenty questions with all the questions written down in advance!
- Has to be influenced by:
  - Data
  - Program state
  - Configuration of the system environment

Testing with the Oracle

[Diagram of testing with an oracle. Source: Douglas Hoffman, SQM, LLC]

Generations of Automation:
First Generation Automation

- Automate existing tests by creating equivalent exercises
- Self-verifying tests
- Hard-coded oracles (see the sketch below)
- Limitations:
  - Handling negative test cases

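A minimal sketch of such a first-generation, self-verifying test, with the oracle hard-coded as a table of expected outputs (the function under test, add_tax, is hypothetical):

    # First-generation automation: a self-verifying test whose oracle is a
    # hard-coded table of expected results. It can only judge the inputs it
    # was written for; a negative case (e.g. invalid input) would need its
    # failure behaviour spelled out in advance as well.

    def add_tax(price):
        return round(price * 1.08, 2)   # stand-in implementation

    def test_add_tax():
        cases = [(100.00, 108.00), (0.00, 0.00), (19.99, 21.59)]
        for price, expected in cases:
            actual = add_tax(price)
            assert actual == expected, f"add_tax({price}) = {actual}, expected {expected}"

    test_add_tax()
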
Second Generation Automation

- Automated oracles
- Emphasis on expected results – more exhaustive input variation and coverage
- Increasing:
  - Frequency
  - Intensity
  - Duration of automated test activities (load testing)

Second Generation continued…

- Random selection among alternatives (sketched below):
  - Partial domain coverage
  - Dependence of system state on previous test cases
  - A mechanism to determine whether the SUT’s behavior is expected
  - Pseudo-random number generator
  - Test recovery

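A sketch of these second-generation ingredients, assuming a toy stack as the SUT: a seeded pseudo-random generator selects operations (so a failing run can be replayed), a reference model decides whether the SUT’s behavior is expected, and a reset provides crude test recovery:

    import random

    rng = random.Random(42)          # fixed seed => the random run is reproducible
    stack, model = [], []            # hypothetical SUT and a trivial reference model

    for step in range(1000):
        try:
            if rng.random() < 0.6:               # random selection among alternatives
                v = rng.randint(0, 99)
                stack.append(v)
                model.append(v)
            elif model:
                assert stack.pop() == model.pop(), "unexpected pop result"
            # Oracle: the SUT must track the model. Note how the system state
            # depends on all previous test cases in the run.
            assert stack == model, f"state diverged at step {step}"
        except AssertionError as exc:
            print(f"step {step}: {exc}")
            stack, model = [], []    # test recovery: reset to a known state
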
Third Generation Automation

- Takes knowledge and visibility into account
- Software instrumentation
- Multi-threaded tests
- Fall-back compares:
  - Using other oracles

Third Generation continued…

- Heuristic oracles (see the sketch below):
  - Fuzzy comparisons, approximations
- Diagnostics:
  - Looks for errors
  - Performs additional tests based on the specific type of error encountered

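A sketch of a heuristic oracle along these lines: a fuzzy comparison accepts results within a tolerance, and a small diagnostic step probes nearby inputs when a mismatch is found (compute_area is a hypothetical SUT):

    import math

    def compute_area(r):
        return 3.14159 * r * r       # stand-in implementation

    def fuzzy_check(r, rel_tol=1e-4):
        expected = math.pi * r * r
        # Fuzzy comparison: approximate, not exact, equality.
        if math.isclose(compute_area(r), expected, rel_tol=rel_tol):
            return True
        # Diagnostics: run additional probes near the failing input to see
        # whether the error is isolated or systematic.
        bad = [x for x in (r * 0.99, r * 1.01)
               if not math.isclose(compute_area(x), math.pi * x * x, rel_tol=rel_tol)]
        print(f"mismatch at r={r}; nearby failures: {bad}")
        return False

    fuzzy_check(2.0)
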
Challenges for Oracles

- Independence and completeness – difficult to achieve both
- Independence from:
  - Algorithms
  - Sub-programs, libraries
  - Platforms
  - Operating systems
- Completeness in the form of information
- Comparing computed functions, screen navigations, and asynchronous event handling
- Speed of predictions
- Execution time of the oracle

Challenges for Oracles continued…

- The better an oracle is, the more complex it becomes
- Comprehensive oracles make for long test cases (DART paper)
- The more it predicts, the more dependent it is on the SUT
  - And the more likely it is to contain the same fault

Challenges for Oracles continued…

- Legitimate oracle – an oracle that produces accurate rather than estimated outputs
  - Generates results based on the formal specs
  - But what if the formal specs are wrong?
- Very few errors cause noticeable abnormal test termination

[Diagram. Source: “An automated oracle for software testing”]

IORL – Input/Output Requirements Language

- A graphics-based language
- An optimal representation between the informal requirements and the target code

Oracles for different scenarios

- Transducers:
  - Read an input sequence and produce an output sequence
  - Logical correspondence between I/O structures
  - e.g. native-file-format-to-HTML conversions in web applications
- Solution – CFGs:
  - A system translates a formal spec of the I/O files into an automated oracle (sketched below)

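A toy version of such a transducer oracle, with the structural check hand-coded rather than generated from a grammar (to_html is a hypothetical SUT; a real system would derive the oracle from CFGs of both file formats):

    from html.parser import HTMLParser

    def to_html(records):                      # hypothetical SUT: native format -> HTML
        items = "".join(f"<li>{name}</li>" for name in records)
        return f"<ul>{items}</ul>"

    class ItemCollector(HTMLParser):
        # Oracle side: parse the output back into a structure that can be
        # compared against the input for logical correspondence.
        def __init__(self):
            super().__init__()
            self.items, self._in_li = [], False
        def handle_starttag(self, tag, attrs):
            self._in_li = (tag == "li")
        def handle_data(self, data):
            if self._in_li:
                self.items.append(data)

    records = ["alpha", "beta", "gamma"]
    parser = ItemCollector()
    parser.feed(to_html(records))
    assert parser.items == records, "input/output structures do not correspond"
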
Embedded Assertion Languages
[Oracles for different scenarios]

- Asserts!
- Problems:
  - Non-local assertions:
    - Asserts for pre/post-condition pairs with a procedure as a whole
    - e.g. asserts for each method that modifies the object state
  - State caching:
    - Saving parts or all of the ‘before’ values (sketched below)

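A sketch of both ideas, assuming a hypothetical contract decorator: the pre/post-condition pair wraps the method as a whole, and the ‘before’ state is cached so the postcondition can refer to old values:

    import copy, functools

    def contract(pre, post):
        def wrap(method):
            @functools.wraps(method)
            def checked(self, *args):
                assert pre(self, *args), "precondition violated"
                before = copy.deepcopy(self)   # state caching: save the 'before' values
                result = method(self, *args)
                assert post(self, before, *args), "postcondition violated"
                return result
            return checked
        return wrap

    class Account:                             # hypothetical SUT
        def __init__(self, balance):
            self.balance = balance

        @contract(pre=lambda self, amt: amt > 0,
                  post=lambda self, old, amt: self.balance == old.balance - amt)
        def withdraw(self, amt):
            self.balance -= amt

    Account(100).withdraw(30)                  # both assertions pass
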
Embedded Assertion Languages continued…
[Oracles for different scenarios]

- Auxiliary variables
- Quantification (see the sketch below)

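A sketch of quantified assertions together with an auxiliary variable, i.e. state kept only for the specification’s benefit (SortedBag is hypothetical):

    class SortedBag:
        def __init__(self):
            self.items = []
            self.op_count = 0        # auxiliary variable: exists only for the spec

        def insert(self, x):
            self.items.append(x)
            self.items.sort()
            self.op_count += 1
            # Universal quantification: every adjacent pair is ordered.
            assert all(a <= b for a, b in zip(self.items, self.items[1:]))
            # Existential quantification: the inserted element is present.
            assert any(item == x for item in self.items)

    bag = SortedBag()
    for v in (3, 1, 2):
        bag.insert(v)
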
Extrinsic Interface Contracts
[Oracles for different scenarios]

- Instead of inserting asserts within the program, checkable specs are kept separate from the implementation
- Extrinsic specs are written in their own notations:
  - Less tightly coupled with the target programming language
- Useful when the source code need not be touched (sketched below)

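A sketch of an extrinsic contract: the checkable spec lives apart from the implementation and is attached by wrapping from the outside, so the target’s source is never edited (sqrt_spec and with_contract are hypothetical names):

    import math

    def sqrt_spec(x, result):
        # Separate, checkable specification for a square-root routine.
        return x >= 0 and math.isclose(result * result, x, rel_tol=1e-9)

    def with_contract(func, spec):
        def checked(x):
            result = func(x)
            assert spec(x, result), f"contract violated for input {x}"
            return result
        return checked

    checked_sqrt = with_contract(math.sqrt, sqrt_spec)   # implementation untouched
    checked_sqrt(2.0)
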
Pure Specification Languages
[Oracles for different scenarios]

- Problem with the older approaches: the specs were not pure
- Z and Object-Z are model-based specification languages:
  - Describe intended behavior using familiar mathematical objects
  - Free of the constraints of the implementation language

Trace Checking
[Oracles for different scenarios]

- Uses a partial trace of events
- Such a trace can be checked by an oracle derived from formal specs of the externally observable behavior (sketched below)

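A sketch of a trace checker, with the observable-behavior spec written as a small finite-state machine (the connection protocol below is hypothetical):

    SPEC = {                         # state -> {event: next state}
        "closed": {"open": "open"},
        "open":   {"send": "open", "close": "closed"},
    }

    def check_trace(trace, start="closed"):
        # Oracle derived from the spec: replay the partial event trace and
        # flag the first transition the spec does not allow.
        state = start
        for i, event in enumerate(trace):
            if event not in SPEC[state]:
                return f"illegal event '{event}' in state '{state}' at position {i}"
            state = SPEC[state][event]
        return "trace conforms to spec"

    print(check_trace(["open", "send", "send", "close"]))   # conforms
    print(check_trace(["send"]))                            # 'send' before 'open'
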
Types of Oracles
Categorized based on oracle outputs

- True oracle:
  - Faithfully reproduces all relevant results
  - Uses an independent platform, processes, compilers, code, etc.
  - Less commonality => more confidence in the correctness of results
  - e.g. sin(x) [problem: how is ‘all inputs’ defined?] (sketched below)

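A sketch of a true oracle for sin(x): an independently coded Taylor-series routine checks the library function, on the idea that the less the two computations share, the more a match means. (In practice the oracle would also run on an independent platform; this is a single-process illustration.)

    import math

    def sin_oracle(x, terms=15):
        # Independent computation of sin(x) via its Taylor series.
        x = math.fmod(x, 2 * math.pi)          # range reduction
        total, term = 0.0, x
        for n in range(terms):
            total += term
            term *= -x * x / ((2 * n + 2) * (2 * n + 3))
        return total

    for x in (0.0, 0.5, 1.0, 2.5, -1.2):
        assert math.isclose(math.sin(x), sin_oracle(x), abs_tol=1e-9), x
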
Types of Oracles continued…

- Stochastic oracle:
  - Statistically random input selection
  - Error-prone areas of the software are no more or less likely to be encountered
  - sin() – a pseudo-random generator selects the input values (sketched below)

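A sketch of a stochastic oracle for sin(): a seeded pseudo-random generator selects the inputs, and each output is checked against a property (here the identity sin(x)^2 + cos(x)^2 = 1, an assumption standing in for a full reference computation):

    import math, random

    rng = random.Random(2006)              # seed makes the random run replayable

    for _ in range(10_000):
        x = rng.uniform(-1e6, 1e6)         # statistically random input selection
        lhs = math.sin(x) ** 2 + math.cos(x) ** 2
        assert math.isclose(lhs, 1.0, abs_tol=1e-12), f"identity fails at x={x}"
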
Types of Oracles continued…

- Heuristic oracle:
  - Reproduces selected results for the SUT
  - Remaining values are checked based on heuristics
  - No exact comparison
- Sampling:
  - Values are selected using some criteria (not at random)
  - e.g. boundary values, midpoints, maxima, minima (sketched below)

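A sketch of heuristic checking with sampling: a few inputs chosen by criteria (boundaries, a midpoint) are compared exactly, while the rest of the domain is checked only by a heuristic, here monotonicity (c_to_f is a hypothetical SUT):

    def c_to_f(c):
        return c * 9 / 5 + 32

    # Sampling: values selected by criteria, not at random.
    exact_cases = {-40: -40, 0: 32, 50: 122, 100: 212}   # boundaries and midpoint
    for c, f in exact_cases.items():
        assert c_to_f(c) == f

    # Heuristic for the remaining values: no exact comparison, just the
    # expectation that the conversion is strictly increasing.
    outputs = [c_to_f(c) for c in range(-40, 101)]
    assert all(a < b for a, b in zip(outputs, outputs[1:]))
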
Types of Oracles continued…

- Consistent oracle:
  - Uses results from one test run as the oracle for subsequent runs
  - Evaluates the effects of changes from one revision to another (sketched below)

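A sketch of a consistent oracle as a golden-file check: the first run records its output, and later runs are compared against that recording to surface the effects of a revision (generate_report and the file name are hypothetical):

    import json, pathlib

    GOLDEN = pathlib.Path("report.golden.json")

    def generate_report():                     # stand-in SUT
        return {"rows": 3, "total": 42}

    result = generate_report()
    if not GOLDEN.exists():
        # First run: record the results; they become the oracle.
        GOLDEN.write_text(json.dumps(result, sort_keys=True))
    else:
        # Subsequent runs: any difference flags a change between revisions.
        expected = json.loads(GOLDEN.read_text())
        assert result == expected, f"output changed since the last run: {result}"
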
Comparison

Who wants to be a Millionaire?

Which category best describes GUI testing?

a. Heuristic
b. Trace checking
c. Transducers
d. None

Thanks!