Validation & Verification
Tom Kelliher, CS 319
Sept. 25, 1998
Announcements: project requirements discussion.
Outline:
- The testing process.
- Test planning.
- Testing strategies.
Assignment: read Chapter 5.
- Validation: Are we building the right product?
- Verification: Are we building the product right?
- The difference: consider a recipe for chocolate chip cookies with
walnuts. We may follow the recipe exactly (verification), yet dislike
the result because we don't like walnuts (validation): the product was
built right, but it isn't the right product.
Terminology:
- Static V & V: structured reviews of system representations
(documents, source code); formal methods also fall here. Can be applied
at all stages of the process.
- Dynamic V & V: exercising an implementation. Applies only to a
prototype or the final system. The predominant method.
- Statistical testing: select tests which reflect typical usage
patterns; use the results to estimate system reliability and
performance (a sketch appears after this list).
- Defect testing: Select tests which reflect the requirements
specification. Uncover deficiencies in the software.
- Testing vs. debugging: testing establishes the presence of defects;
debugging locates and repairs them.
- Regression testing: after fixing a defect, re-run the entire test
suite to confirm the fix hasn't broken anything that previously worked.
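Below, a minimal sketch of statistical testing in C. The component
under test (clamp_reading), its legal range, and the 90/10 usage
profile are all assumptions invented for illustration; the idea is that
test inputs are drawn from the expected usage distribution and
reliability is estimated from the observed failure rate against a
trusted oracle.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Hypothetical component under test: clamp a sensor reading to
       its legal range (0..1023). */
    static int clamp_reading(int r)
    {
        if (r < 0)
            return 0;
        if (r > 1023)
            return 1023;
        return r;
    }

    /* Trusted oracle used to judge each result; in practice this is
       the spec or a previously trusted version, not the code itself. */
    static int oracle(int r)
    {
        return r < 0 ? 0 : (r > 1023 ? 1023 : r);
    }

    int main(void)
    {
        const int trials = 10000;
        int failures = 0;

        srand((unsigned) time(NULL));
        for (int i = 0; i < trials; i++) {
            /* Usage profile: 90% in-range readings, 10% glitches. */
            int r = (rand() % 100 < 90) ? rand() % 1024
                                        : rand() % 4000 - 2000;
            if (clamp_reading(r) != oracle(r))
                failures++;
        }
        printf("estimated reliability: %.4f\n",
               1.0 - (double) failures / trials);
        return 0;
    }

The weighting matters: a defect in a rarely exercised glitch path
lowers measured reliability far less than one on the common path, which
is exactly what statistical testing is meant to capture.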
Stages of the testing process:
- Unit testing: independent testing of individual components (a sketch
appears after this list).
- Module testing: a module is a collection of dependent components,
such as a class or a set of related functions. A module can be tested
independently of other modules.
- Sub-system testing: sub-systems may be independently designed and
implemented, so the errors found here are predominantly interface
errors. Integrate modules into sub-systems, test, and push on the
interfaces.
- System testing: integrate the sub-systems. Look for erroneous
interactions between sub-systems and across the whole system; validate
against the requirements.
- Acceptance testing: the system is tested with procurer-supplied data
rather than synthesized data. May reveal defects in the requirements
spec; provides additional requirements validation. Also known as alpha
testing.
Beta testing is used for generic software: deliver an early version to
prospective customers. Some vendors have elevated this to an art form.
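A minimal sketch of a unit test in C, using assert to exercise one
small module (a fixed-size integer stack, invented here for
illustration) in complete isolation from the rest of a system:

    #include <assert.h>
    #include <stdio.h>

    /* Module under test: a fixed-size integer stack. */
    #define CAP 8
    static int data[CAP], top = 0;

    static int push(int v) { return top < CAP ? (data[top++] = v, 1) : 0; }
    static int pop(int *v) { return top > 0 ? (*v = data[--top], 1) : 0; }

    /* Unit test: exercise the module alone, covering normal use and
       both boundary cases (underflow, overflow). */
    int main(void)
    {
        int v;

        assert(!pop(&v));             /* underflow is rejected */
        assert(push(1) && push(2));
        assert(pop(&v) && v == 2);    /* LIFO order */
        assert(pop(&v) && v == 1);

        for (int i = 0; i < CAP; i++) /* fill to capacity */
            assert(push(i));
        assert(!push(99));            /* overflow is rejected */

        printf("stack unit tests passed\n");
        return 0;
    }

The same asserts, kept and re-run after every defect fix, become the
regression suite mentioned above.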
Testing is expensive and time-consuming; plan for slippage.
Components of a test plan:
- The testing process: overview of the test process.
- Requirements traceability: how are the requirements being validated?
- Test items: which products of the software process are being tested?
- Testing schedule: overall testing schedule and resource allocation.
- Test recording procedures: record results, allow for audits.
- Hardware and software requirements: software tools needed and
expected hardware utilization.
- Constraints: anticipate any possible testing constraints.
(Figure: the relationship between the software process, test plans,
and testing.)
Mix and match strategies. Test incrementally.
- Top-Down testing:
- Start with the most abstract components and work downward. Use
with top-down design.
- Gives early validation and an early prototype; psychological and
managerial advantages.
- Requires stubs for the unimplemented lower levels. How do you
generate the items a stub should return? (See the sketch after this
list.)
- Upper levels may not produce (testable) output.
- Bottom-Up testing:
- Start at the bottom and work upward. Use with object-oriented
design and reusable components.
- Architectural flaws may not be discovered until late, which may
force a re-write of already-implemented components.
- Requires test drivers; drivers can be delivered along with the
system to support reuse. (Also sketched after this list.)
- Thread testing:
- Test the thread/communication pathways within a system;
particularly useful for real-time, event-driven systems.
- Sequence: a single event, multiple events of a single type, then
multiple events of various types. The incremental approach again.
- Stress testing:
- Determine whether the system can handle its intended load (and
beyond!).
- Determine system failure behavior.
- May uncover hidden flaws.
- Extremely useful for distributed systems.
- Back-to-Back testing: test different versions of a system against
each other on the same inputs; diff the results.
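Below, a minimal sketch in C of the stub (top-down) and driver
(bottom-up) machinery mentioned above; all names are invented for
illustration. The stub's canned return value shows the top-down problem
of deciding what a lower level should plausibly return, and main acts
as the throwaway driver a bottom-up tester would write (and could ship
with a reusable component):

    #include <assert.h>
    #include <stdio.h>

    /* Stub standing in for an unimplemented lower-level component. */
    static int read_sensor(void)
    {
        return 42;    /* canned, "plausible" value -- choosing it is
                         the hard part */
    }

    /* Real upper-level component under test (top-down). */
    static int above_threshold(int threshold)
    {
        return read_sensor() > threshold;
    }

    /* Test driver (bottom-up): exercises the components directly. */
    int main(void)
    {
        assert(above_threshold(10));  /* 42 > 10, via the stub */
        assert(!above_threshold(50)); /* 42 is not > 50 */
        printf("stub/driver tests passed\n");
        return 0;
    }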
Discussion questions:
- How do you test?
- Is validation difficult?
- Why is top-down testing not useful for object-oriented systems?
- Why is regression testing necessary?
- Can a programmer objectively test their own code?
- Consider the ethics of testing until the test budget is exhausted,
then delivering the product.