
Software Engineering
Lecture #17: Testing

Questions
• What is done during testing?
• Who does the testing?
• What are the different types of testing?

Black Box Testing
• Input → BLACK BOX (contents unknown) → Output = func(input)
• Test if the observed output matches the expected output.
• Issue: is it possible to test ALL possible inputs?
• Without knowledge of the internal workings it may be difficult to come up with good, representative test cases.
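A minimal sketch of the black-box idea in C: the tester works only from the specified input/output behaviour, not from the code. The function under test and its expected values are illustrative assumptions, not from the lecture.

    #include <stdio.h>

    /* Component under test, treated as a black box: all the tester knows
       is that it should return the absolute value of its argument. */
    int abs_value(int x)
    {
        return x < 0 ? -x : x;
    }

    int main(void)
    {
        /* Representative test cases chosen from the specification alone. */
        struct { int input, expected; } cases[] = { {5, 5}, {-5, 5}, {0, 0} };
        int failures = 0;

        for (int i = 0; i < 3; i++) {
            int observed = abs_value(cases[i].input);
            if (observed != cases[i].expected) {
                printf("input %d: expected %d, got %d\n",
                       cases[i].input, cases[i].expected, observed);
                failures++;
            }
        }
        return failures;
    }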


White Box Testing
• Input → WHITE BOX (the code is visible) → Output = func(input)
• Use the structure of the code to test it in different ways.
• Issue: is it possible to test ALL possible paths, statements, ...?

The Testing Process
• Component-1 code → Unit test
• Component-2 code → Unit test
• ...
• Component-n code → Unit test
• Tested components → Integration tests → integrated modules

The Testing Process (cont'd)
• Each stage is driven by its own inputs: design specs, function reqs, performance reqs, customer reqs, user environment.
• Integrated modules → Function Test → functioning system
• Functioning system → Performance Test → verified, validated system
• Verified, validated system → Acceptance Test → accepted system
• Accepted system → Installation Test (in the user environment)

Test Plan
• Q: Why is it necessary to create a test plan?

Test Plan
• Three sections:
  • Summary Information
  • Test Descriptions
  • Test Analysis (the results)

Summary Information (2 pages)
• 1. System description (one paragraph)
• 2. Major tests (description of strategy + important things to be tested)
  • unit tests **
  • integration tests **
  • functional tests **
  • performance tests
  • acceptance tests
  • installation tests
• 3. Traceability matrix

Test Descriptions (1 page each)
For each test:
• 1: Requirements/functions tested
• 2: Methods
  - planned strategy for testing, i.e. coverage of inputs, paths, statements, etc., with reasons
  - manual/automatic testing
• 3: Test procedures (note: there may be many test cases)
  - steps for performing the tests
  - inputs and expected outputs

Test Analysis (1 page)
TEST CASE     PASS/FAIL   FAULT DESCRIPTION       SEVERITY
Test-1        PASS
  subtest1    PASS
  subtest2    FAIL        hangs the system        MAJOR
  subtest3    PASS
Test-2        PASS
  subtest1    FAIL        error message pops up   MINOR
  subtest..   PASS
Total # failed
Total # passed

Testing Plan (cont'd)
• Should also contain
  • schedule for testing
  • resources needed for testing


Unit Testing
• Statement testing
  • each statement is executed at least once in some test
• Branch testing
  • each outcome of each decision point is exercised at least once
• Path testing
  • each distinct path is tested at least once
• Definition-use path testing
  • every path from a definition of a variable to every use of that definition is tested
• Equivalence classes
  • partition the inputs into classes and test at least one representative of each class

Example

    sum = 0;
    for (i = 0; i <= 100; i++) {
        sum = sum + a*i;
        if (sum > 100) break;
    }
    printf("%d\n", sum);

Statement tests? Branch tests? Path tests? (A runnable version follows below.)
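A self-contained version of the fragment above (the variable a is given an assumed value, since the slide leaves it unspecified), with comments on what each coverage criterion asks for:

    #include <stdio.h>

    int main(void)
    {
        int a = 3;    /* assumed value; the slide leaves 'a' undefined */
        int sum = 0;
        int i;

        for (i = 0; i <= 100; i++) {
            sum = sum + a * i;
            if (sum > 100)      /* the decision point inside the loop */
                break;
        }
        printf("%d\n", sum);

        /* Statement coverage: one run with a > 0 executes every statement,
           because the break eventually fires.
           Branch coverage: the 'if' must be both false (early iterations)
           and true (the breaking iteration); a = 0 would never take it.
           Path coverage: every possible number of iterations before the
           break is a distinct path, so testing all paths is impractical. */
        return 0;
    }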


Integration Testing
• Combine tested components to form larger components; assemble tested components to form the subsystem.
• It is easier to integrate small pieces and test them than to integrate the entire system and then test the whole system.

Bottom-up Integration
Component hierarchy:

                A
             /     \
            B       C
          / | \    / \
         D  E  F  G   H

• Test D, E, F, G, H before testing B, C
• Test D, E, F, G, H
• Test B, D, E, F and C, G, H
• Test A, B, C, D, E, F, G, H

Bottom-up Integration

    test_driver_D()
    {
        // test #1
        prepare inputs for D
        call D()
        compare outputs with expected outputs
        // test #2
        ...
    }
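A concrete (hypothetical) version of such a driver in C. The component D and its expected outputs are assumptions for illustration; in practice D would be real, already unit-tested project code.

    #include <stdio.h>

    /* Hypothetical lowest-level component. */
    int D(int x)
    {
        return 2 * x;
    }

    /* Driver that exercises D before B and C are integrated. */
    int test_driver_D(void)
    {
        int failures = 0;

        /* test #1: prepare inputs, call D, compare with expected output */
        if (D(3) != 6) {
            printf("D(3) failed\n");
            failures++;
        }

        /* test #2 */
        if (D(0) != 0) {
            printf("D(0) failed\n");
            failures++;
        }

        return failures;
    }

    int main(void)
    {
        return test_driver_D();
    }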

Pros and Cons (bottom-up)
• Pros
  • many of the lowest-level components are general-purpose UTILITIES that are used in many places - good to test them first
• Cons
  • high-level functionality is tested last - discovery of major defects is postponed, whereas mundane, repetitive computations are tested first

Top-Down Integration
Component hierarchy:

                A
             /     \
            B       C
          / | \    / \
         D  E  F  G   H

• Test A
• Test A, B, C
• Test A, B, C, D, E, F, G, H

Top-Down Integration

    test_A()
    {
        // test #1
        prepare inputs for A
        call A()
        compare outputs with expected outputs
        ...
    }

    B()  // stub
    { trivial processing }

    C()  // stub
    { trivial processing }
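A hypothetical C sketch of the stubs: while A is under test, B and C are replaced by stand-ins that do only trivial processing and return canned values. The signatures and values are assumptions, not from the lecture.

    #include <stdio.h>

    /* Stub for B: returns a fixed, known answer so A can be exercised
       before the real B is integrated. */
    int B(int x)
    {
        (void)x;
        return 42;
    }

    /* Stub for C: likewise trivial. */
    int C(int x)
    {
        (void)x;
        return 0;
    }

    /* Hypothetical top-level component under test. */
    int A(int x)
    {
        return B(x) + C(x);
    }

    int main(void)
    {
        /* test #1: with the canned stub values, A(5) must be 42 + 0 = 42 */
        if (A(5) != 42) {
            printf("A(5) failed\n");
            return 1;
        }
        return 0;
    }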


Pros and Cons (top-down)
• Pros
  • major design errors / functionality problems are detected early
• Cons
  • a large number of stubs may be needed
  • the lowest level may contain a large number of general-purpose utilities

Sandwich Integration
• Test A; test D, E, F  // top and bottom layers (preferably test only the utilities at the bottom level, not ALL the routines)
• Test B, D, E, F and C  // middle layer
• Test A, B, C, D, E, F  // integrated

Pros and Cons (sandwich)
• Pros
  • verify low-level utilities AND also discover major design issues EARLY
• Cons
  • need to develop both driver AND stub code
  • planning the testing takes more thought


Functional / Performance / Acceptance / Installation Testing

Functional Testing
• Test all the functionality as per the requirements.
• Example: word processing
  • Document modification - a major functional group:
    • add a char, word, para
    • delete a char, word, para
    • change format, font
    • ...

Performance Testing
• Stress tests - load the system with many users, devices, etc.
• Volume tests - test the ability to handle large amounts of data
• Configuration tests - test s/w and h/w configurations
• Compatibility tests - test interfacing with other systems
• Security tests
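A minimal sketch of the volume-test idea: time an operation over increasingly large inputs and check that it stays within budget. The operation, the data sizes, and the lack of a hard time limit here are all illustrative assumptions.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Stand-in for the operation under test: sum an array. */
    static long process(const int *data, size_t n)
    {
        long total = 0;
        for (size_t i = 0; i < n; i++)
            total += data[i];
        return total;
    }

    int main(void)
    {
        /* Volume test: run the operation on growing data sets and report
           elapsed CPU time; a real test plan would compare the times
           against a stated performance requirement. */
        size_t sizes[] = { 100000, 1000000, 10000000 };

        for (int s = 0; s < 3; s++) {
            size_t n = sizes[s];
            int *data = malloc(n * sizeof *data);
            if (!data)
                return 1;
            for (size_t i = 0; i < n; i++)
                data[i] = (int)(i % 7);

            clock_t start = clock();
            long result = process(data, n);
            double secs = (double)(clock() - start) / CLOCKS_PER_SEC;

            printf("n=%zu result=%ld time=%.3fs\n", n, result, secs);
            free(data);
        }
        return 0;
    }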

Performance Testing (cont'd)
• Recovery tests - response to faults and loss of data
• Quality tests - up-time (Mean Time To Failure)
• Usability tests - test the user interfaces
• and so on...

Acceptance Testing
• Benchmark tests, etc.
• Alpha test - pilot test run in-house
• Beta test - pilot test run at a customer site
• Parallel testing - the existing and the new system run in parallel (allows time to build up confidence in the new system)

Installation Testing
• Usually involves running tests at the customer site to verify that the installed system works.

Automated Testing Tools
• Code analysis tools
• Test execution tools

Code Analysis Tools
• Static analysis
  – Code structure analyzer (example below)
    • checks syntax
    • finds definitions/uses of variables
    • generates a graph of the program
    • marks problematic control flow
    • identifies unreachable statements
    • identifies nesting depth
    • generates complexity measures for the code
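A tiny C example of the kind of thing a code structure analyzer reports; the function is illustrative, not from the lecture. The two statements after the final return are unreachable, and the two decision points give the function a cyclomatic complexity of 3.

    #include <stdio.h>

    int sign(int x)
    {
        if (x > 0)
            return 1;
        if (x < 0)
            return -1;
        return 0;

        /* Unreachable statements: a static analyzer flags these. */
        printf("never reached\n");
        return 99;
    }

    int main(void)
    {
        printf("%d\n", sign(-7));
        return 0;
    }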

Code Analysis Tools
• Data analysis
  • identifies illegal data usage
  • conflicting data definitions
  • uninitialized variables

Code Analysis Tools
• Dynamic analysis
  • run-time information on how many times a function has been called
  • how much time was spent in each function
  • which paths of the code have been executed and which have not
  • such tools are called profilers (a hand-instrumented sketch follows below)
  • useful for determining test coverage and for identifying bottlenecks

Test Execution Tools
• Generate stubs and drivers
• Generate test cases based on program structure
• Automated testing environment
  • database of test reports, measurement tools, code-analysis tools, modelling and simulation tools, editor, ...
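To make the profiler idea above concrete, here is a hand-instrumented C sketch that collects the same kind of run-time information a profiling tool gathers automatically (call counts and time per function). The function being measured is hypothetical.

    #include <stdio.h>
    #include <time.h>

    /* Counters of the kind a profiler maintains for every function. */
    static long calls_work = 0;
    static double secs_work = 0.0;

    /* Hypothetical function being profiled. */
    static long work(long n)
    {
        clock_t start = clock();
        calls_work++;

        long sum = 0;
        for (long i = 0; i < n; i++)
            sum += i;

        secs_work += (double)(clock() - start) / CLOCKS_PER_SEC;
        return sum;
    }

    int main(void)
    {
        for (int i = 0; i < 5; i++)
            work(1000000L * (i + 1));

        /* The kind of report a real profiler would produce for us. */
        printf("work(): %ld calls, %.3f s total\n", calls_work, secs_work);
        return 0;
    }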

Automated Defect Tracking
• Example: GNATS
• A user or developer opens a web page, enters information into a form, and submits a Problem Report (PR).
• The defect is logged and the appropriate personnel are notified.

PR Fields
PR-ID: (automatic)
WORK-PRODUCT:
BRIEF DESCRIPTION:
STATE: (open/closed/analyze/develop/suspend/feedback)
SUBMITTER:
RESPONSIBLE:
PRIORITY:
SEVERITY:

PR Fields (cont'd)
PHASE CREATED:
PHASE DETECTED:
TYPE OF FAULT: (ex: error-checking, interface, ...)
MODE OF FAULT: (missing, unclear, wrong, changed, new)
ENVIRONMENT: (CPU/Memory/Tools/OS/Compiler ...)
DESCRIPTION: (test case location, steps to replicate the defect, ...)
AUDIT TRAIL: (actions taken, when, and by whom - also all messages about the PR)

Example State Diagram
• User submits a PR → OPEN → ANALYZE → DEVELOP
• ANALYZE may also move the PR to SUSPEND
• When development is finished, the fix goes to SQA (the FEEDBACK state)
• SQA: not OK → back to ANALYZE; OK → MERGE → CLOSED

PR States
• OPEN
  • Team Leader is the responsible person
  • checks if the error can be replicated
  • moved to the ANALYZE state
• ANALYZE
  • problem is analyzed
  • data entered into the PR
  • CCB decides the priority of the error
  • moved to the SUSPEND state or the DEVELOP state

PR States (cont'd)
• DEVELOP
  • developer is responsible
  • makes the fixes
  • moves the state to FEEDBACK
• FEEDBACK
  • SQA group is responsible
  • tests the fix
  • pass/fail => MERGE/ANALYZE state

PR States (cont'd)
• MERGE
  • developer is responsible
  • incorporates the fixes into the product
  • moves the state to CLOSED
• SUSPEND
  • team leader is responsible
  • periodic review to see if suspended PRs need to be revived

Main Ideas in Defect Tracking
• Unique ID for each PR
• A fixed person responsible for each PR
• Priority and severity
• States of a PR (a minimal state-machine sketch follows below)
• Move a defect through the various stages in a controlled fashion (why?)
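The PR life cycle described above is essentially a small state machine. A minimal C sketch, assuming the only legal transitions are the ones named on these slides (where a transition is left unstated, the assumption is marked in a comment); this is an illustration, not GNATS code.

    #include <stdio.h>

    typedef enum { OPEN, ANALYZE, DEVELOP, FEEDBACK, MERGE, SUSPEND, CLOSED } pr_state;

    /* Returns 1 if moving a PR from 'from' to 'to' follows the life cycle
       on the slides, 0 otherwise. */
    int transition_ok(pr_state from, pr_state to)
    {
        switch (from) {
        case OPEN:     return to == ANALYZE;
        case ANALYZE:  return to == DEVELOP || to == SUSPEND;
        case DEVELOP:  return to == FEEDBACK;
        case FEEDBACK: return to == MERGE || to == ANALYZE;  /* pass / fail */
        case MERGE:    return to == CLOSED;
        case SUSPEND:  return to == ANALYZE;  /* assumed: a revived PR is re-analyzed */
        default:       return 0;              /* CLOSED is final */
        }
    }

    int main(void)
    {
        printf("OPEN -> ANALYZE:   %d\n", transition_ok(OPEN, ANALYZE));    /* 1 */
        printf("OPEN -> CLOSED:    %d\n", transition_ok(OPEN, CLOSED));     /* 0 */
        printf("FEEDBACK -> MERGE: %d\n", transition_ok(FEEDBACK, MERGE));  /* 1 */
        return 0;
    }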

Advantages of Automated Defect Tracking
• Management aspects
  • automated reminders
  • different views of the recorded defects (by person responsible, priority, state, etc.)
  • graphs of # open defects, # closed, etc.
  • makes it easy to enforce defect-tracking policies
  • provides an audit trail (verify that practice matches policy)

Cont'd
• Developer aspects
  • automatic logging of e-mails regarding problem details, analysis, and solutions
  • central information on a problem allows new developers to quickly catch up on a problem report (so as to be able to fix it)

When to Stop Testing?
• Estimate the number of remaining faults
  • helps decide when to stop
  • gives confidence in the code
• Use historical data to compare with similar components

Fault Seeding
• Faults (called seeds) are inserted into the software to be tested; the testers are not informed.
• Number of seeds = S
• Detected seeds = d
• Number of detected non-seeded faults = n
• Estimated remaining non-seeded faults = S*n/d - n

Example
• 10 seeds embedded
• 5 seeds were found
• 20 non-seeded faults were also found
• remaining faults = 10*20/5 - 20 = 20 faults
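As a sanity check on the arithmetic, a small C helper that evaluates the estimate with the slide's numbers:

    #include <stdio.h>

    /* Fault-seeding estimate: remaining non-seeded faults = S*n/d - n,
       where S = seeds planted, d = seeds detected, n = non-seeded faults found. */
    double remaining_faults(double S, double d, double n)
    {
        return S * n / d - n;
    }

    int main(void)
    {
        /* The slide's example: 10 seeds planted, 5 found, 20 real faults found. */
        printf("estimated remaining faults: %.0f\n", remaining_faults(10, 5, 20));  /* 20 */
        return 0;
    }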

Problems with Fault Seeding
• Assumes the seeds are of the same type and frequency as the actual faults
• A seed can mask an actual fault from being caught
• It is possible to forget to remove the seeds before shipping

Testing OO Objects
• In addition to testing each module,
• also need to test the inheritance scheme, polymorphism, etc.
• Usually bottom-up integration testing
