Using Dynamic Execution Data to Generate Test Cases
Rozita Dara, Shimin Li, Weining Liu, Angi Smith Ghorbani, Ladan Tahvildari

University of Waterloo, Canada
Research In Motion, Canada

Background
- Joint project of:
  - Software Technologies Applied Research (STAR) Laboratory, led by Prof. Ladan Tahvildari, University of Waterloo, Canada
  - Research In Motion (RIM) – a leading designer, manufacturer, and marketer of innovative wireless solutions (e.g. BlackBerry) for the worldwide mobile communications market
- The project aims at reducing the testing time and associated resources required to decide whether a RIM software product meets its quality release criteria

ICSM'09 Edmonton


Outline
- Motivation
- SV&V Testing Model
- SV&V Testing Process Flow
- Components of the Testing Process
- Contributions & Summary
- Thank You

Motivation
- Satisfy newly added test goals
  - Find bugs earlier
  - Achieve a reduced bug escape rate
  - Achieve the required code coverage
- Reduce time to market
  - Reduce SV&V testing time
  - Improve testing effectiveness (e.g. improve test case quality)

SV&V Testing Model

(V-model diagram.) On the development side, the Requirements, Design, and Implementation phases each produce a Test Plan (TP) through Design Test Plan (Validate Requirements) and are checked by review activities (Review Test Plan, Review Design, Review Code). On the testing side, Test Execution drives Acceptance Testing against the requirements, Integration Testing against the design, and Component Testing against the implementation. Lifecycle Checkpoints gate the transitions between phases, and test results can trigger Update Requirements.

SV&V Testing Process Flow

(Process-flow diagram.) Software Requirements and Historical Testing Data feed Requirement Modeling, which produces the Product Test (PT) Model. Test Case Generation consumes the PT Model, the Test Goals, and a Source Code Analysis Report to produce Test Case Suites. Test Execution runs the suites against Builds and emits Test Execution Data (coverage, # of defects, etc.), which flows back into Test Case Generation and the Historical Testing Data repository.

Testing Process – Requirement Modeling

(Diagram.) Software Requirements and Historical Testing Data feed five sub-processes, whose outputs form the Product Test (PT) Model:
1. Requirement Review
2. Feature Identification → Features To Be Tested
3. Test Scenario Creation → Test Scenarios
4. Test Variable Identification → Variable-Value Matrix
5. Test Value Profiling → Priority Matrix / Test Value Profile
A Traceability Matrix links these artifacts back to the requirements.

Testing Process – Requirement Review

Requirement Review
- Reviewing and understanding new requirements from a testing point of view
- Ensuring that requirements are complete, unambiguous, logically consistent, and testable
- Example: an online hotel room reservation system
  R1 – Users must be able to search and reserve rooms by specifying check-in/check-out dates, room type, # of rooms, and # of persons

Testing Process – Feature Identification

Feature Identification
- Identifying the features to be tested from the reviewed requirements (what to test)
- Example: features for R1
  R1 – Users must be able to search and reserve rooms by specifying check-in/check-out dates, room type, # of rooms, and # of persons
  F1 – Users are able to search rooms
  F2 – Users are able to book the rooms found

Testing Process – Test Scenario Creation

Test Scenario Creation
- Determining how to test the identified features (how to test)
- Analyzing use cases for each feature
- Defining (or reusing) one or more test scenarios for each feature
- Example: test scenarios for F1 and F2
  TS1 – Verify that users are able to search rooms by specifying room type, check-in/check-out dates, and # of persons, using supported browsers and operating systems
  TS2 – Verify that users are able to book found rooms by specifying the number of rooms

Testing Process – Test Variable Identification
- Deriving test variables and their values from the test scenarios
- Classifying test variables into two categories:
  - Functional variables – represent the functional units/areas of the system under test
  - Environment variables – represent the environment conditions affecting the execution of a functional unit
- Example: test variables and values for TS1 and TS2

  Functional Variable    Values
  RoomReservation        SearchRoom, BookRoom

  Environment Variable   Values
  RoomTypes              Single, Suite, Family
  OperationSystems       WinXP, Vista
  Browsers               IE7, Firefox
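As a rough illustration, the variable-value matrix above can be captured in a few dictionaries. This is a sketch only, not the actual representation used by the RIM tooling:

```python
from math import prod

# Variable-value matrix from the running example (illustrative structure).
functional_variables = {
    "RoomReservation": ["SearchRoom", "BookRoom"],
}
environment_variables = {
    "RoomTypes": ["Single", "Suite", "Family"],
    "OperationSystems": ["WinXP", "Vista"],
    "Browsers": ["IE7", "Firefox"],
}

# A test frame pairs one functional value with one value per environment
# variable, so the raw combination space is the product of list sizes.
combinations = len(functional_variables["RoomReservation"]) * prod(
    len(values) for values in environment_variables.values()
)
print(combinations)  # 2 * 3 * 2 * 2 = 24
```

Even this tiny example yields 24 raw combinations, which is why the later Value Combination step prunes the space instead of enumerating it exhaustively.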

Testing Process – Test Value Profiling
- Associating test values with code coverage and defect-count information by looking them up in the historical testing data repository
- Example: test value profile (Method_1..Method_3 columns are code coverage counts)

  Environment Variable   Value     Method_1   Method_2   Method_3   # of defects found
  RoomTypes              Single        1          2          1              0
                         Suite         2          2          3              1
                         Family        1          1          0              1
  OperationSystems       WinXP         3          0          1              0
                         Vista         0          0          0              0
  Browsers               IE7           1          1          2              1
                         Firefox       1          1          1              0
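Profiling can be pictured as folding per-execution records into per-value totals. This is a minimal sketch assuming a simple record format (value, per-method coverage counts, defect count); the slides do not specify an actual schema:

```python
from collections import defaultdict

def build_profile(records):
    """Fold (value, {method: coverage_count}, defects) records into a
    per-value profile of coverage totals and defect counts."""
    profile = defaultdict(lambda: {"coverage": defaultdict(int), "defects": 0})
    for value, coverage, defects in records:
        for method, count in coverage.items():
            profile[value]["coverage"][method] += count
        profile[value]["defects"] += defects
    return profile

# Two rows from the example table above, treated as historical records.
history = [
    ("Single", {"Method_1": 1, "Method_2": 2, "Method_3": 1}, 0),
    ("Suite",  {"Method_1": 2, "Method_2": 2, "Method_3": 3}, 1),
]
profile = build_profile(history)
print(profile["Suite"]["defects"])  # 1
```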

SV&V Testing Process Flow

Testing Process – Test Case Generation

(Diagram.) The Product Test (PT) Model, the Source Code Analysis Report, and the Test Execution Data (coverage, # of defects, etc.) coming back from Test Execution feed four sub-processes:
1. Variable Classification
2. Value Profile Updating (also updates the Historical Testing Data repository)
3. Test Result Analysis – checks whether the Test Goals are achieved; if yes, stop generating test cases
4. Value Combination – if not, generate new Test Case Suites and pass them to Test Execution

Testing Process – Variable Classification

Variable Classification
- Categorizing related functional variables into groups, called test frame sets
- Each test frame set is passed to the Value Combination sub-process to generate test cases
- The test cases generated from each test frame set constitute a test case suite

Testing Process – Value Profile Updating
- Updating the current test value profile with the test execution data (e.g. code coverage, # of defects found) from the last iteration
- Saving the updated profile to the test data repository
- Computing and updating the weight of each test value
- The weight of a test value is determined by the number of methods covered and the number of bugs found by that test value
- Example: test execution data and value weights (Method_1..Method_3 columns are code coverage counts)

  Test Case ID   Method_1   Method_2   Method_3   # of defects found
  TC001             1          2          1              0
  TC002             2          2          3              0
  TC003             1          1          0              1

  Variable          Value     Weight
  RoomTypes         Single     0.25
                    Suite      1
                    Family     0.75
  OperationSystems  WinXP      0.75
                    Vista      0.25
  Browsers          IE7        0.5
                    Firefox    1

Testing Process – Test Result Analysis

Test Result Analysis
- Analyzing the test execution data to determine whether the test goals have been achieved
- If the goals are met, the process stops generating test cases; otherwise, it runs the Value Combination sub-process to generate an additional set of test cases
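This gate can be sketched as a simple predicate over the execution data. The specific goals here (a coverage threshold and an open-defect limit) are illustrative; the actual RIM release criteria are not public:

```python
def goals_achieved(execution_data, goals):
    """Return True when the current iteration's data meets the test goals.
    Goal names and thresholds are illustrative assumptions."""
    return (execution_data["coverage"] >= goals["min_coverage"]
            and execution_data["open_defects"] <= goals["max_open_defects"])

goals = {"min_coverage": 0.80, "max_open_defects": 2}
print(goals_achieved({"coverage": 0.87, "open_defects": 0}, goals))  # True
print(goals_achieved({"coverage": 0.55, "open_defects": 3}, goals))  # False
```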

Testing Process – Value Combination

Value Combination
- Automatically adjusting the predefined combination algorithm, based on the updated value weights from Value Profile Updating, to generate a set of test cases that achieves the test goals
- The combination algorithm can be any of several linear/multivariate statistical methods, genetic algorithms, or machine learning algorithms
- Example: generated test cases
  TC003 – SearchRoom[Suite][2009-09-23][2009-09-26][1][IE7][WinXP]
  TC004 – BookRoom[2][IE7][WinXP]
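As one simple instance of a weight-driven combination algorithm (the slides permit statistical, genetic, or machine-learning methods; this greedy strategy is an illustrative stand-in), each functional value can be paired with the currently highest-weight value of every environment variable:

```python
def next_test_cases(functional_values, env_weights):
    """Greedy weight-driven combination: env_weights is
    {variable: {value: weight}}; returns one test case per functional value,
    each using the highest-weight value of every environment variable."""
    best_env = {var: max(vals, key=vals.get) for var, vals in env_weights.items()}
    return [{"function": f, **best_env} for f in functional_values]

cases = next_test_cases(
    ["SearchRoom", "BookRoom"],
    {"RoomTypes": {"Single": 0.25, "Suite": 1.0, "Family": 0.75},
     "Browsers":  {"IE7": 0.5, "Firefox": 1.0}},
)
print(cases[0])  # {'function': 'SearchRoom', 'RoomTypes': 'Suite', 'Browsers': 'Firefox'}
```

A real combination algorithm would vary the lower-weight values across the batch as well, so that each iteration widens coverage instead of repeating the same environment.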

Contributions & Summary
- Modeling requirements written in natural language into a structured format to automate test case generation
- Utilizing historical test execution data and source code information during the test case generation phase to improve the effectiveness of test cases
- Generating test cases iteratively to incrementally achieve the test goals, resulting in a smaller number of test cases
- Evaluating and adapting different test case generation algorithms (i.e. test value combination algorithms) for RIM SV&V testing
