Thursday, January 22, 2009
20 Best Testing Practices
Here are some of the best testing practices I have learned through experience:
1) Learn to analyze your test results thoroughly. Do not just note the result and move on. The final result may be ‘pass’ or ‘fail’, but troubleshooting the root cause of a ‘fail’ will lead you to the solution of the problem. Testers earn respect when they not only log bugs but also suggest solutions.
2) Learn to maximize test coverage every time you test an application. Although 100 percent test coverage might not be possible, you can always try to get close to it.
3) To ensure maximum test coverage, break your application under test (AUT) into smaller functional modules. Write test cases for each individual module. If possible, break these modules down further into smaller parts.
E.g.: Assume you have divided your website application into modules and ‘accepting user information’ is one of them. You can break this ‘User information’ screen into smaller parts for writing test cases: UI testing, security testing, functional testing of the ‘User information’ form, and so on. Apply all field type and size tests, plus negative and validation tests, to the input fields, and write test cases for all of them for maximum coverage; a short sketch of such cases follows.
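For illustration only, here is a minimal sketch of such field-level test cases in Python with pytest. The validate_user_info() helper and its rules (name length, email format, age range) are hypothetical stand-ins for whatever validation your 'User information' form actually performs:

import pytest

# Hypothetical validator for the 'User information' form; a stand-in, not a real API.
def validate_user_info(name, email, age):
    errors = []
    if not name or len(name) > 50:
        errors.append("name")    # required field, assumed 50-character limit
    if "@" not in email:
        errors.append("email")   # very rough format check, just for the sketch
    if not 18 <= age <= 120:
        errors.append("age")     # assumed valid range
    return errors

# Positive case: valid input according to requirements produces no errors.
def test_valid_user_info():
    assert validate_user_info("Alice", "alice@example.com", 30) == []

# Negative and boundary cases: each row is one test case toward maximum coverage.
@pytest.mark.parametrize("name, email, age, expected", [
    ("", "alice@example.com", 30, ["name"]),        # empty required field
    ("A" * 51, "alice@example.com", 30, ["name"]),  # field size exceeded
    ("Alice", "not-an-email", 30, ["email"]),       # invalid email format
    ("Alice", "alice@example.com", 17, ["age"]),    # below the age boundary
])
def test_invalid_user_info(name, email, age, expected):
    assert validate_user_info(name, email, age) == expected

Each parametrized row corresponds to one field-level negative or boundary test case of the kind described above.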
4) While writing test cases, cover the intended functionality first, i.e. the valid conditions according to the requirements. Then write test cases for invalid conditions. This covers both the expected and the unexpected behavior of the application under test.
5) Think positive. Start testing the application with the intent of finding bugs and errors. Don’t assume beforehand that there won’t be any bugs in the application. If you test with the intention of finding bugs, you will also find the subtle ones.
6) Write your test cases during the requirement analysis and design phases themselves. This way you can ensure that all the requirements are testable.
7) Make your test cases available to developers before coding starts. Don’t hold your test cases back until the final application release, thinking you will be able to log more bugs. Letting developers analyze your test cases thoroughly helps them build a quality application and saves rework time.
8) If possible, identify and group your test cases for regression testing. This ensures quick and effective manual regression testing.
9) Applications with critical response-time requirements should be thoroughly tested for performance. Performance testing is a critical part of many applications, yet in manual testing it is often skipped because testers lack the large data volumes it requires. Find ways to test your application for performance. If you cannot create the test data manually, write a basic script to generate it, or ask the developers to write one for you; a minimal sketch of such a script follows.
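As an illustration, here is a minimal sketch of such a data-generation script in Python. The file name, column names and record count are hypothetical; adjust them to whatever your performance test actually loads:

import csv
import random
import string

def random_name(length=8):
    # Build a random lowercase string to stand in for a user name.
    return "".join(random.choices(string.ascii_lowercase, k=length))

def generate_users(path, count=100000):
    # Write `count` synthetic user records to a CSV file for the performance test.
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "name", "email", "age"])
        for i in range(count):
            name = random_name()
            writer.writerow([i, name, name + "@example.com", random.randint(18, 90)])

if __name__ == "__main__":
    generate_users("perf_test_users.csv")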
10) Programmers should not test their own code. As discussed in our previous post, basic unit testing of the developed application should be enough for developers before they release it to the testers. But you (the testers) should not force developers to release the product for testing; let them take the time they need. Everyone from the lead to the manager knows when the module or update is released for testing and can estimate the testing time accordingly. This is a typical situation in an agile project environment.
11) Go beyond requirements-based testing. Test the application for what it is not supposed to do.
12) While doing regression testing, use your previous bug graph (a bug graph plots the number of bugs found against time for each module). This module-wise bug graph helps predict the most bug-prone parts of the application; a small sketch of building one from a bug-tracker export follows.
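As an illustration only, here is a minimal Python sketch that tallies bugs per module from a bug-tracker CSV export. The file name and the 'module' column are assumptions; most trackers can export something equivalent:

import csv
from collections import Counter

def bugs_per_module(path):
    # Count how many bug records belong to each module in the exported file.
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["module"]] += 1
    return counts

if __name__ == "__main__":
    # Modules at the top of this list are the most probable places to focus regression testing.
    for module, count in bugs_per_module("bug_export.csv").most_common():
        print(module, count)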
13) Note down the new terms and concepts you learn while testing. Keep a text file open while testing an application and record the testing progress and your observations in it. Use these notes while preparing the final test release report. This good habit will help you provide a complete, unambiguous test report and release details.
14) Many times testers or developers make changes to the code base of the application under test. This is a necessary step in a development or testing environment, for example to avoid executing live transaction processing in banking projects. Note down all such code changes made for testing purposes, and at the time of final release make sure you have removed them all from the final client-side deployment resources.
15) Keep developers away from the test environment. This is a necessary step for detecting any configuration changes missing from the release or deployment document. Sometimes developers make system or application configuration changes but forget to mention them in the deployment steps. If developers don’t have access to the test environment, they will not make any such changes there accidentally, and these missing steps can be caught in the right place.
16) It’s a good practice to involve testers right from the software requirement and design phases. This way testers gain knowledge of the application and its dependencies, resulting in more detailed test coverage. If you are not being asked to be part of this development cycle, request that your lead or manager involve the testing team in all decision-making processes and meetings.
17) Testing teams should share best testing practices and experience with other teams in their organization.
18) Increase your conversations with developers to learn more about the product. Whenever possible, communicate face to face to resolve disputes quickly and avoid misunderstandings. But once you understand a requirement or resolve a dispute, make sure to confirm it in writing, for example over email. Do not keep anything purely verbal.
19) Don’t run out of time for high-priority testing tasks. Prioritize your testing work from high to low priority and plan accordingly. Analyze all associated risks when prioritizing your work.
20) Write clear, descriptive, unambiguous bug reports. Do not provide only the bug symptoms; also describe the impact of the bug and any possible solutions.
“Don’t forget that testing is a creative and challenging task. In the end, how you handle this challenge depends on your skill and experience.”
Wednesday, January 21, 2009
How to write a good bug report? Tips and Tricks
Why write a good bug report?
If your bug report is effective, the chances are higher that the bug will get fixed, so getting a bug fixed depends on how effectively you report it. Reporting a bug is a skill, and I will tell you how to master it.
“The point of writing a problem report (bug report) is to get bugs fixed.” – Cem Kaner
If a tester does not report a bug correctly, the programmer will most likely reject it as irreproducible. This can hurt the tester’s morale, and sometimes the ego too. (I suggest leaving ego out of it entirely: thoughts like “I reported the bug correctly”, “I can reproduce it”, “Why was my bug rejected?”, “It’s not my fault”, and so on.)
What are the qualities of a good software bug report? Anyone can write a bug report, but not everyone can write an effective one. You should be able to distinguish an average bug report from a good one. How? It’s simple: apply the following characteristics and techniques when you report a bug.
1) Clearly specified bug number: Always assign a unique number to each bug report. This helps identify the bug record. If you are using an automated bug-reporting tool, this unique number is generated automatically each time you report a bug. Note the number and a brief description of each bug you report.
2) Reproducible: If your bug is not reproducible, it will never get fixed. Clearly state the steps to reproduce the bug, and do not assume or skip any of them. A problem described step by step is easy to reproduce and fix.
3) Be specific: Do not write an essay about the problem. Be specific and to the point. Try to summarize the problem in as few words as possible while still being effective. Do not combine multiple problems even if they seem similar; write a separate report for each one.
How to Report a Bug?
Use the following simple bug report template:
This is a simple bug report format. It may vary depending on the bug-reporting tool you are using. If you are writing the bug report manually, some fields, such as the bug number, need to be stated explicitly and assigned by hand.
Reporter: Your name and email address.
Product: The product in which you found this bug.
Version: The product version, if any.
Component: The major sub-module of the product in which the bug occurred.
Platform: The hardware platform on which you found this bug, such as ‘PC’, ‘Mac’, ‘HP’ or ‘Sun’.
Operating system: All operating systems on which you found the bug, such as Windows, Linux, Unix, SunOS or Mac OS. Also mention the specific OS versions if applicable, such as Windows NT, Windows 2000 or Windows XP.
Priority: When should the bug be fixed? Priority is generally set from P1 to P5, where P1 means “fix with the highest priority” and P5 means “fix when time permits”.
Severity: This describes the impact of the bug.
Types of Severity:
Blocker: No further testing work can be done.
Critical: Application crash, Loss of data.
Major: Major loss of function.
Minor: Minor loss of function.
Trivial: Some UI enhancements.
Enhancement: Request for new feature or some enhancement in existing one.
Status: When you log the bug in a bug-tracking system, its status is ‘New’ by default. Later the bug goes through stages such as Fixed, Verified, Reopened and Won’t Fix. (The detailed bug life cycle is covered in a separate post.)
Assign To: If you know which developer is responsible for the module in which the bug occurred, you can specify that developer’s email address. Otherwise leave it blank; the bug will be assigned to the module owner, or the manager will assign it to a developer. Optionally add the manager’s email address to the CC list.
URL: The page URL on which the bug occurred.
Summary: A brief summary of the bug, ideally in 60 words or fewer. Make sure the summary reflects what the problem is and where it occurs.
Description: A detailed description of the bug. Use the following fields within the description:
Reproduce steps: Clearly mention the steps to reproduce the bug.
Expected result: How the application should behave after the above steps.
Actual result: The actual result of running the above steps, i.e. the buggy behavior.
These are the important fields in a bug report. You can also add “Report type” as one more field to describe the type of bug; a filled-in sample report follows the list below.
The report types are typically:
1) Coding error
2) Design error
3) New suggestion
4) Documentation issue
5) Hardware problem
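To make the template concrete, here is a hypothetical filled-in example; the product, URL, steps and people are invented purely for illustration:
Bug number: 1023
Reporter: jane.tester@example.com
Product: Online Store
Version: 2.4
Component: Checkout
Platform: PC
Operating system: Windows XP
Priority: P2
Severity: Major
Status: New
URL: http://example.com/checkout
Summary: Checkout page shows a blank screen when the cart contains more than 50 items.
Description:
Reproduce steps: 1. Log in as a registered user. 2. Add 51 items to the cart. 3. Click ‘Checkout’.
Expected result: The checkout page lists all 51 items with the correct total.
Actual result: A blank page is displayed and no order is created.
Report type: Coding error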
Some bonus tips for writing a good bug report:
1) Report the problem immediately: If you find a bug while testing, do not wait to write a detailed bug report later. Write the bug report immediately. This ensures a good, reproducible report. If you postpone it, the chances are high that you will miss important steps.
2) Reproduce the bug three times before writing the report: Your bug should be reproducible. Make sure your steps are robust enough to reproduce the bug without any ambiguity. If the bug is not reproducible every time, you can still file it, noting its intermittent nature.
3) Test the same bug on other similar modules: Developers sometimes reuse the same code across similar modules, so a bug in one module is likely to occur in the others as well. You can even try to find a more severe version of the bug you found.
4) Write a good bug summary: The bug summary helps developers quickly analyze the nature of the bug. A poor-quality report unnecessarily increases development and testing time. Communicate well through your summary, and keep in mind that the summary is used as a reference when searching for the bug in the bug inventory.
5) Read the bug report before hitting the Submit button: Read all the sentences, wording and steps used in the report. Check whether any sentence creates ambiguity that could lead to misinterpretation. Misleading words or sentences should be avoided in order to keep the report clear.
6) Do not use abusive language: It’s nice that you did good work and found a bug, but do not use that credit to criticize the developer or attack any individual.
Conclusion: There is no doubt that your bug report should be a high-quality document. Focus on writing good bug reports and spend some time on this task, because the bug report is the main communication point between tester, developer and manager. Managers should make their teams aware that writing a good bug report is a primary responsibility of any tester. Your efforts towards writing good bug reports will not only save company resources but also build a good relationship between you and the developers.
Wednesday, January 14, 2009
Sample Software Testing Standards and Procedures
By Craig Borysowich (Chief Technology Tactician)
Some of my entries that continue to get heavy traffic more than a year after being posted are my '10 step guide to developing a test plan' and my 'Sample Test Plan Template' - which are also good lead-ins for the following post on software testing standards and procedures that should be part of the Design and Development Standards and Procedures document.
The purpose of this document is to describe the standards and procedures to follow during the software testing phases of the SYSTEM Z project. This document supports the section on Testing and Validation in the Integration and Methods Quality Manual.
1. SCOPE
This section states the general standards and procedures to follow when planning and conducting software testing and validation. These standards and procedures may be changed via a change control mechanism that notifies all those concerned of changes made to the steps.
2. SOFTWARE TEST PLANNING
2.1 Introduction
Software test planning is the process whereby the following are established for the testing of a given project deliverable:
· Testing requirements (scope),
· Testing approach,
· Testing tasks and deliverables,
· Estimates,
· Testing phases,
· Testing schedule,
· Completion criteria,
· Test environment and team roles and responsibilities.
2.2 Application Level Planning
High level software test planning is conducted within the project planning phase to establish the high level plan for testing.
2.2.1 Objectives
· To identify Testing Requirements (Scope):
· to identify the software to be tested
· to identify the testing objectives
· to identify the test phases (testing coverage) within the testing life cycle that is required
· To identify Testing Approach:
· to identify the methods and testing tools required
· to identify any client assumptions/dependencies/limitations
· To identify Testing Tasks and Deliverables:
· to identify the activities to perform within each testing phase.
· to identify the external (client) deliverable document.
· to identify the table of contents for each deliverable
· to identify the internal deliverable documents
· to identify document deliverable reviewer
· To compile Testing Estimates:
· to identify the budgetary estimate for each identified phase of software testing
· To determine Testing Schedules:
· to identify the start and end date for each phase of software testing
· to identify all testing phase overlaps in the schedule
· to identify delivery dates for all document deliverables
· To determine Testing Phase Completion Criteria:
· to identify the completion criteria of each identified software phase
· To determine Test Environments:
· to identify the software/hardware requirements for each test phase
· to identify the number of test environments
· To identify Test Team Roles and Responsibilities:
· to identify overall testing management responsibility as well as responsibility for each test phase
· to identify client roles and responsibilities
2.2.2 Responsibilities
· Development Manager:
· ensures that proper analysis and planning is done for the unit testing phase
· Technical Services Manager:
· ensures that proper analysis and planning is done for performance testing
· Application Test Manager:
· ensures that proper analysis and planning is done for all other test phases
2.2.3 Inputs
· First Release:
· Statement of Compliance
· Contract Proposal
· System Blueprint
· Subsequent Releases:
· Previous Releases Internal and External Deliverables
2.2.4 Method
· see Testing Work Instructions
2.2.5 Working Documents
· Test Hardware and Software Requirements
· PWW plans
· Meeting Minutes
· Testing Work Instructions
2.2.6 Deliverables
· Program Plan
· Release Estimates
· SDE Requirements Report
· System Z Standards and Procedures
2.3 Test Phase Level Planning
Software test planning is conducted at the testing phase level to establish a working plan for each phase.
2.3.1 Objectives
· To identify Testing Requirements (Scope):
· to identify the testing phase objectives
· to identify the testing activities for the phase
· to identify software load within a phase
· to identify contents of the software load(s)
· to identify special testing requirements of critical components
· To identify detailed testing tasks
· To identify the estimates for each task within the phase
· To determine Testing Schedules
· to identify the testing start and end date for each software load
· to identify internal completion dates for internal/external deliverables
· to identify start and end dates for resources (both human and physical)
· to identify training dates required for testing staff
· To specify SDE requirements
· to identify any specific setup requirements for desktop workstations
· to specify the schedule for the setup of test environment which requires SDE support
· to specify requirements for special tools (e.g., PRS, Functional Requirement Analysis Matrix, etc.)
· To identify Test Team Roles and Responsibilities
· to identify team members' roles and responsibilities
· to identify team members' skill set requirements
· to identify team member training requirements
2.3.2 Responsibilities
· Integration Test Manager:
· ensures that proper analysis and planning is done for Integration testing
· System Test Manager:
· ensures that proper analysis and planning is done for System testing
· Development Manager:
· ensures that proper analysis and planning is done for unit testing phase
· Technical Services Manager:
· ensures that proper analysis and planning is done for performance testing
2.3.3 Inputs
· First Release:
· Requirements Specification
· Functional Specification
· System Description
· System Blue Print
· Subsequent Releases:
· Test Hardware and Software Requirements
· Previous Releases Internal and External Deliverables
2.3.4 Method
· see Testing Work Instructions
2.3.5 Working Documents
· PWW plans
· Meeting Minutes
· Testing Work Instructions
3. UNIT TESTING PROCEDURES
The goal of unit testing is to assure that all functions and features of a single compilable unit of code perform as specified in the Design Specification.
A unit test covers the testing of a software unit, or a group of closely related units, as a single entity. Unit testing is performed in isolation, using test drivers to simulate higher level units, and/or stubs to simulate lower level units. Unit Testing Procedures consist of:
· Creating a Unit Test Plan
· Creating test data
· Conducting tests according to the Unit Test Plan
· Reporting and reviewing the results of the test
These procedures are performed by the team member responsible for programming and testing of the unit.
A Unit Test Plan is a set of test cases arranged in the sequence of chronological execution. The Unit Test Plan is created before the programming of the unit is started, and the test cases should cover the functional, input, output, and function interaction of the unit.
3.1 Documents Required
The following documents provide information required to create the Unit Test Plan and are recommended reading before creating the Unit Test Plan.
· Design Report
· Requirements Reports
· Change Requests
3.2 Unit Test Design Guidelines
The guidelines to be followed during the creation of Unit Test Plans are:
· A test case must exist for every branch in the code
· Design test cases and test data which reveal errors in software
· Design test data that will ensure all conditions and quality of data edits are covered
· Create test cases for special formulae and extreme conditions (e.g., Test case "File is Empty" shall be used for all files.)
· Test the interaction between units within the task
· To minimize the number of test cases, combine test cases into one if they test the same feature. (i.e., can cover a group of units or a full task)
· Use test cases which already exist wherever possible. Include the generic test plan
· Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together
· Where possible, arrange the test cases in the chronological order in which they will be performed
3.3 Unit Testing Steps
Unit tests are created and executed by the developer of the unit. The procedure for unit testing is as follows:
· Create a unit test plan following the Unit Test Plan guidelines
· Conduct the unit test as specified in the test cases
· Identify and fix or report any problems encountered
· Re-run the necessary tests
· Sign the Test Plan Cover Page (Tested By and Date)
· Package the Test Documentation and pass it to the development team leader
· The development team leader is to verify that the documentation is complete, sign the Test Plan Cover Page (Reviewed By and Date) and submit the package to Quality Assurance for review and Configuration Management for promotion
· Promote script or command files used to run the tests along with the unit
3.4 Create Unit Test Plan
· Identify Features to Test
· Using the Functional Specification, Preliminary and Detailed Design Specification (Unit Procedural Description) identify:
· All Functions performed by the Unit
· All Inputs to the Unit
· All Outputs from the Unit
· Define all ranges and discrete values of the test data necessary to run the tests
· Prepare the Unit Test Plan following the Unit Test Plan guidelines
Note: See the Appendix entitled Templates for the template to be used and a description of the cover page contents.
· Design a set of Test Cases
· Use the checklists for the five types of coverage, together with the outlined functions, inputs and outputs, to create the minimum set of test cases for testing the functionality of the unit
· For each test case identified in the first point:
· State the condition that will be tested by the test case (this should be used as the title of the test case)
· List the steps/actions to be performed in order to accomplish the test
· For each action performed identify the expected result
· Create test data necessary to create the condition being tested and for each piece of data, indicate the expected results
For example, a test case for an invalid id on a data entry screen could be named "Invalid id". The title states the condition of the test. The procedure for testing this condition should indicate in which data entry field the cursor should be positioned and what key should be pressed to trigger the edit. A table containing the various data elements to be entered can be attached and referenced by one of the steps in the procedure. This data table could also contain the expected results for each data item to be entered.
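Purely as an illustration of how such a unit test case might be automated, here is a minimal sketch using Python's unittest module; the validate_id() routine and its rules are hypothetical stand-ins for the real edit under test:

import unittest

# Hypothetical edit routine for the data entry field; a stand-in for the real unit under test.
def validate_id(value):
    # An id is considered valid only if it is a non-empty string of digits.
    return len(value) > 0 and value.isdigit()

class InvalidIdTestCase(unittest.TestCase):
    # Test case "Invalid id": the title states the condition being tested.
    def test_invalid_ids_are_rejected(self):
        # Expected result: each invalid id is rejected by the edit.
        for bad_value in ["", "ABC", "12A", "  "]:
            self.assertFalse(validate_id(bad_value))

    def test_valid_id_is_accepted(self):
        self.assertTrue(validate_id("12345"))

if __name__ == "__main__":
    unittest.main()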
Note: Skeletons for the test plan and test case are available as templates in Word for Windows
3.5 Conduct Unit Test
Unit testing is performed according to the following procedures:
· Unit tests will be run according to the Unit test Plan created by the developer
· If any of the actual results do not agree with the expected results, the developer fixes the code and re-runs the test. There is a possibility that the Unit Test Plan will need to be updated if it is determined that the Unit Test Plan is not correct or no longer up-to-date
· Once all test cases have been successfully completed, the developer signs the top page of the test plan, completes a promotion request form and passes the package to the developer's team leader
· The team leader verifies that the unit test has been performed and the team leader (or developer) passes the promotion request to Configuration Management
4. INTEGRATION TESTING PROCEDURES
The goal of integration testing is to ensure that all interacting subsystems in a system interface correctly with one another to produce the desired results. Furthermore, in trying to attain this goal, integration tests ensure that the introduction of one or more subsystems into the system does not have an adverse effect on existing functionality.
An integration test covers the testing of interface points between subsystems. Integration testing is performed once unit testing has been completed for all units contained in the subsystems being tested. Integration Testing Procedures consist of:
· Creating an integration test plan
· Creating test data
· Conducting tests according to the integration test plan
· Reporting and reviewing the results of the test
During this phase, the interaction between subsystems is tested. This includes interfaces through Inter Process Communications (IPC) and files. This phase is performed by an independent test team. This team prepares and executes integration tests, generates problem reports and is responsible for passing the integrated system on to the System Test Team for system testing. The Integration Test team then enters a support mode in which it will test problem reports generated by the System Test team before forwarding code fixes to the System Testing environment.
This phase is sometimes combined with the system test phase as per the client's request.
4.1 Documents Required
The following documents provide information required to create the Integration Test Plan and are recommended reading before starting the planning phase.
· System Blueprint
· High Level Design overview from the developers
· Detailed Design
· Entity Relationship Diagrams
· System Requirements Report
· Change Requests
4.2 Integration Test Design Guidelines
The guidelines to be followed during the creation of Integration Test Plans are:
· The number of new units or tasks to be tested by one Test Plan should not exceed five.
· To minimize the number of test cases, combine test cases into one if they test the same interface point.
· Use test cases which already exist. Portions of available Unit Test Plans or System Test Plans can be used where applicable.
· When testing few or minor changes to an existing subsystem, structure the Test Plan such that the bulk of it will be regression testing using an existing Integration or System Test plan. New test cases can then be added to cover the detailed testing of changes.
· Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together. Where possible, arrange the test cases in the order they will be performed.
· Whenever possible, design test plans to run independently of other test plans.
· Design test cases and test data which reveal errors in the interaction between the software components (check the various response codes returned by calls to external interfaces); a minimal sketch of such a check follows this list.
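As an illustration only, here is a minimal Python sketch of an interface-point check; the InventoryClient interface, its reserve_stock() call and the response codes are hypothetical stand-ins for whatever IPC or external interface your subsystems actually expose:

import unittest
from unittest import mock

# Hypothetical client for a neighbouring subsystem; only a stand-in for the sketch.
class InventoryClient:
    def reserve_stock(self, item_id, quantity):
        raise NotImplementedError  # implemented by the real subsystem

def place_order(client, item_id, quantity):
    # Component under test: it must translate each interface response code correctly.
    code = client.reserve_stock(item_id, quantity)
    if code == 0:
        return "ORDER_ACCEPTED"
    if code == 1:
        return "OUT_OF_STOCK"
    return "INTERFACE_ERROR"

class OrderInventoryInterfaceTest(unittest.TestCase):
    def test_each_response_code_is_handled(self):
        # Exercise the interface point with every documented response code.
        for code, expected in [(0, "ORDER_ACCEPTED"), (1, "OUT_OF_STOCK"), (99, "INTERFACE_ERROR")]:
            client = mock.Mock(spec=InventoryClient)
            client.reserve_stock.return_value = code
            self.assertEqual(place_order(client, "A-100", 2), expected)

if __name__ == "__main__":
    unittest.main()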
4.3 Integration Testing Steps
Integration tests will be created and performed by designated members of each team. Each individual will be responsible for the preparation of all test cases, procedures and data, as well as for conducting and documenting the tests. Each individual will also be responsible for the specification of all additional tools and facilities required for the integration testing of their tasks. The procedure for integration testing is as follows:
· Review all relevant design documentation and attend all design overviews/walkthroughs.
· Create an integration test plan.
· Where possible, create scripts to automate the execution of the test case.
· Arrange to have Integration Test Plans reviewed by Development for technical accuracy. The Test Plans may have to be updated after these reviews to incorporate changes suggested by the Developers.
· Conduct the test as specified in the test cases.
· Identify any problems which are encountered or where the actual results do not agree with the defined expected results. Complete a Problem Report. (see the TBU Problem Report and System User Guide for the procedure to follow for handling problem reports.) Update Test Plan execution status in the tracking document (see the appendix entitled Matrices, Logs and indices).
· Once all problems have been resolved, re-run the necessary tests.
4.4 Create Integration Test Plan
This section provides a guide for creating an Integration Test Plan. Skeletons for the test plan, test case and results summary are available in Word for Windows. By using this template and the style codes defined, table of contents can be created that are used to create the tracking document.
· Identify subsystem interface points:
· The Design Reports identify subsystem interface points. This should provide a high level view of which subsystems are changing and what, if any, new subsystems will be created to bring the system in line with requirements.
· A review of the Detailed Designs is conducted to determine which units (and therefore, which subsystems) are changing.
· For new subsystems, or major changes to existing subsystems, the interface points must be identified by using the Detailed Designs--these contain IPCs, Tables/Files accessed and process descriptions which will help the tester to identify critical interface points.
· If the subsystem in question is not new and will not require major changes, then this points to the need for regression testing of existing interface points to test that the subsystem functions as it did before any changes were implemented.
· Divide the interface points into logical groupings (test plans). Draw the IPC diagram illustrating the interface points.
· Create test cases to test each interface:
· Enter a purpose for each test case. Identify the conditions being tested. Ensure that each statement in the purpose is proven in the Expected Results.
· Using the Detailed Design Report, identify the processes within the subsystems that are the actual interfaces. These could be messages passed between processes or data written by one process and read by another. List these processes under Interface Components Tested. If the interface is by file, identify the tables being read, written or updated and list them in the File/Table Interface Points section.
· List the steps to be followed in order to accomplish the purpose of the test:
· List the sub-test that identifies the interfaces being tested in each test case.
· Below each sub-test heading, list the steps required to accomplish the test.
· In test cases for interactive functions, describe the actions to be performed by the tester followed by the result expected from the action.
· For non-interactive tests, list the steps to be performed. This usually involves running a command file, but may also consist of listing the steps required to use an emulator or other test tool.
· Expected results statements must describe only that which is visible to the tester. Processing which cannot be proven is not to be included.
· Create test data where applicable.
· Establish the expected results for each test case:
· The Expected Results section describes the outcome of an event that was triggered by a step in a test plan. For example it may be expected that after an IPC is sent from one process and successfully received by another, a database change is made. In this instance, the Expected Results section would describe how the database should look (i.e. the changes to a file/table caused by the IPC). Once all the test cases in a test plan are defined, update the Interface Points Tested and File/Table Interface Points sections of the test plan introduction page. It is not necessary to list every software component being used in the test cases, only the specific ones being tested by the test cases. (i.e., do not repeat software components tested fully by a previous test plan, unless the software component is being used for re-configuration of the system.)
· Test Setup Notes: Identify special instructions for the test case.
· List any requirements for the test cases in the Test Setup Notes section. For example, it might be stated that it will be necessary for the tester to backup the data files used in the test case so that they may be restored for running subsequent test cases. Where possible, create scripts to automate the execution of each test case. The name of this script should be listed in the Notes section of the test case.
· Develop procedures to execute (scripts) and evaluate each test plan (i.e., produce SQLCI reports to list the contents of tables).
· Identify command files that will back up or restore the data base to the state it was in at the start or completion of each test plan and list these command file names in the NOTES section of the test case.
· Create a Test Case Tracking document.
· After completing all the Integration Test Plans, create a Tracking document (see the appendix entitled Matrices, Logs and Indices) using a spreadsheet such as Microsoft Excel.
4.5 Conduct Integration Test
Integration testing will be performed according to the following:
· Integration tests will be run according to the Integration Test Plans by the Test Team Leader or Test Team Member.
· Actual results of the test runs are presented by printing documentation (reports, file dumps) or by demonstration (screen, panel displays).
· If any of the actual results do not agree with expected results, the person performing the test will complete a Problem Report (PR).
· After the necessary action has been taken to resolve the problem, the test run will be performed again from the beginning of the test step. The Test Plan may need to be updated, depending on the results of the test.
· Update the Tracking document at least once a day. As a test is completed, either successfully or unsuccessfully, the tester should update the Tracking document. The tester's initials are to be updated each time a different person performs the test. If a test step is completed without any problem reports, the test step is considered "closed". However, if a problem is raised after running a test step, the tester will indicate this in the tracking document. The tester updates the tracking document to reflect both the number of problem reports raised by, and the problem report PRS numbers associated with, the particular test step executed. (see the tracking document template in the appendix entitled Matrices, Logs and Indices)
· The Integration Test Team meets frequently to discuss the testing activities and possible conflicts, and to review Problem Reports.
· The Integration Test Manager meets frequently with the Development Team Leaders to review Problem Reports, negotiate priorities for code fixes, and discuss support issues.
· When an error is found, do not spend a lot of time trying to debug the problem. Instead, raise a problem report providing as much detail as possible so that the person or persons resolving the problem will know what to look for. Whenever possible, dump screens, logs or tables to files or paper and forward a copy to whomever the problem report is assigned. This will help get the problem reports answered as efficiently as possible.
5. SYSTEM TESTING PROCEDURES
5.1 Introduction
The goal of System Testing is to ensure that the system performs as per the functional requirements specified by the client.
A system test covers the testing of functions within the system. System testing is performed once integration testing has been completed. System Testing procedures consist of:
· Creating Test Plans
· Creating test data
· Conducting tests according to the System Test Plan
· Reporting and reviewing the results of the test
Features to be tested during System Testing are:
· Functional Requirements
· Depending on the project, any regression tests deemed necessary
5.2 Documents Required
The following documents provide information required to create the System Test Plan and are recommended reading before starting the planning phase.
· High Level Design overview
· Problem Report Analysis Report
· Database Design Report
· FRAM
· Requirements Reports
· Change Requests
· Appropriate 3rd Party Interface Specifications
5.3 System Test Design Guidelines
The following are recommended guidelines when designing system tests:
· Design test cases to ensure that all requirements identified in the Functional Requirements Analysis Matrix document are tested by one or more test cases.
· In order to minimize the number of test cases required, design test cases to establish the presence of several related requirements.
· Each logical test case should test related functionality.
· Use test cases that already exist wherever possible.
· Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together.
· Where possible, arrange the test cases in the order the function would be performed from a business perspective.
· Design test plans to run independently of other test plans.
· Identify a procedure to set up the database as required at the start of the test.
· Design test cases and test data that reveal errors in software.
· Design test data that will ensure all conditions and qualities of data edits are covered.
· Use live or representative data as much as possible in order to provide realistic functional tests. Document any comments about setting up the test data; a minimal seed-data sketch follows this list.
· Data for most reports should come from the data prepared for testing the interactive processes. It is acceptable to have the reports contain existing data from the database.
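As an illustration only, here is a minimal Python sketch of a repeatable database setup step, using SQLite purely as a stand-in; the table layout and sample rows are hypothetical, and a real system test environment would target the project's actual database:

import sqlite3

# Representative rows, including an edge case, loaded before every system test run.
CUSTOMERS = [
    (1, "Acme Ltd", "active"),
    (2, "Globex", "suspended"),
]

def reset_system_test_db(path="system_test.db"):
    # Recreate the table and load the same representative rows so each
    # system test scenario starts from a known database state.
    conn = sqlite3.connect(path)
    try:
        conn.execute("DROP TABLE IF EXISTS customers")
        conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, status TEXT)")
        conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", CUSTOMERS)
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    reset_system_test_db()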
5.4 System Testing Steps
System Tests shall be run by the System Testing Team. A skeleton for the system test plan is available in Word for Windows. The procedure for system testing is as follows:
· Review all requirements and design documents.
· Attend system reviews presented by Development and Analysis Team members.
· Create and maintain a detailed System Test Project Plan.
· Divide the FRAM Requirements into logical groupings or scenarios. These scenarios should reflect the business or operational approach to the system.
· Define any necessary regression tests.
· Create a System Test Plan.
· Where possible, create scripts to automate the execution of a test case.
· Ensure the System Test Plan is reviewed by appropriate parties (Development and Quality Assurance).
· Verify that the System Test Environment has been created.
· Conduct the test as specified in the test cases.
· Identify any problems that are encountered or where the actual results do not agree with the defined expected results and complete a Problem Report.
· Record in the Tracking document the steps executed, relevant PRs, and test cases completed.
· Once all problems have been resolved, re-run the necessary tests.
· Update test plans after the testing is complete.
· Produce Post Project System Testing Reports
5.5 Create System Test Plan
· Obtain a copy of the Requirements Report, FRAM document, Database Design Report, and Detailed Design Report.
· Determine a table of contents for the system test plan and assign the individual test plan scenarios to testers.
· Inform QA and Publishing of the delivery dates for QA review and publication. Inform QA of any special testing strategies which will be adopted.
· Review the above-mentioned documents for the test plan scenarios to be written.
· Schedule a testing overview with the analysis and/or development teams to gather the necessary information for writing the test plan scenario.
· Determine the test cases for the test plan scenario.
· Allocate the FRAM to the appropriate test case.
· Write the test plan scenario using the system test plan template.
· Submit a copy of the test plan scenario to the appropriate parties (Analysis and Development teams) for review. The appropriate parties include a System Test peer and development. Depending on the project, the client may participate in a system testing role and may also review the test plan scenario prior to publication.
· Submit a copy of the allocated FRAM to test plan/test case to the FRAM officer. Obtain an updated FRAM document allocated to test plan/test case.
· Submit a copy of the test plan scenario which has been reviewed in a previous step to QA. Along with the test plan scenario, submit a copy of the FRAM which has been allocated to test plan/test case level.
· Upon QA review, make any updates to the test plan scenario which are deemed appropriate.
· Resubmit the test plan scenario to QA for final review.
· Submit the test plan scenario to publishing.
· Create the System Test Tracking Report once all test plan scenarios have been reviewed by QA.
· Submit the System Test Tracking Report to Publishing
5.6 Conduct System Test
· Verify that the System Test Environment has been created and that it is functional.
· Create any test data necessary for executing the system test plan scenarios.
· Execute the system test plan scenarios as assigned to each test team member.
· Create a problem report for deviations from the expected results documented in the system test plan scenario.
· Interact with support team to help resolve problem reports.
· Update the tracking report to reflect test step execution and completion.
· Depending on the project, interface with the client testing prime to communicate the system test status and issues.
· Communicate the system test status and issues to management.
· Ensure execution of the system test plan as per acceptance criteria.
· Upon system test completion, refine system test plans for final publication.
· Produce Post Project System Test Reports.
5.7 Quality Records
The following system testing documents are kept as permanent records:
· Test Plans
· Client Access Memo
· Test Results
· Integration and/or System Test Report
· Status Reports
6. PERFORMANCE TESTING
6.1 Introduction
A performance test is planned and executed on all components for which performance requirements and targets have been agreed to with the client. The complexity of the Performance Test is a function of both the number of test cases required and the level of difficulty to set up and execute each test case.
6.2 Documents Required
The following documents provide information required to create the Performance Test Plan and are recommended reading before starting the planning phase.
· Standards and Procedures Manual
· Overview Design Report
· System Description
· Database Design Report
· Detailed Design Report
· Requirements Report
· FRAM
6.3 Performance Testing Guidelines and Steps
In general, the following steps highlight what is required (a minimal transaction driver sketch follows the list):
· Ensure that all performance requirements and/or objectives, agreed to with the client, are known and documented.
· Define the test cases that are required for performance testing.
· Gather all information pertinent to volumetrics for issues of sizing as well as performance (i.e., estimated file sizes, normal and peak throughput, etc.).
· Create the Performance Test Plan.
· Ensure transaction drivers and other required utilities are developed, tested and configured prior to the commencement of testing.
· Determine the tools and methodologies required to measure the performance.
· Create the Performance Test environment.
· Create and populate the Performance Test database.
· Update all system and application configurations to reflect the test site environment.
· Execute test cases as specified in the Performance Test Plan
· If one or more performance requirements cannot be met, introduce changes to the system configuration and/or corrections to the affected unit(s).
· If the performance requirement(s) in question still cannot be met, complete a Problem Report.
· Once all problems have been resolved, re-run the necessary tests.
· Complete analysis of performance test results and document.
6.3 Quality Records
The following Performance Test documents are kept as permanent records:
· Performance Test Plan
· Performance Test Report
· Performance Test Results
Craig Borysowich (Chief Technology Tactician)
Some of my entries that continue to get heavy traffic more than a year after being posted are my '10 step guide to developing a test plan' and my 'Sample Test Plan Template' - both good lead-ins for the following post on software testing standards and procedures, which should be part of the Design and Development Standards and Procedures document.
The purpose of this document is to describe the standards and procedures to follow during the software testing phases of the SYSTEM Z project. This document supports the section on Testing and Validation in the Integration and Methods Quality Manual.
1. Scope
This document states the general standards and procedures to follow when planning and conducting software testing and validation. These standards and procedures may be changed via a change control mechanism that ensures all those concerned are notified of changes made to the steps.
2. SOFTWARE TEST PLANNING
2.1 Introduction
Software test planning is the process whereby the following are established for the testing of a given project deliverable:
· Testing requirements (scope),
· Testing approach,
· Testing tasks and deliverables,
· Estimates,
· Testing phases,
· Testing schedule,
· Completion criteria,
· Test environment and team roles and responsibilities.
2.2 Application Level Planning
High-level software test planning is conducted during the project planning phase to establish the overall plan for testing.
2.2.1 Objectives
· To identify Testing Requirements (Scope):
· to identify the software to be tested
· to identify the testing objectives
· to identify the test phases (testing coverage) required within the testing life cycle
· To identify Testing Approach:
· to identify the methods and testing tools required
· to identify any client assumptions/dependencies/limitations
· To identify Testing Tasks and Deliverables:
· to identify the activities to perform within each testing phase.
· to identify the external (client) deliverable document.
· to identify the table of contents for each deliverable
· to identify the internal deliverable documents
· to identify document deliverable reviewer
· To compile Testing Estimates:
· to identify the budgetary estimate for each identified phase of software testing
· To determine Testing Schedules:
· to identify the start and end date for each phase of software testing
· to identify all testing phase overlaps in the schedule
· to identify delivery dates for all document deliverables
· To determine Testing Phase Completion Criteria:
· to identify the completion criteria of each identified software phase
· To determine Test Environments:
· to identify the software/hardware requirements for each test phase
· to identify the number of test environments
· To identify Test Team Roles and Responsibilities:
· to identify overall testing management responsibility, as well as responsibility for each test phase
· to identify client roles and responsibilities
2.2.2 Responsibilities
· Development Manager:
· ensures that proper analysis and planning is done for the unit testing phase
· Technical Services Manager:
· ensures that proper analysis and planning is done for performance testing
· Application Test Manager:
· ensures that proper analysis and planning is done for all other test phases
2.2.3 Inputs
· First Release:
· Statement of Compliance
· Contract Proposal
· System Blueprint
· Subsequent Releases:
· Previous Releases Internal and External Deliverables
2.2.4 Method
· see Testing Work Instructions
2.2.5 Working Documents
· Test Hardware and Software Requirements
· PWW plans
· Meeting Minutes
· Testing Work Instructions
2.2.6 Deliverables
· Program Plan
· Release Estimates
· SDE Requirements Report
· System Z Standards and Procedures
2.3 Test Phase Level Planning
Software test planning is conducted at the testing phase level to establish a working plan for each phase.
2.3.1 Objectives
· To identify Testing Requirements (Scope):
· to identify the testing phase objectives
· to identify the testing activities for the phase
· to identify software load within a phase
· to identify contents of the software load(s)
· to identify special testing requirements of critical components
· To identify detailed testing tasks
· To identify the estimates for each task within the phase
· To determine Testing Schedules
· to identify the testing start and end date for each software load
· to identify internal completion dates for internal/external deliverables
· to identify start and end dates for resources (both human and physical)
· to identify training dates required for testing staff
· To specify SDE requirements
· to identify any specific setup requirements for desktop workstations
· to specify the schedule for the setup of test environment which requires SDE support
· to specify requirements for special tools (e.g., PRS, Functional Requirement Analysis Matrix, etc.)
· To identify Test Team Roles and Responsibilities
· to identify team members' roles and responsibilities
· to identify team members' skill set requirements
· to identify team member training requirements
2.3.2 Responsibilities
· Integration Test Manager:
· ensures that proper analysis and planning is done for Integration testing
· System Test Manager:
· ensures that proper analysis and planning is done for System testing
· Development Manager:
· ensures that proper analysis and planning is done for unit testing phase
· Technical Services Manager:
· ensures that proper analysis and planning is done for performance testing
2.3.3 Inputs
· First Release:
· Requirements Specification
· Functional Specification
· System Description
· System Blue Print
· Subsequent Releases:
· Test Hardware and Software Requirements
· Previous Releases Internal and External Deliverables
2.3.4 Method
· see Testing Work Instructions
2.3.5 Working Documents
· PWW plans
· Meeting Minutes
· Testing Work Instructions
3. UNIT TESTING PROCEDURES
The goal of unit testing is to ensure that all functions and features of a single compilable unit of code perform as specified in the Design Specification.
A unit test covers the testing of a software unit, or a group of closely related units, as a single entity. Unit testing is performed in isolation, using test drivers to simulate higher level units, and/or stubs to simulate lower level units. Unit Testing Procedures consist of:
· Creating a Unit Test Plan
· Creating test data
· Conducting tests according to the Unit Test Plan
· Reporting and reviewing the results of the test
These procedures are performed by the team member responsible for programming and testing of the unit.
A Unit Test Plan is a set of test cases arranged in chronological order of execution. The Unit Test Plan is created before programming of the unit starts, and the test cases should cover the functions, inputs, outputs, and function interactions of the unit.
3.1 Documents Required
The following documents provide information required to create the Unit Test Plan and are recommended reading before the plan is written.
· Design Report
· Requirements Reports
· Change Requests
3.2 Unit Test Design Guidelines
The guidelines to be followed during the creation of Unit Test Plans are:
· A test case must exist for every branch in the code (a minimal branch-coverage sketch follows this list)
· Design test cases and test data which reveal errors in software
· Design test data that will ensure all conditions and quality of data edits are covered
· Create test cases for special formulae and extreme conditions (e.g., Test case "File is Empty" shall be used for all files.)
· Test the interaction between units within the task
· To minimize the number of test cases, combine test cases into one if they test the same feature (i.e., a single test case can cover a group of units or a full task)
· Use test cases which already exist wherever possible. Include the generic test plan.
· Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together
· Where possible, arrange the test cases in the chronological order in which they will be performed
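To illustrate the branch-coverage guideline above, here is a minimal sketch using Python's standard unittest framework. The unit under test (validate_age) and its rules are hypothetical; the point is simply that every branch in the code, including the empty/extreme conditions, is exercised by at least one test case.

import unittest

def validate_age(value):
    """Hypothetical unit under test: accepts integer ages 0-130."""
    if value is None:                      # branch 1: missing input
        return "missing"
    if not isinstance(value, int):         # branch 2: wrong type
        return "invalid type"
    if value < 0 or value > 130:           # branch 3: out of range (extreme condition)
        return "out of range"
    return "ok"                            # branch 4: valid input

class ValidateAgeBranchTests(unittest.TestCase):
    def test_missing_value(self):          # covers branch 1
        self.assertEqual(validate_age(None), "missing")

    def test_invalid_type(self):           # covers branch 2
        self.assertEqual(validate_age("ten"), "invalid type")

    def test_out_of_range(self):           # covers branch 3
        self.assertEqual(validate_age(131), "out of range")

    def test_valid_value(self):            # covers branch 4
        self.assertEqual(validate_age(30), "ok")

if __name__ == "__main__":
    unittest.main()

One test case per branch keeps the plan small while still satisfying the coverage guideline.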
3.3 Unit Testing Steps
Unit tests are created and executed by the developer of the unit. The procedure for unit testing is as follows:
· Create a unit test plan following the Unit Test Plan guidelines
· Conduct the unit test as specified in the test cases
· Identify and fix or report any problems encountered
· Re-run the necessary tests
· Sign the Test Plan Cover Page (Tested By and Date)
· Package the Test Documentation and pass it to the development team leader
· The development team leader is to verify that the documentation is complete, sign the Test Plan Cover Page (Reviewed By and Date) and submit the package to Quality Assurance for review and Configuration Management for promotion
· Promote script or command files used to run the tests along with the unit
3.4 Create Unit Test Plan
· Identify Features to Test
· Using the Functional Specification, Preliminary and Detailed Design Specification (Unit Procedural Description) identify:
· All Functions performed by the Unit
· All Inputs to the Unit
· All Outputs from the Unit
· Define all ranges and discrete values of the test data necessary to run the tests
· Prepare the Unit Test Plan following the Unit Test Plan guidelines
Note: See the Appendix entitled Templates for the template to be used and a description of the cover page contents.
· Design a set of Test Cases
· Use the checklists for the five types of coverage, together with the functions, inputs and outputs outlined above, to create the minimum set of test cases for testing the functionality of the unit
· For each test case identified in the first point:
· State the condition that will be tested by the test case (this should be used as the title of the test case)
· List the steps/actions to be performed in order to accomplish the test
· For each action performed identify the expected result
· Create test data necessary to create the condition being tested and for each piece of data, indicate the expected results
For example, a test case for an invalid id on a data entry screen could be named "Invalid id". The title states the condition of the test. The procedure for testing this condition should indicate in which data entry field the cursor should be positioned and what key should be pressed to trigger the edit. A table containing the various data elements to be entered can be attached and referenced by one of the steps in the procedure. This data table could also contain the expected results for each data item to be entered.
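A data table of this kind can also be kept in machine-readable form so that it doubles as test data and as documentation of expected results. The field names and values below are hypothetical; this is only a sketch of how such a table might be structured in Python.

import csv

# Hypothetical data table for the "Invalid id" test case:
# each row holds the field the cursor is positioned in, the value to enter,
# the key that triggers the edit, and the expected result.
rows = [
    {"field": "user_id", "input": "AB@12", "trigger_key": "TAB", "expected": "Error: invalid characters"},
    {"field": "user_id", "input": "",      "trigger_key": "TAB", "expected": "Error: id is required"},
    {"field": "user_id", "input": "AB123", "trigger_key": "TAB", "expected": "Accepted"},
]

with open("invalid_id_test_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["field", "input", "trigger_key", "expected"])
    writer.writeheader()
    writer.writerows(rows)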
Note: Skeletons for the test plan and test case are available as templates in Word for Windows.
3.5 Conduct Unit Test
Unit testing is performed according to the following procedures:
· Unit tests will be run according to the Unit Test Plan created by the developer
· If any of the actual results do not agree with the expected results, the developer fixes the code and re-runs the test. The Unit Test Plan may also need to be updated if it is found to be incorrect or no longer up-to-date
· Once all test cases have been successfully completed, the developer signs the top page of the test plan, completes a promotion request form and passes the package to the developer's team leader
· The team leader verifies that the unit test has been performed and the team leader (or developer) passes the promotion request to Configuration Management
4. INTEGRATION TESTING PROCEDURES
The goal of integration testing is to ensure that all interacting subsystems in a system interface correctly with one another to produce the desired results. Furthermore, in trying to attain this goal, integration tests will ensure that the introduction of one or more subsystems into the system does not have an adverse effect on existing functionality.
An integration test covers the testing of interface points between subsystems. Integration testing is performed once unit testing has been completed for all units contained in the subsystems being tested. Integration Testing Procedures consist of:
· Creating an integration test plan
· Creating test data
· Conducting tests according to the integration test plan
· Reporting and reviewing the results of the test
During this phase, the interaction between subsystems is tested. This includes interfaces through Inter Process Communications (IPC) and files. This phase is performed by an independent test team. This team prepares and executes integration tests, generates problem reports and is responsible for passing the integrated system on to the System Test Team for system testing. The Integration Test team then enters a support mode in which it will test problem reports generated by the System Test team before forwarding code fixes to the System Testing environment.
This phase is sometimes combined with the system test phase as per the client's request.
4.1 Documents Required
The following documents provide information required to create the Integration Test Plan and are recommended reading before starting the planning phase.
· System Blueprint
· High Level Design overview from the developers
· Detailed Design
· Entity Relationship Diagrams
· System Requirements Report
· Change Requests
4.2 Integration Test Design Guidelines
The guidelines to be followed during the creation of Integration Test Plans are:
· The number of new units or tasks to be tested by one Test Plan should not exceed five.
· To minimize the number of test cases, combine test cases into one if they test the same interface point.
· Use test cases which already exist. Portions of available Unit Test Plans or System Test Plans can be used where applicable.
· When testing few or minor changes to an existing subsystem, structure the Test Plan such that the bulk of it will be regression testing using an existing Integration or System Test plan. New test cases can then be added to cover the detailed testing of changes.
· Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together. Where possible, arrange the test cases in the order they will be performed.
· Whenever possible, design test plans to run independently of other test plans.
· Design test cases and test data which reveal errors in the interaction between the software components (check the various response codes returned by calls to external interfaces).
4.3 Integration Testing Steps
Integration tests will be created and performed by designated members of each team. Each individual will be responsible for the preparation of all test cases, procedures and data, as well as for conducting and documenting the tests. Each individual will also be responsible for the specification of all additional tools and facilities required for the integration testing of their tasks. The procedure for integration testing is as follows:
· Review all relevant design documentation and attend all design overviews/walkthroughs.
· Create an integration test plan.
· Where possible, create scripts to automate the execution of the test case (a minimal automation sketch follows this list).
· Arrange to have Integration Test Plans reviewed by Development for technical accuracy. The Test Plans may have to be updated after these reviews to incorporate changes suggested by the Developers.
· Conduct the test as specified in the test cases.
· Identify any problems which are encountered or where the actual results do not agree with the defined expected results, and complete a Problem Report (see the TBU Problem Report and System User Guide for the procedure to follow for handling problem reports). Update Test Plan execution status in the tracking document (see the appendix entitled Matrices, Logs and Indices).
· Once all problems have been resolved, re-run the necessary tests.
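Where the environment allows it, the automation scripts mentioned above can be as simple as a wrapper that runs each test case's command file and records the outcome for the tracking document. The test case names, command files and log layout below are hypothetical; this is a sketch in Python, not a prescribed tool.

import subprocess
import datetime

# Hypothetical list of test cases and the command files that execute them.
test_cases = [
    ("TC01_order_feed_interface", "run_tc01.sh"),
    ("TC02_billing_ipc",          "run_tc02.sh"),
]

with open("integration_test_log.txt", "a") as log:
    for name, command_file in test_cases:
        started = datetime.datetime.now().isoformat(timespec="seconds")
        result = subprocess.run(["sh", command_file], capture_output=True, text=True)
        # The tester still confirms the output against the documented expected results.
        status = "PASS" if result.returncode == 0 else "INVESTIGATE"
        log.write(f"{started}  {name}  {status}\n")
        if status != "PASS":
            # Keep the output so it can be attached to a Problem Report.
            with open(f"{name}_output.txt", "w") as out:
                out.write(result.stdout + result.stderr)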
4.4 Create Integration Test Plan
This section provides a guide for creating an Integration Test Plan. Skeletons for the test plan, test case and results summary are available in Word for Windows. By using this template and the style codes defined, a table of contents can be created and used to build the tracking document.
· Identify subsystem interface points:
· The Design Reports identify subsystem interface points. This should provide a high level view of which subsystems are changing and what, if any, new subsystems will be created to bring the system in line with requirements.
· A review of the Detailed Designs is conducted to determine which units (and therefore, which subsystems) are changing.
· For new subsystems, or major changes to existing subsystems, the interface points must be identified by using the Detailed Designs--these contain IPCs, Tables/Files accessed and process descriptions which will help the tester to identify critical interface points.
· If the subsystem in question is not new and will not require major changes, this points to the need for regression testing of existing interface points to confirm that the subsystem functions as it did before any changes were implemented.
· Divide the interface points into logical groupings (test plans). Draw the IPC diagram illustrating the interface points.
· Create test cases to test each interface:
· Enter a purpose for each test case. Identify the conditions being tested. Ensure that each statement in the purpose is proven in the Expected Results.
· Using the Detailed Design Report, identify the processes within the subsystems that are the actual interfaces. These could be messages passed between processes or data written by one process and read by another. List these processes under Interface Components Tested. If the interface is by file, identify the tables being read, written or updated and list them in the File/Table Interface Points section.
· List the steps to be followed in order to accomplish the purpose of the test:
· List the sub-test that identifies the interfaces being tested in each test case.
· Below each sub-test heading, list the steps required to accomplish the test.
· In test cases for interactive functions, describe the actions to be performed by the tester followed by the result expected from the action.
· For non-interactive tests, list the steps to be performed. This usually involves running a command file, but may also consist of listing the steps required to use an emulator or other test tool.
· Expected results statements must describe only that which is visible to the tester. Processing which cannot be proven is not to be included.
· Create test data where applicable.
· Establish the expected results for each test case:
· The Expected Results section describes the outcome of an event that was triggered by a step in a test plan. For example, it may be expected that after an IPC is sent from one process and successfully received by another, a database change is made. In this instance, the Expected Results section would describe how the database should look (i.e., the changes to a file/table caused by the IPC).
· Once all the test cases in a test plan are defined, update the Interface Points Tested and File/Table Interface Points sections of the test plan introduction page. It is not necessary to list every software component being used in the test cases, only the specific ones being tested by the test cases (i.e., do not repeat software components tested fully by a previous test plan, unless the software component is being used for re-configuration of the system).
· Test Setup Notes: Identify special instructions for the test case.
· List any requirements for the test cases in the Test Setup Notes section. For example, it might be stated that it will be necessary for the tester to backup the data files used in the test case so that they may be restored for running subsequent test cases. Where possible, create scripts to automate the execution of each test case. The name of this script should be listed in the Notes section of the test case.
· Develop procedures (scripts) to execute and evaluate each test plan (e.g., produce SQLCI reports to list the contents of tables).
· Identify command files that will back up or restore the database to the state it was in at the start or completion of each test plan, and list these command file names in the Notes section of the test case.
· Create a Test Case Tracking document.
· After completing all the Integration Test Plans, create a Tracking document (see the appendix entitled Matrices, Logs and Indices) using a spreadsheet such as Microsoft Excel.
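If a spreadsheet tool is not convenient, the same tracking document can be generated as a CSV file that Excel can open. The column names below follow the tracking information described in these procedures (test plan, test case, step, tester initials, status and associated problem report numbers); the values shown are hypothetical.

import csv

# One row per test step in the Integration Test Plans (hypothetical entries).
tracking_rows = [
    {"test_plan": "ITP-01", "test_case": "TC01", "step": 1, "tester": "JB", "status": "Closed",  "problem_reports": ""},
    {"test_plan": "ITP-01", "test_case": "TC02", "step": 1, "tester": "JB", "status": "Problem", "problem_reports": "PR-0042"},
    {"test_plan": "ITP-02", "test_case": "TC01", "step": 1, "tester": "",   "status": "Not run", "problem_reports": ""},
]

with open("integration_test_tracking.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(tracking_rows[0].keys()))
    writer.writeheader()
    writer.writerows(tracking_rows)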
4.5 Conduct Integration Test
Integration testing will be performed according to the following:
· Integration tests will be run according to the Integration Test Plans by the Test Team Leader or Test Team Member.
· Actual results of the test runs are presented by printing documentation (reports, file dumps) or by demonstration (screen, panel displays).
· If any of the actual results do not agree with expected results, the person performing the test will complete a Problem Report (PR).
· After the necessary action has been taken to resolve the problem, the test run will be performed again from the beginning of the test step. The Test Plan may need to be updated, depending on the results of the test.
· Update the Tracking document at least once a day. As a test is completed, either successfully or unsuccessfully, the tester should update the Tracking document. The tester's initials are to be updated each time a different person performs the test. If a test step is completed without any problem reports, the test step is considered "closed". However, if a problem is raised after running a test step, the tester will indicate this in the tracking document, recording both the number of problem reports raised by, and the problem report PRS numbers associated with, the particular test step executed (see the tracking document template in the appendix entitled Matrices, Logs and Indices).
· The Integration Test Team meets frequently to discuss testing activities and possible conflicts, and to review Problem Reports.
· The Integration Test Manager meets frequently with the Development Team Leaders to review Problem Reports, negotiate priorities for code fixes, and discuss support issues.
· When an error is found, do not spend a lot of time trying to debug the problem. Instead, raise a problem report providing as much detail as possible so that the person or persons resolving the problem will know what to look for. Whenever possible, dump screens, logs, or tables to files or paper and forward a copy to whomever the problem reports are assigned. This will help everyone get problem reports answered as efficiently as possible.
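As an illustration of the point above, a small helper can package the details a problem report needs (test step, expected versus actual results, and any dumped logs or tables) into one place for whoever the report is assigned to. The fields and file names below are hypothetical; the authoritative procedure remains the one described in the TBU Problem Report and System User Guide.

import datetime
from pathlib import Path

def package_problem_report(pr_number, test_plan, test_step, expected, actual, attachments=()):
    """Write a plain-text problem report stub plus copies of any evidence files."""
    report_dir = Path(f"problem_reports/{pr_number}")
    report_dir.mkdir(parents=True, exist_ok=True)
    body = (
        f"Problem Report: {pr_number}\n"
        f"Raised: {datetime.date.today().isoformat()}\n"
        f"Test plan/step: {test_plan} / {test_step}\n"
        f"Expected result: {expected}\n"
        f"Actual result: {actual}\n"
        f"Attachments: {', '.join(attachments) or 'none'}\n"
    )
    (report_dir / "report.txt").write_text(body)
    for attachment in attachments:
        # Copy screen dumps, logs or table listings alongside the report text.
        (report_dir / Path(attachment).name).write_bytes(Path(attachment).read_bytes())

# Example call (hypothetical values):
# package_problem_report("PR-0042", "ITP-01", "TC02 step 3",
#                        "Order status set to BILLED", "Order status unchanged",
#                        attachments=["TC02_output.txt"])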
5. SYSTEM TESTING PROCEDURES
5.1 Introduction
The goal of System Testing is to ensure that the system performs as per the functional requirements specified by the client.
A system test covers the testing of functions within the system. System testing is performed once integration testing has been completed. System Testing procedures consist of:
· Creating Test Plans
· Creating test data
· Conducting tests according to the System Test Plan
· Reporting and reviewing the results of the test
Features to be tested during System Testing are:
· Functional Requirements
· Depending on the project, any regression tests deemed necessary
5.2 Documents Required
The following documents provide information required to create the System Test Plan and are recommended reading before starting the planning phase.
· High Level Design overview
· Problem Report Analysis Report
· Database Design Report
· FRAM
· Requirements Reports
· Change Requests
· Appropriate 3rd Party Interface Specifications
5.3 System Test Design Guidelines
The following are recommended guidelines when designing system tests:
· Design test cases to ensure that all requirements identified in the Functional Requirements Analysis Matrix document are tested by one or more test cases.
· To minimize the number of test cases required, design test cases that demonstrate several related requirements together.
· Each logical test case should test related functionality.
· Use test cases that already exist wherever possible.
· Arrange test cases in the order that minimizes the effort required for test setup and that keeps related functions together.
· Where possible, arrange the test cases in the order the function would be performed from a business perspective.
· Design test plans to run independently of other test plans.
· Identify a procedure to set up the database as required at the start of the test (a minimal setup/restore sketch follows this list).
· Design test cases and test data that reveal errors in software.
· Design test data that will ensure all conditions and qualities of data edits are covered.
· Use live or representative data as much as possible in order to provide realistic functional tests. Any comments about setting up the test data are to be documented.
· Data for most reports should come from the data prepared for testing the interactive processes. It is acceptable to have the reports contain existing data from the database.
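The database setup guideline above is often satisfied by a script that restores a known baseline before each test plan runs and then layers on any scenario-specific data. The sketch below uses SQLite purely as a stand-in; the actual database, table names and restore mechanism on a given project would differ.

import shutil
import sqlite3

BASELINE_DB = "system_test_baseline.db"   # hypothetical snapshot prepared once
WORKING_DB = "system_test_working.db"     # copy used by the current test run

def reset_test_database():
    """Restore the working database to the agreed baseline before a test plan starts."""
    shutil.copyfile(BASELINE_DB, WORKING_DB)

def load_scenario_data(rows):
    """Insert scenario-specific rows on top of the baseline (hypothetical schema)."""
    with sqlite3.connect(WORKING_DB) as conn:
        conn.executemany(
            "INSERT INTO customers (customer_id, name, status) VALUES (?, ?, ?)", rows
        )

# Example call (hypothetical data):
# reset_test_database()
# load_scenario_data([("C001", "Test Customer One", "ACTIVE")])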
5.4 System Testing Steps
System Tests shall be run by the System Testing Team. A skeleton for the system test plan is available in Word for Windows. The procedure for system testing is as follows:
· Review all requirements and design documents.
· Attend system reviews presented by Development and Analysis Team members.
· Create and maintain a detailed System Test Project Plan.
· Divide the FRAM Requirements into logical groupings or scenarios. These scenarios should reflect the business or operational approach to the system.
· Define any necessary regression tests.
· Create a System Test Plan.
· Where possible, create scripts to automate the execution of a test case.
· Ensure the System Test Plan is reviewed by appropriate parties (Development and Quality Assurance).
· Verify that the System Test Environment has been created.
· Conduct the test as specified in the test cases.
· Identify any problems that are encountered or where the actual results do not agree with the defined expected results and complete a Problem Report.
· Record in the Tracking document the steps executed, relevant PRs, and test cases completed.
· Once all problems have been resolved, re-run the necessary tests.
· Update test plans after the testing is complete.
· Produce Post Project System Test Reports.
5.5 Create System Test Plan
· Obtain a copy of the Requirements Report, FRAM document, Database Design Report, and Detailed Design Report.
· Determine a table of contents for the system test plan and assign the individual test plan scenarios to testers.
· Inform QA and publishing of delivery dates for QA review and publishing. Inform QA of any special testing strategies which will be adopted.
· Review the above-mentioned documents for the test plan scenarios to be written.
· Schedule a testing overview with the analysis and/or development teams to gather the necessary information for writing the test plan scenario.
· Determine the test cases for the test plan scenario.
· Allocate the FRAM requirements to the appropriate test cases (a minimal traceability sketch follows this list).
· Write the test plan scenario using the system test plan template.
· Submit a copy of the test plan scenario to the appropriate parties for review. The appropriate parties include a System Test peer and the Analysis and Development teams. Depending on the project, the client may participate in a system testing role and may also review the test plan scenario prior to publication.
· Submit a copy of the FRAM allocation (FRAM requirements mapped to test plans/test cases) to the FRAM officer, and obtain an updated FRAM document reflecting that allocation.
· Submit the reviewed test plan scenario to QA, along with a copy of the FRAM allocated to the test plan/test case level.
· Upon QA review, make any updates to the test plan scenario which are deemed appropriate.
· Resubmit the test plan scenario to QA for final review.
· Submit the test plan scenario to publishing.
· Create the System Test Tracking Report once all test plan scenarios have been reviewed by QA.
· Submit the System Test Tracking Report to Publishing.
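The FRAM allocation step above is essentially a traceability matrix, and keeping it in a simple machine-readable form makes it easy to hand to the FRAM officer and to QA. The requirement and test case identifiers below are hypothetical; this is only a sketch of the structure.

import csv

# Hypothetical allocation of FRAM requirements to test plan scenarios / test cases.
allocation = [
    {"requirement": "FR-017", "test_plan_scenario": "STP-Billing-01", "test_case": "TC03"},
    {"requirement": "FR-018", "test_plan_scenario": "STP-Billing-01", "test_case": "TC04"},
    {"requirement": "FR-022", "test_plan_scenario": "STP-Orders-02",  "test_case": "TC01"},
]

with open("fram_allocation.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["requirement", "test_plan_scenario", "test_case"])
    writer.writeheader()
    writer.writerows(allocation)

# A quick completeness check: every requirement should appear at least once.
covered = {row["requirement"] for row in allocation}
print(f"{len(covered)} requirements allocated to test cases")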
5.6 Conduct System Test
· Verify that the System Test Environment has been created and that it is functional.
· Create any test data necessary for executing the system test plan scenarios (a minimal data generation sketch follows this list).
· Execute the system test plan scenarios as assigned to each test team member.
· Create a problem report for deviations from the expected results documented in the system test plan scenario.
· Interact with support team to help resolve problem reports.
· Update the tracking report to reflect test step execution and completion.
· Depending on the project, interface with the client testing prime to communicate the system test status and issues.
· Communicate the system test status and issues to management.
· Ensure the system test plan is executed as per the acceptance criteria.
· Upon system test completion, refine system test plans for final publication.
· Produce Post Project System Test Reports.
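For the test data step above, when representative production data is not available, a small generator can produce realistic-looking records for the scenarios. Everything below (field names, value ranges, output file) is hypothetical and would be driven by the actual scenario's data requirements.

import csv
import random

random.seed(20090124)  # fixed seed so re-runs produce the same data set

# Generate hypothetical customer records for a system test scenario.
records = [
    {
        "customer_id": f"C{n:05d}",
        "region": random.choice(["EAST", "WEST", "NORTH", "SOUTH"]),
        "credit_limit": random.choice([500, 1000, 5000, 10000]),
        "status": random.choice(["ACTIVE", "SUSPENDED"]),
    }
    for n in range(1, 101)
]

with open("system_test_customers.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["customer_id", "region", "credit_limit", "status"])
    writer.writeheader()
    writer.writerows(records)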
5.7 Quality Records
The following system testing documents are kept as permanent records:
· Test Plans
· Client Access Memo
· Test Results
· Integration and/or System Test Report
· Status Reports
6. Performance Testing
6.1 Introduction
A performance test is planned and executed on all components for which performance requirements and targets have been agreed to with the client. The complexity of the Performance Test is a function of both the number of test cases required and the level of difficulty to set up and execute each test case.
6.2 Documents Required
The following documents provide information required to create the Performance Test Plan and are recommended reading before starting the planning phase.
· Standards and Procedures Manual
· Overview Design Report
· System Description
· Database Design Report
· Detailed Design Report
· Requirements Report
· FRAM
6.3 Performance Testing Guidelines and Steps
In general, the following steps highlight what is required:
· Ensure that all performance requirements and/or objectives, agreed to with the client, are known and documented.
· Define the test cases that are required for performance testing.
· Gather all volumetric information relevant to both sizing and performance (e.g., estimated file sizes, normal and peak throughput).
· Create the Performance Test Plan.
· Ensure transaction drivers and other required utilities are developed, tested and configured prior to the commencement of testing.
· Determine the tools and methodologies required to measure performance (a minimal measurement sketch follows this list).
· Create the Performance Test environment.
· Create and populate the Performance Test database.
· Update all system and application configurations to reflect the test site environment.
· Execute test cases as specified in the Performance Test Plan.
· If one or more performance requirements cannot be met, introduce changes to the system configuration and/or corrections to the affected unit(s).
· If the performance requirement(s) in question still cannot be met, complete a Problem Report.
· Once all problems have been resolved, re-run the necessary tests.
· Complete the analysis of the performance test results and document the findings.
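As an illustration of measuring against agreed targets, the sketch below times a batch of transactions and compares the observed response times and throughput with hypothetical thresholds. The transaction itself, the volumes and the targets are all placeholders; real measurements would come from the tools and drivers chosen for the project.

import time
import statistics

TARGET_P95_SECONDS = 2.0      # hypothetical agreed response-time target
TARGET_THROUGHPUT = 50        # hypothetical target transactions per second

def sample_transaction():
    """Placeholder for the real transaction driver."""
    time.sleep(0.01)

durations = []
start = time.perf_counter()
for _ in range(500):          # hypothetical test volume
    t0 = time.perf_counter()
    sample_transaction()
    durations.append(time.perf_counter() - t0)
elapsed = time.perf_counter() - start

p95 = statistics.quantiles(durations, n=20)[-1]   # 95th percentile response time
throughput = len(durations) / elapsed

print(f"95th percentile response: {p95:.3f}s (target {TARGET_P95_SECONDS}s)")
print(f"Throughput: {throughput:.1f} tx/s (target {TARGET_THROUGHPUT} tx/s)")
print("PASS" if p95 <= TARGET_P95_SECONDS and throughput >= TARGET_THROUGHPUT else "RAISE PROBLEM REPORT")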
6.4 Quality Records
The following Performance Test documents are kept as permanent records:
· Performance Test Plan
· Performance Test Report
· Performance Test Results