By: Dina Hanbazaza
Outline
Software testing is the evaluation of software that is being developed,
to check its capability and ability to deliver the intended results.
Introduction
Testing Process
Testing Methodology
Regression Testing
Introduction
 At the first stage, testing is the process of executing a program
with the intent of finding errors.
 At the last stage, testing is the process of demonstrating that
errors are not present (after all known bugs have been fixed).
 When you test a program, you want to add some value to it.
Adding value through testing means raising the quality or
reliability of the program. Raising the reliability of the program
means finding and removing errors.
 Therefore, don’t test a program to show that it works; rather, you
should start with the assumption that the program contains errors
and then test the program to find as many of the errors as
possible.
Why Testing?
 To find and correct defects.
 To check whether the user’s needs are satisfied.
 To avoid users detecting problems.
 To provide a quality product.
Why does S/W have bugs?
 Miscommunication or No Communication
(when it is not clear what an application should or shouldn’t do)
 Time Pressure
 Programming Mistakes
 Changing Requirements
Misunderstandings about Testing
 Testing is debugging
 Testing is not the job of a programmer
 If programmers were more careful, testing would be
unnecessary.
 Testing activities start only after the coding is complete
 Testing never ends
 Testing is not a creative task
Testing Process
We can divide the activities within the test process into the following
basic steps:
 Test Planning
 Test Analysis and Design
 Test Execution
 Test Evaluating and Reporting
Testing Process
Test Planning:
- Test Objective
- Test Schedule
- Resource Allocation
- Test Exit Criteria
Test Analysis & Design:
- SRS study
- Create Test Design
- Review & Approval
- Develop test cases, test scripts
- Prepare test data
Test Execution:
- Perform testing
- Bug Fixing
- Bug Tracking (Re-test)
- Test Log
Test Evaluating:
- Analyze root causes of defects
- Identify actions
- Write Test Report
Test Planning
Test Planning has the following major tasks:
 To determine the scope and risks and
identify the objectives of testing.
 To determine the required test resources like people, test
environments, PCs, etc.
 To schedule test analysis and design tasks, test execution
and evaluation.
 To determine the Exit criteria.
Test Analysis & Design
Test Analysis and Design has the following major tasks:
 To review the SRS (Software Requirement Specification)
to understand what the system should do.
 To identify test conditions and requirements.
 To design Test Cases (success/failure criteria).
 To design the test environment set-up and
identify any required infrastructure and tools.
 To prepare the test data.
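To make the design tasks above concrete, here is a minimal Python sketch of test cases captured as data (input, expected result, description) together with prepared test data; the `validate_age` function and its 18-65 rule are assumptions invented for illustration, not part of these slides.

```python
# Illustrative only: test cases designed as a data table for an assumed
# validate_age() function, plus prepared test data.

def validate_age(age):
    """Assumed function under test: accepts ages 18-65 inclusive."""
    return 18 <= age <= 65

# Test conditions identified during analysis, captured as a design table:
# (test_id, input, expected, description)
TEST_CASES = [
    ("TC-01", 18, True,  "lower boundary accepted"),
    ("TC-02", 65, True,  "upper boundary accepted"),
    ("TC-03", 17, False, "below lower boundary rejected"),
    ("TC-04", 66, False, "above upper boundary rejected"),
]

def run_designed_cases():
    for test_id, value, expected, description in TEST_CASES:
        actual = validate_age(value)
        status = "PASS" if actual == expected else "FAIL"
        print(f"{test_id}: {status} - {description}")

if __name__ == "__main__":
    run_designed_cases()
```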
Test Execution
Test Execution has the following major tasks:
 To execute tests following the Test Methodology.
 To re-execute the tests that previously failed in order to confirm
a fix. This is known as confirmation testing or re-testing.
 To log the outcome of the test execution and the versions of the
software under test. (Test Log)
 To compare actual results with expected results.
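As a rough illustration of the logging and comparison tasks above, the following Python sketch records each execution in a simple test log with the software version, expected and actual results, and the verdict; the log format, file name, and values are assumptions for illustration only.

```python
# Hypothetical test log sketch: each execution records the software version
# under test, the expected and actual results, and the verdict.
import datetime

def log_result(test_id, version, expected, actual):
    verdict = "PASS" if actual == expected else "FAIL"
    entry = {
        "test_id": test_id,
        "software_version": version,
        "expected": expected,
        "actual": actual,
        "verdict": verdict,
        "executed_at": datetime.datetime.now().isoformat(timespec="seconds"),
    }
    # Append to a plain-text test log so failed tests can be re-run later.
    with open("test_log.txt", "a") as log:
        log.write(f"{entry}\n")
    return verdict

# Example: comparing an actual result against the expected result.
print(log_result("TC-03", "1.2.0", expected=False, actual=False))
```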
Test Evaluating and Reporting
This step has the following major tasks:
 To check the test logs against the exit criteria specified in
test planning.
 To analyze root causes of defects and identify necessary
actions.
 To write a Test Report.
What are Software Testing Methodologies?
Software testing methodologies are the different
approaches and ways of ensuring that a software
application is fully tested.
Software testing methodologies encompass two parts:
Functional Testing
The functional testing part of a testing methodology is typically broken
down into four components, usually executed in this order:
 Unit Testing
 Integration Testing
 System Testing
 Acceptance Testing
Unit Testing
 Purpose: To verify that the component/module
functions work properly.
 Check:
 internal data structures
 logic
 boundary conditions for input/output data
 Method: White box testing
 Done by: Developers
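A minimal sketch of such a developer-written (white box) unit test using Python's unittest; the `apply_discount` function and its values are hypothetical and only illustrate checking logic and boundary conditions.

```python
# Illustrative unit test for an assumed apply_discount() function.
import unittest

def apply_discount(price, percent):
    """Assumed unit under test: returns the discounted price."""
    if percent < 0 or percent > 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_typical_value(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_boundary_conditions(self):
        # Boundary conditions for input data: 0% and 100% discount.
        self.assertEqual(apply_discount(80.0, 0), 80.0)
        self.assertEqual(apply_discount(80.0, 100), 0.0)

    def test_invalid_input_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(80.0, 101)

if __name__ == "__main__":
    unittest.main()
```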
Integration Testing
 Purpose: To verify that modules/components which have
been successfully unit tested work properly when integrated
together to perform specific tasks and activities.
 This testing is usually done with a combination
of automated functional tests and manual testing.
 Method: Black box testing
 Done by: Independent Test Team
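The sketch below illustrates the idea with two hypothetical modules that have already passed unit testing (an in-memory `OrderStore` and a `checkout` function) exercised together through one task; both modules and all values are invented for illustration.

```python
# Illustrative integration test: two separately unit-tested modules are
# exercised together through a single task.
class OrderStore:
    def __init__(self):
        self._orders = {}

    def save(self, order_id, amount):
        self._orders[order_id] = amount

    def total(self):
        return sum(self._orders.values())

def checkout(store, order_id, amount, discount_percent):
    """Integrates the store with a simple pricing rule."""
    final_amount = amount * (1 - discount_percent / 100)
    store.save(order_id, final_amount)
    return final_amount

def test_checkout_updates_store_total():
    store = OrderStore()
    checkout(store, "A-1", 100.0, 10)
    checkout(store, "A-2", 50.0, 0)
    # The two modules work together: the total reflects discounted amounts.
    assert store.total() == 140.0

if __name__ == "__main__":
    test_checkout_updates_store_total()
    print("integration test passed")
```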
System Testing
 Purpose: To verify that all system elements work properly
and that overall system function and performance has
been achieved.
 This test is carried out by interfacing the hardware and
software components of the entire system (which have
been previously unit tested and integration tested), and
then testing it as a whole.
 Method: Black box testing
 Done by: Independent Test Team
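As a black box sketch of testing the system as a whole, the example below exercises an assumed deployed system only through its external HTTP interface; the URL, endpoints, expected responses, and the use of the third-party `requests` library are all assumptions for illustration.

```python
# Illustrative system test: the fully deployed system is exercised through
# its external interface only (requires a running test environment).
import requests

BASE_URL = "http://test-env.example.com"  # assumed system-test environment

def test_end_to_end_order_flow():
    # Create an order through the public API...
    create = requests.post(f"{BASE_URL}/orders",
                           json={"item": "book", "qty": 1}, timeout=10)
    assert create.status_code == 201
    order_id = create.json()["id"]

    # ...then verify the whole system (API, business logic, storage) agrees.
    fetch = requests.get(f"{BASE_URL}/orders/{order_id}", timeout=10)
    assert fetch.status_code == 200
    assert fetch.json()["item"] == "book"

if __name__ == "__main__":
    test_end_to_end_order_flow()
    print("system test passed")
```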
Acceptance Testing
 Purpose: To ensure that the software that has been developed
operates as expected and meets all user requirements.
 There are two types of acceptance testing:
 Alpha Testing: it is carried out by members of the
development team, and is known as internal acceptance testing.
 Beta Testing: it is carried out by the customer, and is known as
external acceptance testing.
 Method: Black box testing
Regression Testing
What is Regression Testing?
Regression testing is a type of testing carried out to ensure that fixes or
enhancement changes do not impact previously working functionality.
Regression testing is required when there is a:
 change in requirements and the code is modified according to the new
requirement
 new feature added to the software
 defect fix
 performance issue fix
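A minimal regression-test sketch: previously working behaviour is pinned down as tests that are re-run after every fix or enhancement; the `shipping_fee` function and its rules are hypothetical examples, not taken from these slides.

```python
# Illustrative regression suite: these tests stay unchanged and are re-run
# after defect fixes, new features, or performance work to catch regressions.
import unittest

def shipping_fee(order_total):
    """Assumed existing behaviour: free shipping from 50.0 upwards."""
    return 0.0 if order_total >= 50.0 else 4.99

class ShippingFeeRegressionTest(unittest.TestCase):
    def test_existing_free_shipping_threshold_still_holds(self):
        self.assertEqual(shipping_fee(50.0), 0.0)

    def test_existing_small_order_fee_still_holds(self):
        self.assertEqual(shipping_fee(10.0), 4.99)

if __name__ == "__main__":
    unittest.main()
```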
Unit Testing Level (Semantic)
Areas to focus on during the unit level:
 Test cases which have frequent defects.
 Functionality which is more visible to
the users.
 Test cases which verify core features of
the product.
 Test cases for functionality which has
undergone frequent and recent changes.
 A sample of successful test cases.
 A sample of failed test cases.
Testing Environment: Developer Interface
[Flow: build tests for version x → run tests for version x (with test data) → build results for version x → compare → verdict]
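A small sketch of the compare-and-verdict step in the flow above: the results of running the suite against version x are compared with the baseline results, and any test that previously passed but now fails produces a FAIL verdict; the test ids and statuses are illustrative assumptions.

```python
# Illustrative compare-and-verdict step for the regression flow.

def compare(baseline, current):
    """Verdict fails if any test that passed on the baseline now fails."""
    regressions = [test_id for test_id, status in current.items()
                   if status == "FAIL" and baseline.get(test_id) == "PASS"]
    if regressions:
        return "FAIL (regressions: " + ", ".join(regressions) + ")"
    return "PASS"

if __name__ == "__main__":
    baseline = {"TC-01": "PASS", "TC-02": "PASS"}   # previous version's results
    current = {"TC-01": "PASS", "TC-02": "FAIL"}    # results for version x
    print("Verdict:", compare(baseline, current))
```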
Integration Testing Level (Semantic)
Modules to test: M8, M5, M1, and Main at the unit level.
Re-integration: steps 1, 2, and 3 at the integration level.
Testing Environment: Developer Interface
[Figure: module hierarchy Main A → M1, M2, M3 → M4, M5, M6, M7 → M8, with M8 as the changed module and re-integration steps 1, 2, 3 along the path M8 → M5 → M1 → Main A]
System Testing Level (Semantic)
Modules to test: Main A and Main B at the system level.
Testing Environment: Developer Interface
[Figure: the changed module M8 propagates through M5, M1, and Main A in System A (steps 1, 2, 3); System A interfaces with Main B in System B (step 4)]
Regression Process in the Developer Interface (Syntax)
Unit Testing Level
Integration Testing Level
Acceptance Testing Level
Acceptance Testing Level (Semantic)
Once the user has a regression test (at the acceptance level), it is important to
update (redevelop) it each time they:
• Fix a bug
• Add, change or remove functionality
• Change platform
Testing Environment: Testing Interface
(PATSTCMD Machine)
Release Step (Syntax)
 Once the user finishes regression testing in the
Testing Interface, the product can be released to Production.
 Finally, the product can be sent to LIVE.
Finally
Have good interaction, understand the tasks,
close the bugs, and in the end deliver a quality
product. This helps us achieve what we have
aimed for and takes our company to the top.
THANK YOU
