Quality Assurance: Release Process & Strategy
Process General Details
Process Name | Testing Process for a Release |
---|---|
Process Description | The Quality Assurance process is the final gate before a release goes to market. It assesses and helps ensure that a good-quality release is developed and deployed, as per the requirements set by the Product Management team. |
Process Scope | The testing cycle for a release is a multi-step process. It is initiated when a release scope is identified and continues until the final deployment to market, including the post-release bug analysis for that release. |
Process Roles | QA Lead, Tester |
Process Boundary | The first discussion on the scoping of a release marks the beginning of the QA cycle. A period ending 2 weeks after the deployment to production marks the end of that release cycle for QA. |
Process Stages | 1. Release Scoping, 2. Test Case Design, 3. Test Case Execution in Staging (System Testing), 4. Test Case Execution in Pre-Prod (UAT), 5. Final Sign-off and Production Deployment |
Process Flowcharts
1. Release Scoping: QA role & participation
Input | Discussions related to the scope of a release are the primary trigger here |
---|---|
Output | |
Role(s) Involved | QA, Release Manager, Product Manager, Tech Manager |
Stage Activities | QA is to be involved in the initial scoping discussions. This enables the team to have a clear view of the purpose of a particular release, the exact asks from the Product side, and the proposed timelines against which the release is planned. |
2. Test Case Design
Input | A completed, finalized scope, with PRDs written and linked to the respective stories in JIRA, is the basic requirement to start this stage |
---|---|
Output | |
Role(s) Involved | QA, Product Manager |
Stage Activities | QA must be a participant in, or an active recipient of, any discussions related to the stories and enhancements tagged to the release. For this they should engage actively in all scrum-of-scrums meetings and grooming sessions. |
3. Test Case Execution in Staging Environment (System testing)
Input | Deployment of the build after code freeze is the trigger for this stage |
---|---|
Output | |
Role(s) Involved | QA, Tech Manager, Product Manager |
Test Environment
Testing will be done in 2 environments: the first cycle in Staging (Stage 3: system testing) and the second cycle in Pre-Prod (Stage 4: UAT).
No. | Entry Criteria: QA | Responsibility | Remarks |
---|---|---|---|
1 | Agreed scope for the release (Functional + Technical) vs. actually delivered:<br>- List of items added to the agreed scope of the release<br>- List of items which have missed the agreed scope of the release, aligned with the PM<br>- List of partially delivered items which can be deployed and are testable<br>- All items delivered should be in Released status | Release Manager | |
2 | Unit test report with 75% coverage for each module | Program Manager | Exceptional approvals to be done on a case-to-case basis |
3 | Deployment tracker to be shared and signed off by DC | Release Manager | |
4 | Code freeze: no new features or bug fixes will be delivered after handover to QA | Release Manager | Exception is hotfix |
Value Add | Every end-user-facing story should have developer documents in place<br>API documentation for any new APIs added, or existing ones modified, needs to be updated | Documentation | Identify a date on which these will be made available |
Each build will follow the process listed below.
E-mails:
Post deployment, send a mail on:
how many items in the scope are in Released status, how many are in Ready for Release status, and how many are still open
Ask for the sanity execution results
Raise any mismatch found in the scope delivered, as against scope promised
Re-check delivery timelines, & raise a flag if schedule is impacted
Share a daily status update with all stakeholders on the work done: are we on track or delayed?
Share the Test case execution status: Feature & Solution wise
Share the no. of test cases which have passed/failed/are blocked
Share information related to the defects, mapping against solution type and severity
If delayed, state the reason why
Raise known risks if any, along with mitigation plan
Types of testing done:
Sanity Test Execution: Every new code deployment will be followed by a quick sanity test that assesses the stability of the code and test environment to ascertain whether the build is fit-for-test.
Automation plays a key role here: first the automated suite is run, and then all failed test cases are checked manually
Functional Testing:
New Feature
Regression Testing
P1 scenario testing
Backward Compatibility Testing (up to last 3 builds)
Similar activity will be done for the App as well
Test case Execution task:
Run the automated sanity suite; retest failed test cases manually and raise the relevant defects
If the sanity run fails with more than 5 S1 blockers, reject the build
Use the Excel tracker to allocate work within the team (the solution-wise owner takes ownership)
Initiate execution of test cases
Raise defects found in JIRA, tag it to the main issue ID.
All defects raised will be either resolved or converted to bugs/enhancements before release sign-off is given
If a defect is raised & needs to be converted into a bug/enhancement, then follow the specified process
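The build-acceptance rule in the task list above (reject the build when the sanity run leaves more than 5 S1 blockers) can be sketched as a simple gate. This is an illustrative sketch, not tooling from this process; `assess_build` and the defect record fields are hypothetical names.

```python
# Illustrative sketch of the sanity gate: after the automated sanity suite
# runs and failed cases are retested manually, the build is rejected if more
# than 5 S1 blockers remain. Function and field names are hypothetical.

S1_BLOCKER_LIMIT = 5  # per the process: more than 5 S1 blockers => reject

def assess_build(open_defects):
    """open_defects: list of dicts like {"id": "REL-101", "severity": "S1"}."""
    s1_blockers = [d for d in open_defects if d["severity"] == "S1"]
    if len(s1_blockers) > S1_BLOCKER_LIMIT:
        return "REJECTED", s1_blockers
    return "FIT_FOR_TEST", s1_blockers

status, blockers = assess_build([
    {"id": "REL-101", "severity": "S1"},
    {"id": "REL-102", "severity": "S3"},
])
print(status)  # FIT_FOR_TEST: a single S1 blocker is within the limit
```

In practice the same threshold would be wired into the CI pipeline so that a rejected build never reaches manual execution.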
Important points to note regarding defects:
In case any outsourced individual comes in for a temporary engagement, they will not be allowed to report defects in JIRA directly; their findings will instead be shared with the solution owner, who checks the validity of the defect.
Defect Triage meeting: Sync up with the Release Manager to schedule a meeting with the Tech Managers to get ETAs on the defects raised.
All defects raised need to be closed/ converted to bugs/ converted to enhancements before release sign-off can be given.
No. | Exit Criteria: QA | Responsibility | Remarks |
---|---|---|---|
1 | For the delivered scope:<br>- All items in scope which were in Released status should be in Closed status<br>- All S1/S2 defects on New Features/Enhancements/Bugs should be closed<br>- All regression defects should be closed irrespective of severity<br>- List of Tech items not tested by QA to be shared<br>- All defects should be in Closed status; S3 defects not being taken up should be converted to bugs after aligning with the PM | QA | The PM should be notified daily if any defect is not being taken up for fixing, for a final decision on it. Any S1/S2 defect converted to a bug has to be an exception agreed with the PM. |
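The exit criteria above amount to a mechanical check over item and defect statuses. A minimal sketch, assuming hypothetical `status`/`severity`/`type` fields on exported tracker records:

```python
# Minimal sketch of the exit-criteria gate described above. The record shapes
# (dicts with "status", "severity", "type") are assumptions for illustration,
# not part of this process's tooling.

def can_sign_off(items, defects):
    """True when the delivered scope meets the QA exit criteria."""
    # All in-scope items (previously in Released status) must now be Closed.
    items_closed = all(i["status"] == "Closed" for i in items)
    # All S1/S2 defects must be closed, whatever their origin.
    s1_s2_closed = all(d["status"] == "Closed"
                       for d in defects if d["severity"] in ("S1", "S2"))
    # Regression defects must be closed irrespective of severity.
    regression_closed = all(d["status"] == "Closed"
                            for d in defects if d.get("type") == "Regression")
    return items_closed and s1_s2_closed and regression_closed

print(can_sign_off(
    items=[{"id": "REL-1", "status": "Closed"}],
    defects=[{"id": "D-1", "severity": "S3", "status": "Open",
              "type": "Regression"}],
))  # False: an open regression defect blocks sign-off, even at S3
```

S3 defects converted to bugs (after aligning with the PM) would be removed from the defect list before this check runs.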
4. Test Case Execution in Pre-Prod Environment (UAT)
A meeting with the DevOps team should be held to arrive at the final tracker for the deployment to prod. The same tracker will be used in pre-prod as well (any exceptions should be called out).
After sign-off from the Staging environment, new-feature and user-experience testing will be done before the final sign-off for the release. Regression testing is not in scope for this stage.
No. | Entry Criteria: UAT | Responsibility | Remarks |
---|---|---|---|
1 | Agreed scope for the release (Functional + Technical) vs. actually delivered:<br>- List of items which have missed the agreed scope of the release, aligned with the PM<br>- List of partially delivered items which are deployable and testable<br>- All items (features, stories & bugs) delivered should be in Closed status<br>- List of Tech items not tested by QA to be shared<br>- All defects should be in Closed status | Release Manager | Exceptional approvals to be done on a case-to-case basis |
2 | Deployment tracker to be shared and signed off by DC | Release Manager | Exception is hotfix |
E-mails:
Share a daily status update with all stakeholders on the work done: are we on track or delayed?
Share the Test case execution status: Feature & Solution wise
Share the no. of test cases which have passed/failed/are blocked
Share information related to the defects, mapping against solution type and severity
If delayed, state the reason why
Raise known risks if any, along with mitigation plan
Test case Execution task:
Run the automated sanity suite; retest failed test cases manually and raise the relevant defects
If the sanity run fails with more than 5 S1 blockers, reject the build
Initiate execution of new feature test cases
Complete exploratory testing of all solutions
Execute the automated regression suite
Complete execution of backward compatibility test cases (up to last 3 builds)
Complete execution of P1 test cases written for all P1 bugs raised in the past up to last 3 releases
If a defect is raised & needs to be converted into a bug/enhancement, then follow the specified process
HOTFIX VERIFICATION:
In case a hotfix is needed for deployment of a fix for a P1 bug:
The Release Manager must call for a meeting with the PM representative, Engineering team, Dev Ops team and QA team to understand the work and inter-dependencies involved
Identify the impact area both from engineering and QA perspective for providing a holistic fix for the bug; ensure that the RCA is already done/ planned to be done
The timelines and scope of the hotfix should be arrived at and circulated with all relevant stakeholders; any changes to either should be duly updated to all
It should be kept in mind that 2 or more hotfixes should not be released within a very short gap of each other
Hotfix gets deployed directly on pre-prod and does not go through a testing cycle in Staging
Identify the set of test cases which need to be executed for testing, update the same in the Jira ticket as well.
Reach out to the PM and TM for sign-off on the test cases shared; update as per review comments
Once deployment is complete, plan for testing; if deployment is scheduled at night, identify the set of people who would be working at night time
Send a status update mail:
Identify test cases which have failed, if any
Share the scope of testing done, including the identified test cases plus sanity plus backward compatibility etc.
5. Final Sign-off and Production Deployment
No. | Exit Criteria: UAT | Responsibility | Remarks |
---|---|---|---|
1 | For the delivered scope:<br>- All items in scope which were in Released status should be in Closed status<br>- All S1/S2 defects on New Features/Enhancements/Bugs should be closed<br>- All regression defects should be closed irrespective of severity<br>- List of Tech items not tested by QA to be shared<br>- All defects should be in Closed status; S3 defects not being taken up should be converted to bugs after aligning with the PM | Implementation Team | The PM should be notified daily if any defect is not being taken up for fixing, for a final decision on it. Any S1/S2 defect converted to a bug has to be an exception agreed with the PM. |
2 | Documentation of Release Notes should be complete | Documentation | |
3 | Sign-off to be given post the Go/No-Go meeting | Implementation Team | |
Process Metrics
Regression traceability index: completeness of the regression test suites
Functional traceability index: Test Coverage of New Functionalities being built in the release
Test efficiency: total defects identified by QA against the total defects introduced by the engineering team during development
Defect leakage rate to Production: rate at which defects leaked into Production undetected
Defect leakage rate to UAT: rate at which defects leaked into UAT undetected
Planned Functional Test cases Vs Actual Functional Test Cases
Planned Regression Test cases Vs Actual Regression Test Cases