
Terms, Abbreviations and Definitions

Term | Description
UI | User Interface
UAT | User Acceptance Testing
BUAT | Business User Acceptance Testing
CI/CD | Continuous Integration/Continuous Delivery

Objectives 

This document describes the testing methodologies and the testing flow that will be applied during all stages of the Design System implementation.

The key objectives are as follows:

  • Establish and embed testing processes in the project

  • Determine how testing fits into the project implementation

  • Determine the testing types applicable to each testing task

  • Identify the needed tools and determine the testing phases of the project

  • Identify testing metrics

  • Identify what will be tested and what will be out of the testing scope

The document doesn’t describe detailed functional requirements or design for the site. This information is provided in separate documentation: the product specification (/wiki/spaces/FH2/pages/2090074367) and the detailed design (Design in InVision).

This is a living document and should be adjusted promptly to reflect the current state of the application requirements.

Testing Scope 

Features to be tested 

  • Container 

  • Grid 

  • Section 

  • Typography  

  • Image 

  • Icon 

  • Heading Block 

  • Link 

  • Button 

  • CTA Block 

  • Accordion Item 

  • Accordion List 

  • Page Header 

  • Text box 

  • Navigation Link 

  • Navigation 

  • Logo 

  • Basic Header (i.e. excl. Site Search) 

  • Subtle Button variant  

  • Footer 

  • iFrame  

  • Carousel 

  • Product Header 

  • Overview 

  • Product Card 

  • Product Lists 

  • Picto Card 

  • Picto List (aka Ingredients/Benefits section) 

  • Image + Text 

  • Article Header 

  • Article Decoration (e.g. Pull Quotes, etc) (Image Carousel only in Figma) 

  • Embedded Video  

  • Page Card 

  • Page List 

  • Teaser 

  • Sitemap 

  • Product How To + Safety Block (the How To block, referenced below, seems a bit complicated for markets to use)

  • Promo Strip 

Testing types in scope

  • Code review - peer review and acceptance of the developed code by the RB side

  • Unit testing (automated) - the development team's responsibility, with coverage of at least 80%

  • UI compatibility and cross-browser testing, performed manually

  • Smoke testing on the preview build of each new version that is deployed

  • Accessibility testing (automated and manual)

  • Visual Regression testing, performed semi-automatically [subject to change]

  • Functional testing

Testing types out of scope

  • Performance testing

  • Load testing of the server side

  • Testing of content - the DA Team will copy and port content provided by the customer/brand team, so the correctness of the provided content is the customer/brand team's responsibility

  • Testing of third-party functionality - only testing of our integration with third parties is in scope

Test Approach

Testing flow 

All test activities will be divided into four phases:  

  • Analysis & Planning Phase  

  • Test Phase 

  • Stabilization Phase 

  • UAT 

*This flow will be implemented for every component.

Analysis & Planning Phase

The purpose of this phase is to cover the requirements provided by the customer with test documentation, such as checklists, which will be used in the Test phase and during the Stabilization phase.

Activities to be performed during this phase:

  • Requirements clarification and checklist creation

  • Assessment of requirements testability and requests for development support if needed

  • Test documentation creation, change and support (functional and non-functional checklists)

Test Phase

The purpose of this phase is to validate features against requirements using the test documentation created in the Analysis & Planning phase.

Activities to be performed during this phase:

  • Unit testing (automated) - the development team's responsibility

  • UI compatibility and cross-browser testing, performed manually

  • Accessibility testing

  • Bug verification

  • Reporting

  • Test documentation support

Stabilization Phase

The purpose of this phase is to ensure that the whole set of components works together as expected according to the requirements and that no change impacts existing functionality. This phase includes regression testing and the final smoke testing (before delivering the application to the customer for acceptance).

Regression testing is expected to be done mostly with the developed manual UI tests.

UAT + Go-live phase

Test activities to be done during this phase:

  • Review of issues reported by the customer during UAT

  • Verification of the fixes and changes made during UAT

  • Accessibility assessment of the library

  • Smoke testing of the components before they are shipped to production (on the UAT environment)

  • Final smoke testing of the application after the project is delivered to the stakeholder

Each component must get sign-off and approval from the following responsible people:

  • Nathan McKean

  • Katarzyna Lewczyk

This sign-off should be tracked in Jira in a specific story for each component.

Test Types and Levels 

Unit testing 

The tests will be developed by the DataArt dev team. These tests will be included in the CI/CD pipeline and executed as part of the build process. Failed unit tests will prevent the creation of build artefacts (the application build).

Requirements: 

Test coverage for unit tests > 80%. 
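
A minimal sketch of how this threshold could be enforced via Jest configuration (the file name, test environment and per-metric values below are assumptions, not the project's actual setup). Coverage below 80% fails the test run, which in turn blocks artefact creation in the CI/CD pipeline:

    // jest.config.ts - a sketch, not the project's actual configuration.
    import type { Config } from 'jest';

    const config: Config = {
      testEnvironment: 'jsdom',   // components render against a browser-like DOM
      collectCoverage: true,
      coverageThreshold: {
        global: {
          statements: 80,         // the ">= 80%" requirement from this section
          branches: 80,
          functions: 80,
          lines: 80,
        },
      },
    };

    export default config;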

Tools to be used: 

  • Jest Testing Library  
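
For illustration, a unit test in the Jest + Testing Library style, assuming a hypothetical Button component with label and onClick props (the component and its props are placeholders, not the actual library API):

    // Button.test.tsx - a sketch against an assumed Button component.
    import '@testing-library/jest-dom';
    import { render, screen, fireEvent } from '@testing-library/react';
    import { Button } from './Button';

    describe('Button', () => {
      it('renders its label', () => {
        render(<Button label="Buy now" onClick={() => {}} />);
        expect(screen.getByRole('button', { name: 'Buy now' })).toBeInTheDocument();
      });

      it('calls onClick when pressed', () => {
        const onClick = jest.fn();
        render(<Button label="Buy now" onClick={onClick} />);
        fireEvent.click(screen.getByRole('button', { name: 'Buy now' }));
        expect(onClick).toHaveBeenCalledTimes(1);
      });
    });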

Cross browser and compatibility testing 

The purpose of this testing is to ensure that all components look in accordance with the approved design in all agreed browsers. Since the application is to be developed with a mobile-first approach, it is highly important to verify the application on the mobile devices agreed with the customer.

The verification will first be done manually in order to detect and eliminate all layout issues.

Requirements:

The layouts should be responsive and should match the following systems, browsers and resolutions:

  • Desktop (more than 1440px viewport width)

      • Windows:

          • Chrome (the latest version as of Apr 20, 2022)

      • MacOS:

          • Safari (the latest version as of Apr 20, 2022)

          • Chrome (the latest version as of Apr 20, 2022)

  • Mobile (viewport width of 360px or more)

      • Android (at least ver. 8 - e.g. Samsung Galaxy S9/S10)

          • Chrome (the latest version as of Apr 20, 2022)

      • iOS (at least ver. 10 - e.g. iPhone 8/XR/11 Pro)

          • Safari (the latest version as of Apr 20, 2022)

Tools to be used: 

  • Browser DevTools  

  • Percy/Chromatic 

*Tools must be approved 

Accessibility testing

The purpose of accessibility testing is to make sure that components conform to the WCAG 2.0 standard. Accessibility testing starts at the design review step: all colors from the approved design will be checked for conformance with the WCAG 2.0 color contrast ratio requirements. This testing type will be performed manually and using accessibility automation tools.

Tools to be used: 

  • StoryBook 

  • AxeCore 
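
One common way to run axe-core checks from unit tests is the jest-axe wrapper, sketched below against a hypothetical AccordionList component (jest-axe, the component name and its props are assumptions, not tools mandated by this document). Automated checks cover only part of WCAG, so the manual checklist remains necessary:

    // AccordionList.a11y.test.tsx - a sketch of automated axe-core checks.
    import { render } from '@testing-library/react';
    import { axe, toHaveNoViolations } from 'jest-axe';
    import { AccordionList } from './AccordionList';

    expect.extend(toHaveNoViolations);

    it('AccordionList has no detectable accessibility violations', async () => {
      const { container } = render(
        <AccordionList items={[{ title: 'Ingredients', content: 'Aqua' }]} />
      );
      const results = await axe(container);   // runs the axe-core ruleset on the rendered DOM
      expect(results).toHaveNoViolations();
    });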

UI testing and Visual Regression testing 

The purpose of UI testing is to check that the created UI Library matches the design.

Tools to be used:

  • Percy/Chromatic 
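
Both Percy and Chromatic snapshot Storybook stories, so each visual state to be regression-tested is captured as its own story. A sketch, assuming a Button component with a hypothetical variant prop (the Subtle variant mirrors the feature list above):

    // Button.stories.tsx - a sketch; the component's props are assumptions.
    import type { Meta, StoryObj } from '@storybook/react';
    import { Button } from './Button';

    const meta: Meta<typeof Button> = {
      title: 'Atoms/Button',
      component: Button,
    };
    export default meta;

    type Story = StoryObj<typeof Button>;

    // Each exported story becomes one snapshot in the visual regression run.
    export const Primary: Story = { args: { label: 'Buy now', variant: 'primary' } };
    export const Subtle: Story = { args: { label: 'Learn more', variant: 'subtle' } };
    export const Disabled: Story = { args: { label: 'Buy now', disabled: true } };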

Smoke testing 

The goal of smoke testing is to execute a limited suite of checks that verify the main functionality and UI, in order to make sure that a deployed build is not fundamentally broken. This testing will be performed manually.

Functional testing 

The purpose of functional testing is to make sure that all modules work properly according to the requirements. Functional testing will be performed manually.

Tools to be used: 

  • StoryBook 
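
While the plan above is manual, a functional check can also be encoded in a Storybook story as an interaction ("play") test that a tester can replay in the Storybook UI. A sketch, assuming Storybook's test utilities and a hypothetical AccordionItem component (names and props are placeholders):

    // AccordionItem.stories.tsx - a sketch; component name and props are assumptions.
    import type { Meta, StoryObj } from '@storybook/react';
    import { within, userEvent, expect } from '@storybook/test';
    import { AccordionItem } from './AccordionItem';

    const meta: Meta<typeof AccordionItem> = {
      title: 'Molecules/AccordionItem',
      component: AccordionItem,
    };
    export default meta;

    export const TogglesOpen: StoryObj<typeof AccordionItem> = {
      args: { title: 'How to use', children: 'Apply twice daily.' },
      play: async ({ canvasElement }) => {
        const canvas = within(canvasElement);
        // Expanding the item should reveal its content.
        await userEvent.click(canvas.getByRole('button', { name: 'How to use' }));
        await expect(canvas.getByText('Apply twice daily.')).toBeVisible();
      },
    };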

Test Environment 

It is expected that the following environments will be used during the implementation and release of the application:

  • Storybook development env  

Reporting 

This section lists the reports to be created during the testing activities.

UI checklist 

The report is provided by the test team for the sprint. The UI checklist is expected to contain the following data:

  • List of modules (atoms/molecules/organisms)

  • Confirmation status of the module against the design

  • Browsers and devices on which the module was tested

  • The person who reviewed the module

The report is planned to be delivered as a document added to Confluence. The recipients are the Customer’s side, the PM and management personnel.

Accessibility checklist 

The report is provided by the test team for the sprint. The Accessibility checklist is expected to contain the following data:

  • List of modules (atoms/molecules/organisms)

  • Confirmation status according to accessibility testing

  • Browsers and devices on which the module was tested

  • The person who reviewed the module

The report is planned to be delivered as a document added to Confluence. The recipients are the Customer’s side, the PM and management personnel.

Visual Regression report 

The report is provided by the test team for the sprint. It is planned to include the following information in the report:

  • Comparison screenshots of the created module and its original design

  • Auto-generated report

  • Com 

The report is planned to be created in the Visual Regression tool.

Functionality checklist 

The report is provided by the test team for the sprint. The Functionality checklist is expected to contain the following data:

  • List of modules (atoms/molecules/organisms) which belong to the sprint

  • Confirmation status according to functional testing

  • Browsers and devices on which the module was tested

  • The person who reviewed the module

The report is planned to be delivered as a document added to Confluence. The recipients are the Customer’s side, the PM and management personnel.

Issue and test documentation management  

Issues management 

It is planned to store and track all issues (Epics, Stories, Bugs, Tasks) in Jira.

All features will be organized into Stories. Subtasks for test case development and automation test implementation will be created under each Story.

All bugs found during testing will be linked to the relevant Epic and Story. All detected issues will be marked with the relevant bug type.

The following bug types are going to be used: 

Bug Type | Description
UI | The issue relates to inconsistencies between modules and the design
Accessibility | The issue relates to WCAG accessibility standards
Functionality | The issue relates to broken functionality

Issue priority definitions:

Priority | Definition | Maximum Allowable
Urgent | This is a failure that prevents large or core areas of the Product from functioning. The defect needs to be fixed and deployed as soon as possible. |
Highest | This is a failure that prevents large or core areas of the Product from functioning, but a workaround may exist as a short-term satisfactory solution. These defects should be fixed before going live. |
High | This is a serious failure that must be fixed before going live. High defects badly affect core areas, causing important or highly visible areas to fail. It is not normally possible to work around High defects. | 10%
Medium | A failure that causes an error in the application functionality. It is of lower impact, or in a less visible area, than a High failure. Medium defects normally have a workaround; they usually impact only a few test cases and will not stop QA testing from staying on schedule. | 10%
Low | Defects that have a low impact in less visible areas of the site, only affect the fringes of a business process, or only happen in unlikely circumstances. Low defects usually affect only a single test case and will not stop QA testing from staying on schedule. | 10%
Lowest | Defects that have a low impact in less visible areas of the site, only affect the fringes of a business process, or only happen in unlikely circumstances. These defects usually affect only a single test case and will not stop QA testing from staying on schedule. | 10%

Checklist management 

Confluence will be used for creating the checklists.

Accessibility checkpoints management 

The Accessibility checklist will contain only the checkpoints that have to be executed manually. These checks will be stored in Jira as the Test Case issue type and marked with the ‘Accessibility’ label.

Suspension and Resumption Criteria 

Testing of test items will be suspended if the suspension criteria below are met.

Suspension criteria:

  • Non-working environment or a blocking issue

Resumption criteria:

  • The bug is fixed and the environment works properly

Entry criteria:

  • Requirements and design for the project are ready

  • Access to the tools for testing and development is available

Exit criteria:

  • All duties of the DataArt team are done

Test Deliverables 

The following test artifacts are planned to be created within the project: 

  • Checklists (UI, Accessibility, Functionality)

  • Bug reports 

  • Visual Regression Testing reports  

Detailed information about each document type is described below along with approaches and examples. 

Checklists

A set of functional and non-functional tests documented as a simplified list of checks, without strict steps to reproduce, prerequisites, etc.

Requirements coverage by tests is to be traced by mapping all requirements from the specification to the appropriate checks in the checklist document.

Bug Reports 

A report describing an application flaw, error, failure or fault that causes it to produce an incorrect or unexpected result, or to behave in an unintended way.

All bug reports are to be posted to the project bug tracking system (Jira) with the appropriate severity for further processing by the Customer, Project Manager and Development team.

Visual Regression Testing reports 

Reports showing differences or similarities between modules and their original design. The report should be automatically generated after every commit.

Risks 

 

Description | Mitigation strategy | Impact | Probability
Requirements inconsistency | Gather requirements properly, update them constantly after each review, and use only reviewed requirements | High | Medium
Changing requirements | Create adaptive documentation to prevent time loss | High | Medium
Changing design | | Low | Medium
Non-working environment | | High | Low

Approvals

The following people are required to approve the Test Strategy:

Approver role | Approver
Product Owner | Nathan McKean
Product Manager | Robert Ruszczyk
QA Chapter Lead | Maciej Sobociński
Client Director | Xeniya Akimova
Delivery Manager | Nikita Pyshnyak
Project Manager | Sergey Kiselyov
Lead Developer | Anton Merkulov
Tech Lead | Pawel Ploneczka

 
