
Aug 03, 2023 3 min read

Enhancing Quality and Performance Through Manual Testing Implementation

E-Commerce

Platforms:

Web, Mobile

Country:

USA

Implementation time:

May 2020 – Jun 2023

about company

Pinnacle IT Assets is a leading provider of pre-owned, high-end networking, server, and storage equipment. With over 25 years of combined experience in the refurbished IT hardware industry, the company stands at the forefront of solving complex hardware challenges for IT managers.

Specializing in used servers, networking hardware, fiber optics, components, and enterprise storage solutions, they offer a comprehensive range of options to meet diverse needs. From Dell EMC and HPE to Nimble, NetApp, Cisco, and more, their extensive experience spans renowned brands in the enterprise equipment industry.

before

The product was launched on the market without a thorough testing process. As a result, any bugs that surfaced had to be addressed and resolved by the developers after release.

challenges and solutions

Before our QA engineer joined the project, testing was already being carried out, and some documentation existed, such as the Environment list, Original Requirements, Testing Checklist, and Release Notes. This documentation provided a foundation for our testing activities and helped us understand the project's requirements and scope. Our team's primary focus was to check new functionality and perform Regression and Smoke testing.

The QA team comprised two specialists, and the overall team consisted of one Backend developer, two Frontend developers, two QAs (including our new QA specialist), and a Project Manager (PM).

We will share with you some aspects of the work on this project.

Challenge: Frequent performance changes to the application, aimed at identifying the optimal solution
Solution: Performance verification against defined metrics was conducted after each change

Challenge: Ensuring consistent performance across multiple environments during testing
Solution: Performance testing was conducted in several environments, including Dev staging, Dev customer, and production

Challenge: The application was expected to work consistently across all browsers
Solution: BrowserStack was used as the cross-browser testing tool, allowing developers and testers to verify the application's compatibility and behavior across different browsers and mobile devices

Challenge: Ensuring comprehensive documentation creation by the team
Solution: Our team created test cases and bug reports

technologies, tools, approaches

Our team performed manual testing only; no automated-testing technology stack was involved. However, we are happy to describe the tools directly related to the testing process.

  • BrowserStack: Cross-browser and cross-platform testing tool used to ensure compatibility and consistent performance across different browsers and devices.
  • Magento: E-commerce platform utilized for developing and deploying the project.
  • Jira: Project management and issue tracking tool for efficient task management and collaboration.

results

  • Effectiveness of the testing process: Approximately 50 test cases were written, and the testing process identified 35 bugs that were reported, addressed, and fixed.
  • Improved application performance: Users can quickly switch between functionalities, and all integrated services are optimized, ensuring a fast, seamless, and uninterrupted user flow.
  • Cost savings: Identifying and resolving issues during the development phase helped the client avoid expensive post-release bug fixes and maintenance, reducing development and support costs.
  • Usability testing: This work improved the application interface for better clarity and convenience; a well-designed user flow helps increase the conversion rate.
  • The application was successfully released and continues to progress in the market.

The application testing and performance improvements described above positively impacted the client's product. The application became more stable, responsive, and compatible, leading to higher user satisfaction, better retention rates, and potential cost savings. These enhancements improved the overall user experience and helped the client stay competitive.

Implementation Steps

1. Requirements Gathering

The first step in the implementation process was gathering all the requirements. The team worked closely with the client to understand their needs and expectations.

2. Manual Testing Setup

The QA team set up the manual testing environment, ensuring all necessary testing tools and resources were in place. They also reviewed the existing documentation, such as the Environment list, Original Requirements, Testing Checklist, and Release Notes, to understand the project scope comprehensively.

3. Test Planning and Execution

The team devised a comprehensive test plan that included different types of testing, such as regression, functional, usability, smoke, and cross-browser. They created test cases to cover various scenarios and executed approximately 50 test cases.
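To give a feel for how such a test plan can be organized, here is a minimal sketch of a test-case inventory with typed suites. The IDs, titles, and tags are purely illustrative assumptions, not the project's actual test cases.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a manual test-case inventory.
# All IDs, titles, and tags below are invented for illustration.

@dataclass
class TestCase:
    case_id: str
    title: str
    test_types: list = field(default_factory=list)  # e.g. "smoke", "regression"

test_cases = [
    TestCase("TC-001", "User can add a product to the cart", ["smoke", "regression"]),
    TestCase("TC-002", "Checkout rejects an invalid card number", ["regression"]),
    TestCase("TC-003", "Search returns results for a known SKU", ["smoke", "functional"]),
]

def select_suite(cases, test_type):
    """Pick the subset of cases tagged for a given run, e.g. a smoke pass."""
    return [c for c in cases if test_type in c.test_types]

smoke_suite = select_suite(test_cases, "smoke")
print([c.case_id for c in smoke_suite])  # ['TC-001', 'TC-003']
```

Tagging each case with the suites it belongs to lets a small team reuse the same inventory for smoke, regression, and functional passes without duplicating cases.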

4. Performance Testing

The application required frequent performance improvements, so the team conducted performance verification after each change. They defined performance metrics and conducted performance testing in various environments, including Dev staging, Dev customer, and production, to ensure consistent performance across different setups.
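The "defined metrics" idea can be sketched as a simple threshold check run against each environment after a change. The metric names, thresholds, and measurements below are assumptions for illustration, not the project's actual figures.

```python
# Hypothetical sketch: verifying response times against defined metrics
# after each change. Thresholds and measurements are illustrative only.

THRESHOLDS_MS = {
    "page_load": 3000,
    "search": 1500,
    "checkout": 2000,
}

def check_metrics(measured_ms, thresholds=THRESHOLDS_MS):
    """Return the names of metrics that exceed their thresholds."""
    return [name for name, value in measured_ms.items()
            if value > thresholds.get(name, float("inf"))]

# One set of measurements per environment, as might be collected in a run.
environments = {
    "dev-staging": {"page_load": 2800, "search": 1200, "checkout": 1900},
    "dev-customer": {"page_load": 3400, "search": 1100, "checkout": 1800},
    "production": {"page_load": 2500, "search": 900, "checkout": 1700},
}

for env, measured in environments.items():
    failures = check_metrics(measured)
    print(f"{env}: {'OK' if not failures else 'FAIL: ' + ', '.join(failures)}")
```

Running the same check in every environment makes inconsistent performance visible immediately, rather than only in production.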

5. Cross-Browser Compatibility

To ensure that the application worked consistently across all browsers, BrowserStack was utilized as a testing tool for cross-browser testing. The team tested the application on various browsers, identifying and fixing compatibility issues.
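A cross-browser pass of this kind boils down to enumerating browser/OS combinations and filtering out impossible pairs. The browsers and platforms below are assumptions for illustration; they are not the project's actual BrowserStack configuration.

```python
from itertools import product

# Illustrative sketch: building a browser/OS matrix for a cross-browser
# pass. The specific browsers and platforms are assumed, not taken from
# the project's real BrowserStack setup.

browsers = ["Chrome", "Firefox", "Safari", "Edge"]
platforms = ["Windows 10", "macOS Ventura"]

# Safari only runs on macOS, so filter out impossible pairs.
matrix = [(b, p) for b, p in product(browsers, platforms)
          if not (b == "Safari" and p.startswith("Windows"))]

for browser, platform in matrix:
    print(f"run suite on {browser} / {platform}")

print(len(matrix), "combinations")  # 7
```

Keeping the matrix as data makes it easy to add a browser or device without rewriting the test run itself.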

6. Bug Reporting and Documentation

Throughout the testing process, the QA team identified and reported 35 bugs. They documented the test cases and bug reports to provide clear and comprehensive documentation.
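As a rough sketch of what such documentation captures, here is a minimal bug-report record plus a severity roll-up of the kind a PM might review. The fields and data are invented for illustration; they do not reflect the project's actual Jira tickets.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical sketch: fields a bug report might capture before being
# filed in a tracker such as Jira. All data below is invented.

@dataclass
class BugReport:
    bug_id: str
    summary: str
    severity: str  # e.g. "critical" | "major" | "minor"
    steps_to_reproduce: str
    environment: str

bugs = [
    BugReport("BUG-1", "Cart total not updated after removing an item",
              "major", "1. Add item 2. Remove item 3. Check total", "dev-staging"),
    BugReport("BUG-2", "Typo on the checkout confirmation page",
              "minor", "1. Complete checkout 2. Read confirmation", "production"),
]

severity_counts = Counter(b.severity for b in bugs)
print(dict(severity_counts))  # {'major': 1, 'minor': 1}
```

A consistent record format keeps reproduction steps and environment attached to every bug, which is what makes 35 reported bugs actionable rather than anecdotal.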

7. User Flow and Usability Improvements

Usability testing was performed to enhance the application's interface clarity and convenience. By analyzing user flow, the team identified areas for improvement, leading to a better user experience and increased conversion rate.

8. Continuous Improvement

Through continuous communication with the team and stakeholders, quality improvements and changes in the application were addressed promptly.

Services provided
  • Manual testing
  • Smoke testing
  • Regression testing
  • Functional testing
  • Usability testing
  • Cross-browser testing
  • Cross-platform testing

Your project could be next!

Ready to get started? Contact us to explore how we can work together.

Other Projects

IoT

SpiderDoor


Platforms:

Web, Mobile (iOS)

Country:

USA

Implementation time:

Nov 2020 – Nov 2021 

About project:

SpiderDoor offers wireless gate access systems that enable remote facility management.

Services:

Manual and Automated Testing: Functional, Regression, Exploratory, and Acceptance Testing; Non-functional Usability Testing

Automated Testing: JS + WebdriverIO + Appium + Xcode; Postman for API testing

Result:

23 test cases were created, all of which were automated, ensuring rapid and consistent testing for future releases.


Start-Up

Lumina Solutions


Platforms:

Web, Mobile

Country:

USA

Implementation time:

Dec 2022 – present

About project:

Lumina Solutions is an innovative technology company specializing in AI solutions for financial analysis.

Services:

Manual, Automated, Functional, Smoke and Usability Testing.

Software Development.

Result:

Agile processes cut release times by 70% and improved forecasting accuracy by 20%, while over 780 automated test cases boosted code coverage to 90%.


E-Commerce

DepreciMax


Platforms:

Web

Country:

Australia

Implementation time:

Apr 2022 – present

About project:

The project allows for detailed modeling of fixed asset depreciation and lease calculation rules for accounting and tax.

Services:

Manual: Regression, Smoke, Functional, Integration, Usability, and UI/UX Testing

Automation Testing

Result:

750+ test cases created, 450 of which are automated; 80% of the functionality is covered by automation.
