Manual Testing for Eclat
Sep 07, 2023 · 3 min read
Platforms: Web, Mobile
Country: Netherlands
Implementation time: Aug 2022 - Sep 2022
About Company
Eclat is an online marketplace platform that provides a unique shopping experience for fashion enthusiasts worldwide. Their platform helps users discover styles that enhance their confidence and allow them to express themselves authentically.
Before
The project was introduced to the market without undergoing a comprehensive testing process; all testing had been performed by the development team and product owners.
Challenges and Solutions
We were brought in to thoroughly test the newly developed functionality and to conduct integration and end-to-end (E2E) tests. Our testing process introduced advanced test design techniques, which allowed us to identify previously undiscovered bugs.
Let us share with you some aspects of the work on this project.
| Challenges | Solutions |
|---|---|
| Identify and log bugs in both existing and newly created functionality | More than 50 bug reports were created. We monitored each bug's status and communicated with developers for updates and clarifications |
| Ensure the application works consistently across all required browsers and platforms | BrowserStack facilitated cross-platform testing, ensuring the application's compatibility and performance across browsers and devices to meet client requirements |
Features Of The Project
The application was initially developed for local use and was available only in Dutch. Therefore, our testers meticulously scrutinized the text and flow for clarity and comprehensibility, utilizing translation services where necessary to provide accurate translation into English.
Technologies, Tools, and Approaches
Our team conducted manual testing only. While automated testing was out of scope for this project, we would like to highlight the technologies directly related to the testing process.
- BrowserStack: Cross-browser and cross-platform testing tool used to ensure compatibility and consistent performance across different browsers and devices.
- Jira: Project management and issue tracking tool for efficient task management and collaboration.
- Notion: The connected workspace for closer collaboration with the dev team.
- MongoDB Compass: A tool for conducting database testing within the project.
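To illustrate the kind of database check this enables, here is a minimal sketch in Python. The collection names and documents are hypothetical; on a real project, an equivalent filter would typically be run through MongoDB Compass's query bar rather than in code.

```python
# Illustrative referential-integrity check between two collections.
# Collection names and documents are hypothetical examples.
products = [
    {"_id": 1, "name": "Silk scarf"},
    {"_id": 2, "name": "Denim jacket"},
]
orders = [
    {"_id": 101, "product_id": 1},  # references an existing product
    {"_id": 102, "product_id": 3},  # references a missing product
]

# Every order should point at a product that actually exists.
product_ids = {p["_id"] for p in products}
orphaned = [o["_id"] for o in orders if o["product_id"] not in product_ids]
print(orphaned)  # orders whose product reference is broken
```

Checks like this surface data-level defects (broken references, missing fields) that UI testing alone would not reveal.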
Results
- Effectiveness of the testing process: More than 50 bug reports were created. They contained detailed descriptions of identified defects, including their nature, severity, and potential impact on the web application's functionality.
- Improved application performance: Users can now quickly switch between functionalities, and all integrated services are optimized, ensuring a seamless and continuous user flow.
- Cost Savings: Identifying and resolving issues during the development phase through testing helped avoid expensive post-release bug fixes and maintenance. As a result, the client could save on development and support costs.
- The application was successfully released and continues to progress in the market.
Implementation Steps
1. Requirements Gathering
The first step in the implementation process was gathering all the requirements. The team worked closely with the client to understand and document their needs and expectations.
2. Manual Testing Planning and Setup
Configured test environments and installed necessary tools, browsers, and software required for testing. Set up devices and browsers to ensure cross-platform compatibility.
3. Test Execution
Followed the necessary steps, inputting relevant data as required. Compared actual results against expected results and noted any discrepancies.
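The comparison step above can be sketched as a minimal check. The test-case structure and values below are hypothetical examples, not the project's actual test data:

```python
def find_discrepancies(test_cases):
    """Compare actual results against expected results and collect mismatched case IDs."""
    discrepancies = []
    for case in test_cases:
        if case["actual"] != case["expected"]:
            discrepancies.append(case["id"])
    return discrepancies

# Hypothetical execution log from a manual test session
cases = [
    {"id": "TC-01", "expected": "Order placed", "actual": "Order placed"},
    {"id": "TC-02", "expected": "Logged in", "actual": "Error 500"},
]
print(find_discrepancies(cases))  # only TC-02 deviates from its expected result
```

Each discrepancy found this way becomes a candidate for a bug report.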
4. Regression Testing
Conducted regression testing every sprint to make sure that fixed issues did not introduce new problems.
5. Cross-Browser Compatibility
The team used BrowserStack to test the application on various browsers and platforms, identifying and fixing compatibility issues.
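A BrowserStack session is typically configured through W3C capabilities. The fragment below is a minimal sketch; the project's actual browser/device matrix is not documented here, so the values are placeholders:

```json
{
  "browserName": "Chrome",
  "browserVersion": "latest",
  "bstack:options": {
    "os": "Windows",
    "osVersion": "11",
    "projectName": "Eclat",
    "userName": "YOUR_USERNAME",
    "accessKey": "YOUR_ACCESS_KEY"
  }
}
```

Varying `browserName`, `os`, and `osVersion` across runs is how a single test flow gets exercised on the whole required browser matrix.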
6. Bug Reporting
Our team documented over 50 bug reports throughout the testing process to provide clear and comprehensive documentation.
7. Test Closure
Reviewed the overall testing results to ensure coverage and completion. Shared testing results and insights with project stakeholders.
8. Continuous Improvement
Continuous communication with the team and stakeholders addressed any quality improvements or changes in the application.
- Manual testing
- Functional testing
- System testing
- Integration testing
- BrowserStack

Your project could be next!
Ready to get started? Contact us to explore how we can work together.
Other Projects
Digital Connectivity Company
USA • Web, Mobile
About project:
A digital connectivity company offering mobile, internet, and digital communication services.
Services:
- Manual and Automated testing, API, Security, Usability, Cross-browser, Cross-platform testing
- Automated testing: TypeScript + WebdriverIO + Mocha + Appium
Result:
350+ automated regression tests integrated into the CI/CD pipeline; ~50% fewer complaints from clients to support.
Telecommunications Provider
USA • Web, Mobile
About project:
The client is a telecommunications provider offering broadband, mobile, and cloud communication services.
Services:
- Manual and Automated testing, API, Smoke, Regression, Performance, Security, Usability, Cross-platform testing
- Automated testing: TypeScript + WebdriverIO + Mocha + Appium
Result:
~70% of regression tests automated, reducing manual QA involvement in regression cycles by 60%.
E-Commerce Retailer
USA • Web, Mobile
About project:
An online E-commerce retailer that provides customers with a seamless online shopping experience through its web and mobile platforms.
Services:
- Manual and Automated testing, API, Usability, Cross-browser, Cross-platform testing
- Automated testing: TypeScript + WebdriverIO + Mocha + Appium
Result:
~80% drop in user-reported issues; critical checkout errors reduced to near zero; predictable, on-time releases for all major updates.


