Cross Browser/Device Testing
Overview
Cross-browser and device testing ensures that our applications deliver a consistent and reliable user experience across the wide variety of browsers, operating systems, and devices used by our supporters. This guide outlines our approach to non-functional testing, with a focus on cross-browser and cross-device compatibility.
Tools & Platforms
To support robust cross-browser and device testing, we utilise a range of tools and platforms that enable efficient, scalable, and collaborative testing across diverse environments. To gain access to these tools, please contact Raj or a lead QA for an account to be created.
BrowserStack
What is BrowserStack?
- Provides access to a cloud-based grid of real browsers and devices.
- Used for both manual and automated testing across desktop and mobile platforms.
- Enables testing on legacy browsers and less common device/OS combinations.
For more information on BrowserStack, please see https://www.browserstack.com/docs/
At CRUK, we use BrowserStack to test across a range of browsers and devices, and we actively encourage team members to participate in cross-browser and device testing, especially when preparing an application for release. This collaborative approach helps us catch environment-specific issues early and ensures broader coverage.

BrowserStack plays a key role in this process, giving engineers the flexibility to test applications on a wide range of real devices and browsers without needing physical access to them. One of its key strengths is its support for collaborative testing, especially during Crowd Testing activities: it enables multiple engineers to simultaneously test an application on different browser and device combinations, making it an invaluable tool for broad compatibility checks and rapid feedback during release cycles.
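As a concrete illustration, an automated check can target a specific BrowserStack environment by passing W3C capabilities (including BrowserStack's `bstack:options` block) to a remote WebDriver session. The helper below is a hypothetical sketch: the build name, credentials and target URL are placeholders, and the Selenium usage is shown in comments because it requires a BrowserStack account.

```python
# Hypothetical helper: builds a capabilities payload for one browser/OS
# combination, following BrowserStack's "bstack:options" W3C capability format.
def browserstack_caps(browser, browser_version, os_name, os_version,
                      user, key, build="cross-browser-checks"):
    return {
        "browserName": browser,
        "browserVersion": browser_version,
        "bstack:options": {
            "os": os_name,
            "osVersion": os_version,
            "userName": user,       # your BrowserStack username
            "accessKey": key,       # your BrowserStack access key
            "buildName": build,
        },
    }

# Usage with Selenium (requires the selenium package and valid credentials):
# from selenium import webdriver
# options = webdriver.ChromeOptions()
# for name, value in browserstack_caps("chrome", "latest", "Windows", "11",
#                                      USER, KEY).items():
#     options.set_capability(name, value)
# driver = webdriver.Remote(
#     command_executor="https://hub-cloud.browserstack.com/wd/hub",
#     options=options,
# )
# driver.get("https://www.cancerresearchuk.org")
# driver.quit()
```

The same helper can be called once per entry in a test matrix, which is how several engineers (or a CI job) can fan out over many browser/device combinations in parallel.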
Applitools
What is Applitools?
- AI-powered visual testing tool.
- Detects visual regressions across different browsers and screen sizes.
- Can be integrated into CI/CD pipelines for automated visual checks.
For more information on Applitools, please see https://applitools.com/docs/
Each development team at CRUK may implement Applitools differently depending on their project setup and workflow. A common foundational practice, however, is maintaining a master baseline, which represents the expected visual state of an application (the "as-is" design). Branches containing subsequent changes to the application are then compared against this baseline.
When changes or improvements are made to the application, a new branch is created and tested against this master baseline. Applitools compares the visual output of the updated branch with the baseline and highlights any differences. Each difference is then manually reviewed by an engineer, who assesses whether it reflects an intentional, acceptable update or an unintended regression. The engineer approves or rejects the changes based on their context, such as alignment with design specifications, impact on user experience, and consistency across browsers and devices. This manual review step ensures that visual integrity is maintained throughout the development lifecycle.
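Applitools performs this comparison with AI-based visual matching, so the sketch below is purely illustrative of the baseline-versus-branch idea rather than the actual algorithm: it compares two tiny "screenshots" (2D lists of pixel values) and reports the coordinates that differ, which a reviewer would then approve or reject.

```python
# Illustrative only: NOT the Applitools algorithm. Compares a candidate
# "screenshot" against a baseline and returns the differing coordinates.
def diff_regions(baseline, candidate):
    """Return (row, col) coordinates where candidate differs from baseline."""
    return [
        (r, c)
        for r, row in enumerate(baseline)
        for c, pixel in enumerate(row)
        if candidate[r][c] != pixel
    ]

baseline  = [[0, 0, 0], [0, 1, 0]]   # master baseline ("as-is" design)
candidate = [[0, 0, 0], [0, 1, 1]]   # branch build with one changed pixel
print(diff_regions(baseline, candidate))  # → [(1, 2)]
```

In the real workflow, the equivalent of that output is the set of highlighted regions Applitools presents in its dashboard for the engineer (and, for design sign-off, the UX team) to accept or reject.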
At CRUK, we also leverage Applitools to support design sign-off processes in collaboration with the UX team. By adding UX designers as viewers on the platform, we enable them to easily review visual changes and confirm alignment with design expectations. This collaborative workflow streamlines the approval process and ensures that visual updates meet both functional and aesthetic standards before release.
Other Tools
- Use of Real Devices - while there is no strict rule for testing on real devices at CRUK, we make a conscious effort to test applications on personal devices and any older hardware we have access to. This helps us validate real-world performance, touch interactions, and browser behaviour that may not be fully replicated in emulators or cloud-based platforms. Testing on actual devices adds an extra layer of confidence, especially for mobile responsiveness and device-specific quirks.
- Developer-mode browser based tools - Developer tools built into modern browsers — such as Chrome DevTools, Firefox Developer Edition, Safari Web Inspector, and Edge DevTools — are essential for inspecting and debugging applications during cross-browser and device testing. These tools allow engineers to simulate different screen sizes, inspect network activity, analyse performance, and identify layout or styling issues specific to each browser. They are especially useful for quick, targeted testing and troubleshooting in real-time.
Coverage
In terms of browser coverage, we test on popular browsers such as Chrome, Safari, Mozilla Firefox and Microsoft Edge, all of which can be installed on your local machine. We use Datadog analytics to monitor and analyse the browsers and devices most commonly used by our supporters; other project teams also use Google Analytics reports to collect this data. This helps us:
- Prioritise testing on high-traffic platforms.
- Identify emerging trends in user environments.
- Adjust our testing matrix dynamically based on real usage data.
At CRUK, we use Datadog to monitor real-world usage of our applications. When issues are reported or alerts are triggered, Datadog provides detailed information about the device and browser versions involved. This insight is invaluable for identifying gaps in our test coverage and ensuring that we prioritise testing on the platforms our supporters actually use. By aligning our cross-browser and device testing strategy with real usage data, we can deliver more reliable and user-focused experiences. Please note that data in Datadog is only retained for the last three months.
Slack Channels Using Third Party Tools
- cruk-applitools - For queries related to using Applitools
- crux-BrowserStack - For queries related to using BrowserStack