
Report Validates Impact of Visual AI on Test Automation


Empirical data from 288 quality engineers across 101 countries lend insight and credibility to a report demonstrating the benefits of Visual AI in the field of test automation. The report comes from Applitools, a company that sells functional and visual testing tools based on visual AI.

Visual AI is a form of computer vision that looks at applications in the same way as the human eye and brain do, with the advantage that it won't tire. AI tools add another facet to the growing array of automation assistance that enables test engineers to keep up with the rapid development pace and CI/CD demands behind today's modern apps.

The report claims that with Visual AI onboard, quality engineers were able to create test cases 5.8x faster (leading to an expansion in test coverage), increase test code stability by 3.8x, and catch more bugs in pre-production, with a 45% increase in efficiency enabling testers to provide more coverage.

The statistics in the report present an impressive set of performance benefits when compared with the leading open-source test frameworks (Cypress, Selenium and WebdriverIO). Although code-based frameworks are, by their nature, open, configurable and less refined, they provide a baseline for comparison that the vast majority of quality engineers and testers are familiar with. The report provides details of the processes and techniques the Applitools platform used in this study that drive up these efficiencies and validate the statistics. When taking a systems approach to DevOps environments, test automation tooling is not an area to skimp on; looking beyond users' enthusiasm for AI automation, there is a serious business case that managers can evaluate.

The market for test automation frameworks and tooling has seen a steady number of entrants, each with a different take on testing, organisation, collaboration and reporting. AI-powered cognitive vision presents some rather novel attributes: it reports only differences that are perceptible to users, ignoring invisible rendering, size and positioning differences. As companies work increasingly in the UI/UX web and mobile domains, having consistency across presentation layers is of significant benefit, making the work of quality teams more effective. The report describes visual AI as a serious "efficiency multiplier for automated testing", providing testers with a "superpower" that assesses every screen and page in minutes, without getting tired or making mistakes.
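The perceptual-difference idea can be illustrated with a minimal sketch. This is not Applitools' actual algorithm; the `tolerance` threshold, the per-channel comparison, and the nested-list image representation are all simplifying assumptions. The point is that deviations below a perceptual tolerance are treated as identical, so only human-visible changes are flagged:

```python
# Minimal sketch of perceptual screenshot comparison: flag only pixel
# differences a human would notice, ignoring sub-threshold rendering
# noise. An illustration of the concept, not a production algorithm.

def visible_differences(baseline, candidate, tolerance=16):
    """Return (x, y) positions whose colour difference exceeds `tolerance`.

    `baseline` and `candidate` are 2-D lists of (r, g, b) tuples.
    Anti-aliasing jitter below the tolerance is treated as identical.
    """
    diffs = []
    for y, (row_a, row_b) in enumerate(zip(baseline, candidate)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            # Max per-channel delta is a crude proxy for perceptibility.
            if max(abs(c1 - c2) for c1, c2 in zip(a, b)) > tolerance:
                diffs.append((x, y))
    return diffs

# A 2x2 "screenshot": one pixel shifts slightly (invisible), one turns red.
base = [[(250, 250, 250), (0, 0, 0)],
        [(250, 250, 250), (0, 0, 0)]]
cand = [[(245, 248, 252), (0, 0, 0)],      # sub-threshold rendering noise
        [(250, 250, 250), (200, 0, 0)]]    # visible change
print(visible_differences(base, cand))     # only the visible pixel: [(1, 1)]
```

A pixel-exact comparison would flag both changed pixels and fail the test on harmless rendering noise; the tolerance is what lets the check ignore differences no user would perceive.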

Applitools used a hackathon approach to gather real data for the report; this yields a data set from 288 independent test engineers who collectively spent over 80 workweeks recording their progress. With this number of data points, the study starts to resemble real-world scenarios in which different verticals, dev teams and cultures have been hands-on with the tool, investigating the benefits of visual AI as a specific branch of test automation. The main outcome of the report is that using an AI tool to monitor the rendering of UI/UX elements across the whole range of devices and browsers can improve the overall performance of the test team. Saving time and driving up quality, or in this case accuracy, carries obvious benefits for test teams; this report provides data and examples illuminating the case for visual AI.
