Image comparing tool usage for the UKAD website testing

During our manual testing of the UKAD website, we encountered a significant challenge. With a sprawling web presence exceeding a hundred pages, conducting individual reviews for each deployment proved excessively time-consuming. Moreover, the inherent monotony of such a task increased the risk of human error creeping in. It was evident that a more efficient approach was required.

In response, we opted to harness the capabilities of the pixel-by-pixel comparison tool, aShot. This tool presents a game-changing solution, enabling us to automate the testing of our GUI state. By doing so, we not only save time but also mitigate the potential for errors that arise from mundane and repetitive tasks. This transition reflects our commitment to streamlining processes and elevating the overall quality of our web development endeavors.

Here is the list of tools we used:

  • Java 7
  • aShot
  • Selenium WebDriver
  • Jenkins
  • Maven
  • TestNG
  • Xenu

First of all, we need to define the exact list of pages to test. For this purpose we use Xenu. This tool crawls the website URL, collects the list of page URLs, and then filters out unnecessary entries (see the screenshot below).
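The filtering step can be sketched in plain Java. Xenu itself does the crawling; the domain check and the list of excluded extensions below are illustrative assumptions, not Xenu's actual filter settings:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class UrlFilter {

    // Keep only same-domain HTML pages; drop assets and external links.
    // The extension list is an illustrative assumption.
    public static List<String> filterTestablePages(List<String> crawled, String domain) {
        List<String> pages = new ArrayList<String>();
        for (String url : crawled) {
            boolean sameDomain = url.startsWith(domain);
            boolean isAsset = url.matches(".*\\.(png|jpg|gif|css|js|pdf)(\\?.*)?$");
            if (sameDomain && !isAsset) {
                pages.add(url);
            }
        }
        return pages;
    }

    public static void main(String[] args) {
        List<String> crawled = Arrays.asList(
                "https://example.com/",
                "https://example.com/services",
                "https://example.com/logo.png",
                "https://cdn.example.com/lib.js");
        // Only the two same-domain pages survive the filter.
        System.out.println(filterTestablePages(crawled, "https://example.com"));
    }
}
```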

Now, we have to check the site manually to ensure that our current GUI state has no issues. It's necessary because the reference images will be created based on the current state of the website.

If the current site version has no issues, we are ready to run the first part of the tool, called UkadWebSite_CreateExpectedScreens. We have a separate Jenkins job on a remote server for this purpose. This part performs the following steps, in order:

  • Open each page from the list in the Chrome web browser
  • Take a screenshot of the whole page, from header to footer
  • Save the result as a separate .png file on the server, with a name that includes the page name, the ‘expected’ label, and the window size
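The naming scheme from the last step can be sketched as below. The exact pattern (separators, order of the parts) is our illustrative assumption; the job only needs the three parts to be present and unambiguous:

```java
public class ScreenshotName {

    // Builds a file name like "about-us_expected_1920x1080.png".
    // The exact pattern is an illustrative assumption.
    public static String build(String pageUrl, String label, int width, int height) {
        String page = pageUrl
                .replaceFirst("^https?://[^/]+/?", "")   // drop scheme and host
                .replaceAll("[^A-Za-z0-9]+", "-")        // sanitize for the file system
                .replaceAll("^-|-$", "");                // trim stray separators
        if (page.isEmpty()) {
            page = "home";
        }
        return page + "_" + label + "_" + width + "x" + height + ".png";
    }

    public static void main(String[] args) {
        // prints about-us_expected_1920x1080.png
        System.out.println(build("https://example.com/about-us/", "expected", 1920, 1080));
    }
}
```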

In this way we obtain the reference files for our testing.

Once the expected images are saved, the second part of the tool should be run after any change is made to the site. We have a separate Jenkins job for it, called UkadWebSite_CompareScreens. This part of the tool performs the following actions:

  • Open each page from the list in the Chrome web browser
  • Take a screenshot of the whole page, from header to footer
  • Save the result as a separate .png file on the server, with a name that includes the page name, the ‘actual’ label, and the window size
  • Compare the expected and actual files pixel by pixel
  • If any pixels of the reference and current files differ, mark them in red
  • Save the test result as a separate .gif file that cycles through the expected, actual, and difference .png files; the differing areas are highlighted in red and blink in the .gif
  • Send the combined .gif file, which includes the actual, expected, and difference images, to the email address saved in the system
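aShot performs the pixel-by-pixel comparison and red marking internally; the following is only a minimal sketch of the same idea using the standard BufferedImage API, assuming both screenshots have equal dimensions:

```java
import java.awt.image.BufferedImage;

public class PixelDiff {

    private static final int RED = 0xFFFF0000; // opaque red marker

    // Compares two same-size images pixel by pixel and returns a copy of
    // `actual` with every differing pixel painted red.
    public static BufferedImage diff(BufferedImage expected, BufferedImage actual) {
        BufferedImage result = new BufferedImage(
                actual.getWidth(), actual.getHeight(), BufferedImage.TYPE_INT_ARGB);
        for (int y = 0; y < actual.getHeight(); y++) {
            for (int x = 0; x < actual.getWidth(); x++) {
                int a = actual.getRGB(x, y);
                result.setRGB(x, y, a == expected.getRGB(x, y) ? a : RED);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        BufferedImage expected = new BufferedImage(2, 2, BufferedImage.TYPE_INT_ARGB);
        BufferedImage actual = new BufferedImage(2, 2, BufferedImage.TYPE_INT_ARGB);
        actual.setRGB(1, 1, 0xFF00FF00); // one changed pixel
        BufferedImage marked = diff(expected, actual);
        System.out.println(Integer.toHexString(marked.getRGB(1, 1))); // prints ffff0000
    }
}
```

In the real tool the two inputs are the ‘expected’ and ‘actual’ .png files loaded from the server, and the marked result is what gets composed into the blinking .gif.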

In the screenshot below we can see the actual, expected, and difference images, from left to right. On the rightmost image, the difference between the actual and expected images is highlighted in red.

By adopting this strategy, we effectively diminish the time-intensive nature of manual Quality Assurance (QA) procedures that entail reviewing each individual page of our website. This shift holds multiple advantages. Firstly, we gain the flexibility to schedule our tests during non-working hours, including weekends, nights, and holidays. This tactical timing ensures our website's robustness is continually evaluated without disrupting normal operations.
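For example, the comparison job can be triggered off-hours from the Jenkins "Build periodically" field using its cron-style syntax; the exact schedule below is illustrative:

```
# Jenkins "Build periodically" field — illustrative schedule:
# run the comparison nightly around 02:00 on weekdays, and on Saturday mornings
H 2 * * 1-5
H 6 * * 6
```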

Moreover, the integration of the aShot tool underpins our commitment to precision. Its pixel-level accuracy reliably catches even single-pixel regressions that a manual review can easily miss. This marked enhancement in efficiency makes automated testing substantially more effective than the manual alternative. Embracing this approach underscores our dedication to delivering a seamless digital experience to our users while optimizing our operational efficiency.

Author: Vikentiy Kelevich
