Saturday, July 21, 2018

Cross-Browser Testing

In today's Agile world, with two-week sprints and frequent releases, it's tough to keep on top of testing.  We often have our hands full with testing the stories from the sprint, and we rely on automation for any regression testing.  But there is a key component of testing that is often overlooked, and that is cross-browser testing.

Browser parity is much better than it was just a few years ago, but every now and then you will still encounter differences with how your application performs in different browsers.  Here are just a few examples of discrepancies I've encountered over the years:


  • A page that scrolls just fine in one browser doesn't scroll at all in another, or the scrolling component doesn't appear
  • A button that works correctly in one browser doesn't work in another
  • An image that displays in one browser doesn't display in another
  • A page that automatically refreshes in one browser doesn't do so in another, leaving the user feeling as if their data hasn't been updated

Here are some helpful hints to make sure that your application is tested in multiple browsers:

Know which browser is most popular with your users

Several years ago I was testing a business-to-business CRM-style application.  Our team's developers tended to use Chrome for checking their work, and because of this I primarily tested in Chrome as well.  Then I found out that over 90% of our end users were using our application in Internet Explorer 9.  This definitely changed the focus of my testing!  From then on, I made sure that every new feature was tested in IE 9, and that a full regression pass was run in IE 9 whenever we had a release.  

Find out which browsers are the most popular with your users, and be sure to test every feature in them.  This doesn't mean that you have to do the bulk of your testing there, but with every new feature and every new release you should be sure to validate all of the UI components in the most popular browsers.
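
If your product doesn't already have an analytics tool reporting this, even a rough tally of the User-Agent strings in your web server's access logs can give you a first approximation.  Here's a minimal Python sketch; the log path and the assumption that the User-Agent is the last quoted field (as in the common "combined" log format) are placeholders to adjust for your own environment.

```python
# Rough tally of browser families from a web server access log.
# Assumes the common "combined" log format, where the User-Agent is
# the last quoted field; adjust LOG_PATH and the parsing for your
# own environment.
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path

counts = Counter()
with open(LOG_PATH) as log:
    for line in log:
        quoted = re.findall(r'"([^"]*)"', line)
        if not quoted:
            continue
        user_agent = quoted[-1]
        # Order matters: Edge UAs also contain "Chrome", and
        # Chrome UAs also contain "Safari".
        if "Edg" in user_agent:
            counts["Edge"] += 1
        elif "Chrome" in user_agent:
            counts["Chrome"] += 1
        elif "Safari" in user_agent:
            counts["Safari"] += 1
        elif "Firefox" in user_agent:
            counts["Firefox"] += 1
        elif "Trident" in user_agent or "MSIE" in user_agent:
            counts["Internet Explorer"] += 1
        else:
            counts["Other"] += 1

for browser, count in counts.most_common():
    print(f"{browser}: {count}")
```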

Resize your browsers

Sometimes a browser issue isn't readily apparent because it only appears when the browser is using a smaller window.  As professional testers, we are often fortunate to be issued large monitors on which to test.  This is great, because it allows us to have multiple windows open and view one whole webpage at a time, but it often means that we miss bugs that end users will see.  

End users are likely not using a big monitor when they are using our software.  Issues can crop up, such as a vertical or horizontal scrollbar not appearing or not functioning properly; text not resizing, so it runs off the page and is not visible; or images not appearing or taking up too much space on the page.

Be sure to build page resizing into every test plan for every new feature, and build it into a regression suite as well.  Find out what the minimum supported window size should be, and test all the way down to that level, with variations in both horizontal and vertical sizes.  
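
Window resizing is also easy to fold into an automated check.  Here's a minimal Selenium sketch in Python; the URL, the element locator, and the list of sizes are placeholders for your own application and its minimum supported window size.

```python
# Minimal Selenium sketch: load a page at several window sizes and
# confirm a key element is still visible. The URL, element locator,
# and size list are placeholders for your own application.
from selenium import webdriver
from selenium.webdriver.common.by import By

SIZES = [(1920, 1080), (1366, 768), (1024, 768), (800, 600)]

driver = webdriver.Chrome()
try:
    for width, height in SIZES:
        driver.set_window_size(width, height)
        driver.get("https://example.com/app")  # hypothetical URL
        button = driver.find_element(By.ID, "submit-button")  # hypothetical locator
        assert button.is_displayed(), f"submit-button hidden at {width}x{height}"
finally:
    driver.quit()
```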

Assign each browser to a different tester

When doing manual regression testing, an easy way to make sure that all browsers you want to test are covered is by assigning each tester a different browser.  For example, if you have three testers on your team (including yourself), you could run your regression suite in Chrome and Safari, another tester could run the suite in Firefox, and a third tester could run the suite in Internet Explorer and Edge.  The next time the suite is run, you can swap browsers, so that each browser will have a fresh set of eyes on it.  

Watch for changes after browser updates

It's possible that something that worked great in a browser suddenly stops working correctly when a new version of the browser is released.  It's also possible that a feature that looks great in the latest version of a browser doesn't work in an older version.  Many browsers, like Chrome and Firefox, are set to update themselves automatically with every release, but some end users may have turned this feature off, so you can't assume that everyone is using the latest version.  If you have a spare testing machine, it's often helpful to keep the next-to-last release of each browser installed on it.  That way you can identify any discrepancies between the old browser version and the new one.
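
One small habit that helps here is recording the browser name and version at the start of every automated run, so a failure can be traced to a specific release.  A minimal sketch, assuming the Selenium 4 Python bindings:

```python
# Log the browser name and version at the start of an automated run
# so any failure can be traced to a specific browser release.
# "browserVersion" is the Selenium 4 capability name; some older
# drivers report "version" instead.
from selenium import webdriver

driver = webdriver.Chrome()
caps = driver.capabilities
name = caps.get("browserName", "unknown")
version = caps.get("browserVersion", caps.get("version", "unknown"))
print(f"Running against {name} {version}")
driver.quit()
```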

Use visual validation in your automated tests

Generally, automated UI tests focus on the presence of elements on a web page.  This is great for functional testing, but the presence of an element doesn't tell you whether or not it is appearing correctly on the page.  This is where a visual testing tool like Applitools comes in.  Applitools coordinates with UI test tools such as Selenium to add a visual component to the test validation.  In the first test run, Applitools is "taught" what image to expect on a page.  Then in all subsequent runs, it takes a screenshot of the page and compares it with the correct image that it has saved.  If the image fails to load or is displayed incorrectly, the UI test will fail.  Applitools is great for cross-browser testing, because you can train it to expect different results for each browser type, version, and screen size.
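
As a rough illustration of how this fits into an existing Selenium test, here's a minimal sketch using the Applitools Eyes Python SDK; the app name, test name, URL, and viewport size are placeholders, and the exact API calls may vary by SDK version.

```python
# Rough sketch of adding a visual checkpoint to a Selenium test with
# the Applitools Eyes Python SDK. The app name, test name, URL, and
# viewport size are placeholders; check the SDK docs for the exact
# API in your version.
from selenium import webdriver
from applitools.selenium import Eyes

driver = webdriver.Chrome()
eyes = Eyes()
eyes.api_key = "YOUR_APPLITOOLS_API_KEY"  # placeholder

try:
    # The viewport size lets the same check run at different screen
    # sizes, which helps with cross-browser and cross-size coverage.
    eyes.open(driver, "My App", "Login page renders correctly",
              {"width": 1024, "height": 768})
    driver.get("https://example.com/login")  # hypothetical URL
    # The first run saves a baseline screenshot; later runs are
    # compared against it and fail on visual differences.
    eyes.check_window("Login page")
    eyes.close()
finally:
    eyes.abort_if_not_closed()
    driver.quit()
```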

Browser differences can greatly impact the user experience!  If you build manual and automated checks for discrepancies into your process, you can ensure a better user experience with a minimum of extra work.

You may have noticed that I didn't discuss the mobile experience at all in this post.  That's because next week, I'll be focusing solely on the many challenges of mobile testing!

7 comments:

  1. Thank you Kristin for writing this up. I totally agree on all points, especially that as a testing team we should be aware of which browsers real users are on in production and plan our test activities accordingly. One thing we do at our company is use a third-party tool called 'Pendo' (which mainly helps us put help content into our web app for users); it also gives us information about user activity, such as which application pages are most commonly used in production, which browsers are used most, and related metrics. That helps us understand things better and plan our testing activities. :)

