In the first two articles of our blog series, “Web Analytics 101: The Digital Publisher’s Guide to Better Data,” we discussed invalid traffic, how to identify bots and the four benefits of cleaner website data.
Here we look at specific steps publishers can take to identify invalid traffic and gain a better understanding of how human audiences engage with their content.
1. Use automatic bot filtering tools
Many analytics platforms have built-in tools to filter bot traffic. While these filters might not catch all bots, they are a good starting point to achieve cleaner data. Often these tools must be activated by the website’s administrator to begin filtering bots. Google Analytics, Adobe Analytics and many ad services and fraud detection companies find bots through the IAB/ABC International Spiders and Bots List, which is managed and maintained by AAM.
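For publishers who also collect data server side, the same idea can be applied before a hit is ever recorded. Below is a minimal Python sketch of user-agent screening; the pattern list and function names are illustrative assumptions only, since the licensed IAB/ABC International Spiders and Bots List is far more comprehensive and is not reproduced here:

```python
import re

# Illustrative patterns only; the licensed IAB/ABC list is much larger.
BOT_UA_PATTERNS = [r"bot", r"crawler", r"spider", r"curl", r"python-requests"]
BOT_UA_REGEX = re.compile("|".join(BOT_UA_PATTERNS), re.IGNORECASE)

def is_probable_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known bot pattern."""
    return bool(user_agent) and BOT_UA_REGEX.search(user_agent) is not None

# Example: skip recording a pageview for an obvious crawler.
if is_probable_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"):
    print("Bot detected - exclude this hit from analytics")
```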
2. Identify traffic sources and spikes
Web analytics platforms provide performance data such as the number of visitors and pages viewed, as well as where these visitors came from. This information is usually found under the ‘acquisition’ category, which shows whether traffic originated from organic search, email, a referring website or another source.
It also helps to identify the sources of traffic spikes, which occur when far more visitors than usual hit a page. By identifying the source of a spike, publishers can determine whether it was caused by legitimate activity, such as an article being shared on another website or social channel, or by bots. Since referrer information can be spoofed by bad actors, other identifiers also need to be considered to accurately determine which marketing efforts are driving human traffic so that successful efforts can be duplicated.
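As a rough illustration, a short script can scan exported daily pageview counts for days that sit well above the recent trend. The sketch below is a minimal Python example using made-up numbers and a simple standard-deviation heuristic; real analytics exports, and the anomaly-detection features built into the platforms themselves, are more sophisticated:

```python
from statistics import mean, stdev

def find_spikes(daily_pageviews, window=7, threshold=3.0):
    """Flag days whose pageviews sit far above the trailing window's average.

    daily_pageviews: list of (date_string, pageview_count) tuples,
    assumed to be exported from an analytics platform and sorted by date.
    """
    spikes = []
    for i in range(window, len(daily_pageviews)):
        history = [count for _, count in daily_pageviews[i - window:i]]
        avg, sd = mean(history), stdev(history)
        date, count = daily_pageviews[i]
        # Flag a day when it exceeds the trailing average by `threshold`
        # standard deviations (a common, simple heuristic).
        if sd > 0 and (count - avg) / sd > threshold:
            spikes.append((date, count))
    return spikes

# Made-up daily pageview counts; the last day is an obvious spike.
traffic = [("2024-01-0%d" % d, v) for d, v in
           enumerate([1200, 1150, 1300, 1250, 1180, 1220, 1270, 9800], start=1)]
print(find_spikes(traffic))  # flags the 9,800-pageview day for investigation
```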
3. Create custom bot filters
Once bots are identified, publishers can create custom filters to remove this traffic from their data. Bots can often be spotted through characteristics such as the time spent on a page, bounce rate and the time of day the visit occurred. Identifying these unusual traffic patterns and filtering them out gives publishers a more accurate look at how humans interact with their pages, which leads to better decisions about content and promotions.
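For publishers who export raw session data, the same heuristics can be applied in a short script before reporting. The sketch below is a minimal Python example; the field names, thresholds and scoring are assumptions for illustration and should be tuned to the patterns seen in a publisher’s own data, not treated as a definitive detection rule:

```python
def looks_like_bot(session):
    """Heuristic check on a single exported session record.

    Field names and thresholds are illustrative assumptions only.
    """
    suspicious = 0
    if session["time_on_page_seconds"] < 2:
        suspicious += 1                       # near-instant exit
    if session["bounced"]:
        suspicious += 1                       # single-page visit
    if session["hour_of_day"] in range(2, 5):
        suspicious += 1                       # unusual overnight hours
    return suspicious >= 2                    # require multiple signals

sessions = [
    {"time_on_page_seconds": 1, "bounced": True, "hour_of_day": 3},
    {"time_on_page_seconds": 95, "bounced": False, "hour_of_day": 14},
]

human_sessions = [s for s in sessions if not looks_like_bot(s)]
print(len(human_sessions))  # 1 - only the human-looking session remains
```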
4. Check website tags
Analytics providers require that publishers install a tag on their website before data can be collected. If a tag is inadvertently installed more than once, the analytics data will be inflated; if it is installed incorrectly, the data might be incomplete. Publishers should check their tag containers to ensure there is only one tag per tracking suite so data isn’t counted twice in reports.
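A quick way to sanity-check this is to fetch a page’s HTML and count how many times the tag loader appears. The sketch below is a minimal Python example using only the standard library; the URL and tag pattern are placeholders, and a tag manager’s preview mode or the analytics provider’s own debugging tools remain the authoritative check:

```python
import re
import urllib.request

def count_tag_installs(page_url, tag_pattern=r"googletagmanager\.com/gtag/js"):
    """Count how many times an analytics tag snippet appears in a page's HTML.

    `tag_pattern` defaults to the Google tag loader URL as an example; swap in
    the snippet used by your own analytics or tag-management provider.
    """
    with urllib.request.urlopen(page_url) as response:
        html = response.read().decode("utf-8", errors="ignore")
    return len(re.findall(tag_pattern, html))

# Example usage (placeholder URL): more than one match suggests the tag is
# installed twice and pageviews may be double-counted.
count = count_tag_installs("https://www.example.com/")
print("Tag found %d time(s)" % count)
```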
5. Participate in a third-party website audit
If your team needs assistance performing any of the above steps, a third-party audit can help. The AAM Digital Publisher Audit analyzes publishers’ websites to ensure they are attracting human audiences for their advertisers. Participating in the audit also helps publishers get a more accurate look at their data. AAM’s team helps publishers identify bots, create custom filters and ensure that websites are tagged correctly for traffic reporting. With more than 25 years of digital audit experience, our team stays on top of industry updates and changes so publishers have an independent third-party partner.
Learn more about how digital audits help publishers improve site analytics in the fourth article in our “Web Analytics 101” series, How Digital Publisher Audits Improve Site Analytics.