First Google Webmaster Tools Update in 2014

Cyrus Shepard (via Thomas Høgenhaven) has just reported an exciting update to Google Webmaster Tools. When you log in and navigate to “Search Queries”, you’ll see a timeline annotation saying: “An improvement to our top search queries data was applied retroactively on 12/31/13”:

GWT Update 2014

Click on “Top Pages” and you’ll notice a vast amount of new and improved keyword data for your URLs:

Keyword Data

According to Thomas Høgenhaven, the data is now accurate rather than just rounded numbers.

There are currently noticeable performance issues in Webmaster Tools, particularly when a URL is expanded: loading keywords takes a while and sometimes times out.

Data Comparison

We’ve compared our query data backups for the same website from 1 December 2013 to 31 December 2013 and found that the data before and after the update is indeed different:

Before:
Impressions: 552,000
Clicks: 7,350

After:
Impressions: 548,941 (difference: 3,059)
Clicks: 7,305 (difference: 45)
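
If you keep your own export backups, running the same comparison takes only a few lines. The sketch below assumes two CSV backups with “Impressions” and “Clicks” columns; the filenames and column headings are illustrative and should be adjusted to match your own exports:

```python
import csv

def totals(path):
    """Sum impressions and clicks across a Webmaster Tools query export CSV."""
    impressions = clicks = 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            impressions += int(row["Impressions"].replace(",", ""))
            clicks += int(row["Clicks"].replace(",", ""))
    return impressions, clicks

before = totals("queries_before_update.csv")  # backup taken before the change
after = totals("queries_after_update.csv")    # same site, exported afterwards

print(f"Impressions: {before[0]:,} -> {after[0]:,} (difference: {abs(before[0] - after[0]):,})")
print(f"Clicks: {before[1]:,} -> {after[1]:,} (difference: {abs(before[1] - after[1]):,})")
```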

Data Accuracy Benefits

The main benefit of increased accuracy in Webmaster Tools data is the ability to predict scenarios and focus your priorities on keywords and pages with greater potential to grow and generate revenue. We’ve built our own version of the phrase potential calculator, available at http://www.phraseresearch.com. The tool takes a raw, unaltered Google Webmaster Tools query export CSV and looks up the position for each keyword. The end result is a table sortable by potential score, or in other words, a list of keywords likely to bring the greatest benefit with the least amount of effort.
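
The idea is easy to prototype yourself. The sketch below reads a query export and ranks phrases by a simple score; the scoring formula and the column names (“Query”, “Impressions”, “Clicks”, “Avg. position”) are our own assumptions for illustration, not the exact logic used on phraseresearch.com:

```python
import csv

def potential(row):
    """Illustrative potential score: many impressions with a weak CTR and a
    position away from the top suggest easy gains. This formula is an
    assumption, not the exact scoring used by the phrase potential calculator."""
    impressions = int(row["Impressions"].replace(",", ""))
    clicks = int(row["Clicks"].replace(",", ""))
    position = float(row["Avg. position"])
    ctr = clicks / impressions if impressions else 0.0
    return impressions * (1 - ctr) * (1 - 1 / position) if position else 0.0

# Column names follow a typical Webmaster Tools top queries export;
# adjust them to match your own file.
with open("gwt_query_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

for row in sorted(rows, key=potential, reverse=True)[:20]:
    print(f'{row["Query"]:<40} potential score: {potential(row):,.0f}')
```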

Random URL Test

We picked a random, low-traffic URL which would have been affected by the recent improvements in keyword reporting and compared the data in Google Analytics, Google Webmaster Tools and the raw server log file. Here are the results:

Webmaster Tools
Unique Visitors: N/A
Google Organic: 47
Encrypted: N/A
Non-Encrypted: N/A
Phrases: [how to visualise] (2), [large websites] (1)

Google Analytics
Unique Visitors: 151
Google Organic: 48
Encrypted: 48
Non-Encrypted: 0
Phrases: not provided (48)

Server Log Files
Unique Visitors: 283
Google Organic: 53
Encrypted: 52
Non-Encrypted: 1
Phrases: [how to visualise] (1)
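
The server log figures above come from counting Google referrals directly in the raw access log. A minimal sketch of that count, assuming a standard combined log format and splitting encrypted from non-encrypted visits by whether a q= parameter survives in the referrer, might look like this (the file name and the “organic” test are simplifications):

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Assumes an Apache/nginx "combined" log where the last two quoted fields
# are the referrer and the user agent; the filename is illustrative.
referrer_re = re.compile(r'"(?P<ref>[^"]*)" "[^"]*"\s*$')

google_organic = encrypted = non_encrypted = 0
phrases = Counter()

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = referrer_re.search(line)
        if not match:
            continue
        ref = urlparse(match.group("ref"))
        if "google." not in ref.netloc:
            continue  # crude test for a Google referral
        google_organic += 1
        query = parse_qs(ref.query).get("q", [""])[0]
        if query:
            non_encrypted += 1   # keyword passed in the clear
            phrases[query] += 1
        else:
            encrypted += 1       # secure search strips the query string

print("Google organic:", google_organic)
print("Encrypted:", encrypted, "Non-encrypted:", non_encrypted)
print("Phrases:", phrases.most_common(10))
```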

Why Doesn’t the Data Match?

I hear many complaints that Google Webmaster Tools data is not reliable enough to be actionable. When assessing the accuracy of the data, many people use Google Analytics as a benchmark. Can you see how this could be a problem? Google Analytics data is not, and has never been, about absolute accuracy: it is affected by factors such as data sampling at large volumes and technical considerations, including the presence and reliability of JavaScript on tracked URLs. As usual, if you want absolute accuracy you can always turn to log file analysis. Our tests are yet to show a report in Analytics that matches log file data.

Privacy Considerations

Google Webmaster Tools used to omit queries with fewer than ten clicks/impressions, and last year I was told that one of the reasons for this was to protect user privacy (for example: somebody typing in their email, password or other sensitive information and somehow reaching your site).

“To protect user privacy, Google doesn’t aggregate all data. For example, we might not track some queries that are made a very small number of times or those that contain personal or sensitive information.” Source: Google

Now that the click limit has been lifted, there simply has to be another safeguard in place. Perhaps Google’s systems can detect unusual queries which deviate from the usual pattern on a semantic level and exclude them from Webmaster Tools reports. We’ll try to get an answer from Google and report back.
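
To be clear, we don’t know how such a safeguard would work; the snippet below is only a toy illustration of the kind of filtering we have in mind, flagging queries that look like they contain an email address, a long card-like digit run or a password:

```python
import re

# Speculative illustration only - these patterns are our guess at what a
# "sensitive query" filter might look for, not Google's actual mechanism.
SENSITIVE_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),   # email address
    re.compile(r"\b\d{13,19}\b"),              # long digit run (card-like)
    re.compile(r"\bpassword\b", re.IGNORECASE),
]

def looks_sensitive(query: str) -> bool:
    return any(p.search(query) for p in SENSITIVE_PATTERNS)

print(looks_sensitive("how to visualise large websites"))  # False
print(looks_sensitive("jane.doe@example.com password"))    # True
```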

Phrase Clustering

Another thing which is not obvious straight away is that Google Webmaster Tools data is merged into similar groups. For example, any of the surrounding search terms ends up displayed as the search phrase in the centre of the following diagram:

Keyword Merging

At first you may think that the four phrase variations are missing from your data set. Instead, they’re camouflaged behind the “canonical phrase” and contribute towards its totals, including:

  1. Impressions
  2. Clicks
  3. Position
  4. CTR
  5. Change Parameters

To be fair, most of the above queries are likely to trigger the same document in search results anyway, and it makes sense to show the cumulative figure instead of inflating the report with search phrase variants.
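
Google doesn’t disclose how variants are grouped, so the sketch below uses a deliberately naive stand-in rule (collapsing word re-orderings into one key) purely to show how clicks and impressions could accumulate under a canonical phrase, with CTR recomputed and position averaged by impression weight:

```python
from collections import defaultdict

# Toy data: (query, impressions, clicks, average position).
variants = [
    ("google rank checker", 400, 20, 3.0),
    ("rank checker google", 50, 2, 5.0),
    ("checker google rank", 30, 1, 6.0),
]

def canonical(query):
    # Assumed grouping rule for this sketch only: word order is ignored.
    return " ".join(sorted(query.split()))

groups = defaultdict(lambda: {"impressions": 0, "clicks": 0, "weighted_pos": 0.0})
for query, impressions, clicks, position in variants:
    g = groups[canonical(query)]
    g["impressions"] += impressions
    g["clicks"] += clicks
    g["weighted_pos"] += position * impressions

for phrase, g in groups.items():
    ctr = g["clicks"] / g["impressions"]
    avg_pos = g["weighted_pos"] / g["impressions"]
    print(f'[{phrase}] impressions={g["impressions"]} clicks={g["clicks"]} '
          f"CTR={ctr:.1%} position={avg_pos:.1f}")
```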

Data Limit

Finally, Google Webmaster Tools limits top search queries to 2,000 search phrases:

“Specific user queries for which your site appeared in search results. Webmaster Tools shows data for the top 2,000 queries that returned your site at least once or twice in search results in the selected period. This list reflects any filters you’ve set (for example, a search query for [flowers] on google.ca is counted separately from a query for [flowers] on google.com).” Source: Google

This means that you are likely to recover only a fraction of the lost keyword data if you run a very large site (for example, an online store or a catalogue). Note: we checked the data and it appears the original limit has been lifted, but the question remains: are there any limits currently in place?
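
If you suspect your own export is being capped, a quick row count against the documented limit is an easy sanity check (the 2,000 figure comes from the help text quoted above; the file name is illustrative):

```python
import csv

DOCUMENTED_CAP = 2000  # limit quoted in Google's help text; may have changed

with open("gwt_query_export.csv", newline="", encoding="utf-8") as f:
    row_count = sum(1 for _ in csv.DictReader(f))

if row_count >= DOCUMENTED_CAP:
    print(f"{row_count} rows - the export may be truncated at the reporting limit.")
else:
    print(f"{row_count} rows - below the documented cap.")
```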

More on this subject: https://dejanmarketing.com/tail-chase/

Official Announcement

Google’s Webmaster Central Blog has just covered the update in a new post titled “More detailed search queries in Webmaster Tools”, saying:

“The search queries feature gives insights into the searches that have at least one page from your website shown in the search results. It collects these “impressions” together with the times when users visited your site – the “clicks” – and displays these for the last 90 days.”

According to John Mueller from Google, this update will roll out completely over the next few days.

In other news, Google is also rolling out a separate update that will show better search query data for websites with separate mobile sites.

“If you have a separate site for smartphones (eg m.example.com) then this will make it a bit easier for you to keep track of the queries going there.”

 

Dan Petrovic, the managing director of DEJAN, is Australia’s best-known name in the field of search engine optimisation. Dan is a web author, innovator and a highly regarded search industry event speaker.
ORCID iD: https://orcid.org/0000-0002-6886-3211
