Google Webmaster Central Hangout: 15-02-2012
Google Webmaster Central hangouts are a great way for webmasters to ask questions and provide feedback to Google’s webmaster team. How much can you fit into a single session? A lot, apparently.
For those who missed this week’s hangout, we’ve prepared a summary of the topics discussed.
- Google Webmaster Tools
- Feature Requests & Ideas
- Crawling & Indexation
- Site Speed
- URL Discovery
- Server Load
- Crawling Prioritisation
- Crawl Rate Scheduling
- Risks of Variable (constantly changing) Content
- New Penalty Type?
- Link Devaluation
- YouTube Changes
- Result Sorting
- +1 Canonicalisation
- Hangout & Blog Topics:
  - Faceted Navigation
  - Large Websites
  - Web Search Pipeline
  - Q&A From Google Forums
  - Topic List & Voting
- Affiliate Marketing & Content Duplication
Suggested topics for future hangouts and blog posts included:
- Google+ features, how it works, and future integration with search
- Authorship, including verification procedures, common problems, and the impact of authorship on search
- Faceted navigation, best practices
- Managing SEO on very large websites
- Taking questions from Google webmaster forums and answering in hangouts
- Being able to submit and up-vote topics for hangouts and blog posts by Google
The discussion continued with a series of interesting questions about the nofollow attribute, its relationship to robots.txt, and the introduction of rel="nofollow" on links to deter spam and link-buying practices.
Does Google follow and index nofollow links? And why use nofollow on internal pages, where blocking PageRank makes little sense?
Nofollow essentially prevents PageRank from passing, but URL discovery is still allowed. PageRank still affects crawl depth, now in combination with additional factors. Google is no longer limited in how many URLs it can discover, but it does use PageRank and other metrics to decide what goes into the index and gets cached; websites with low PageRank get a low crawl budget.
Does speed affect crawling and indexing?
Yes. Site speed affects the crawl rate in cases where it severely hurts user experience and Google’s ability to crawl; otherwise Google is pretty persistent. There are two types of page speed, ‘response time’ and ‘rendering time’, the latter being more about user experience than search engine accessibility.
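The follow-versus-nofollow distinction above comes down to a `rel="nofollow"` attribute on each link. As a minimal sketch (not any official Google tooling, just Python’s standard-library HTML parser), here is one way to audit which links on a page carry the attribute:

```python
from html.parser import HTMLParser

class LinkAuditParser(HTMLParser):
    """Collects links from an HTML page, separating those marked
    rel="nofollow" (which pass no PageRank) from ordinary links."""

    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel may hold several space-separated tokens, e.g. "nofollow noopener"
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

html = '''<a href="/about">About</a>
<a href="http://example.com/paid" rel="nofollow">Sponsor</a>'''
parser = LinkAuditParser()
parser.feed(html)
print(parser.followed)     # ['/about']
print(parser.nofollowed)   # ['http://example.com/paid']
```

Note that, per the discussion above, even the nofollowed URLs may still be discovered by Google; the attribute only stops PageRank from flowing.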
Can we have scheduled or timed crawling to prevent Google from hammering the site on certain days and times?
The feature doesn’t exist, but webmasters can manually limit the crawl rate via Google Webmaster Tools.
Could having one portion of your page content update frequently (on each new crawl) be a problem or cause a penalty?
Not really a big issue.
Is there a new type of link devaluation replacing the usual penalty for websites with partially spammy link profiles?
Not a new feature; Google has always done this. The links they typically ignore are those from spammed-out pages (forums, comments, etc.). It has nothing to do with a low barrier to entry.
Why is YouTube placing a nofollow tag on new design pages? Don’t they trust their publishers anymore?
Not sure. Ask YouTube team, Google doesn’t really mess with their decisions directly.
How does Google+ search order results (non-personalised)?
No answer. Smile.
Does Google+ consolidate all +1’s into a single canonical URL?
Yes, +1 picks up rel=canonical automatically and credits the +1 count to the main (canonical) URL variant.
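The consolidation behaviour described above can be pictured with a small sketch: each URL variant declares a canonical URL (via its `<link rel="canonical">` tag), and the +1 counts on the variants are credited to that canonical. The URLs and counts below are made-up illustration data, not anything from the hangout:

```python
from collections import defaultdict

# Hypothetical mapping from each URL variant to its declared canonical URL.
pages = {
    "http://example.com/item?ref=twitter": "http://example.com/item",
    "http://example.com/item?sort=price":  "http://example.com/item",
    "http://example.com/item":             "http://example.com/item",
}

# Hypothetical +1 counts observed on each variant.
plus_ones = {
    "http://example.com/item?ref=twitter": 3,
    "http://example.com/item?sort=price":  1,
    "http://example.com/item":             5,
}

# Credit every variant's +1s to its canonical URL.
consolidated = defaultdict(int)
for url, canonical in pages.items():
    consolidated[canonical] += plus_ones.get(url, 0)

print(dict(consolidated))  # {'http://example.com/item': 9}
```

The upshot for webmasters: as long as the variants point at the same canonical, +1s aren’t fragmented across tracking parameters and sort orders.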
Downloading data from Google Webmaster Tools, what’s the best way?
There is a PHP script that does this, and a Python version is also available, though it may be hard for small business owners to implement due to its complexity. We suggested allowing webmasters to save data from Google Webmaster Tools to Google Docs, or alternatively to opt in to preserving their data longer than the default. The reason Google Webmaster Tools erases data after roughly 40 days is that most webmasters sign up for an account but don’t really use it to observe traffic and ranking data, which is wasted space and resources for Google.
Will there ever be Google Webmaster Tools Premium?
Will authorship be supported in Italy soon?
Not sure, but hopefully.
Can affiliate URL content outrank original content publisher and cause SEO problems?
“Many of our affiliates require a feed containing product info. I have heard of some ecommerce sites being hit by a duplicate content penalty when affiliates have used their original content. What would you advise in this situation? Not offering them any content is not an option.”
Google attempts to serve only one relevant copy. The winner is the best site in Google’s eyes, and that could be an affiliate.
Would a time delay help in order for Google to see our content as being the original?
Yes, it helps a bit in determining the original publisher, but there is more to it. Photographers, for example, have original artwork on their sites; if somebody else uses that photo and surrounds it with better content, their page may rank better in Google.