Google Instant Previews: Unanswered Questions

Known Unknowns That Affect Your Searchers and Site

Now that we’ve had Google Instant Previews for a week, a little more information about how they work has come to light, but there are still some things we don’t know that will affect the experience of searchers looking for your site. It may seem laborious to keep harping on this, but if any sizable portion of your traffic comes from Google, then it affects your business. So we’re going to harp a little more.

As we mentioned earlier, in Google’s official blog post announcing the feature, Raj Krishnan, a Web Search Product Manager at Google, said:

Once you click the magnifying glass, we load previews for the other results in the background so you can flip through them without waiting.

For the geeks here at [meta]marketer, this is a little imprecise. Raj uses the word load rather than the word generate. We found this interesting, and Google is remaining…let’s say…tight-lipped about how this actually works. But in the comments on a post on the Google Webmaster Central blog, a Google engineer shared this little tidbit:

we use normal crawling to create these previews (on-the-fly accessing is only used for cases where we don’t have recent, complete data from crawling)

which explains Raj’s choice of the word load. And introduces a number of questions.

Bots vs Agents

Some previews will be generated by the GoogleBot and some will be generated by the Google Web Preview Agent we mentioned before. Is there a practical distinction?

  • GoogleBot respects (to some extent) your robots.txt file. Is there anything disallowed in your robots.txt that Google would need to render accurate previews? Like image directories (see the sketch after this list)? Does the Agent respect robots.txt?
  • Bots have not historically had the capability to execute Javascript, plugins, and the like; they just read flat files. Agents, however, are usually Web Browsers and do have this capability. Does GoogleBot have enough brains to properly interpret Javascript on sites that rely on it for layout and presentation? Flash is currently unsupported, but will GoogleBot eventually interpret Flash? Will the Agent support Flash, and will Flash support be identical, given the technological differences between bots and agents?
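
To make the robots.txt question concrete, here’s a minimal, hypothetical example; the directory names are invented, and whether the preview renderer honors these rules at all is exactly what we don’t know.

    # Hypothetical robots.txt: assets are blocked from crawlers.
    # If whatever renders the preview honors these rules, the screenshot
    # could be missing exactly the files that make the page look right.
    User-agent: *
    Disallow: /images/
    Disallow: /css/
    Disallow: /js/

If your robots.txt looks anything like this, it’s worth asking whether the bot or Agent that draws your preview can actually fetch what it needs.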

Crawls and “Recent” “Complete” Data

If the previews are mostly generated when your site is crawled by the GoogleBot, there’s also a “freshness” issue with the previews.

  • If a page is updated more frequently than it is crawled, will the searcher be seeing stale previews? Or – to ask that question a different way – how long are previews cached for? How is “recency” of the data measured?
  • What does “complete” data mean? If we presume that GoogleBot doesn’t understand Javascript, but the Agent does, is on-page Ajax enough to consider the data incomplete and generate a preview on-the-fly?
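
As a concrete (and entirely hypothetical) illustration of that second question, consider a page that fills in its main content with an Ajax call after load. The endpoint and element names below are invented for the example; the point is simply that the flat HTML a bot reads and the page a JavaScript-capable agent renders can look very different.

    // Sketch of on-page Ajax, in TypeScript: the flat HTML contains only an
    // empty <ul id="news">. A crawler reading the raw file sees an empty list;
    // a renderer that runs JavaScript sees the populated one. Which state ends
    // up in the preview?
    async function loadNews(): Promise<void> {
      const list = document.getElementById("news");
      if (!list) return;
      const response = await fetch("/api/news.json"); // hypothetical endpoint
      const items: { title: string }[] = await response.json();
      list.innerHTML = items.map((item) => `<li>${item.title}</li>`).join("");
    }

    window.addEventListener("DOMContentLoaded", () => { void loadNews(); });

If Google treats that empty list as “incomplete” data, the preview presumably gets generated on-the-fly by the Agent instead; if not, searchers may be looking at a screenshot of a mostly empty page.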

An Example: Does This Make Me Look Flat?

How previews appear is – obviously – going to be the single most important factor for searchers, and flawed previews are likely to affect traffic volume. Just a few days ago, Apple.com’s preview looked like this:

[Image: Google Instant Preview of Apple.com]

And today it looks like this:

[Image: Google Instant Preview of Apple.com (Fixed)]

The difference? The site hasn’t changed, but the preview is now depicting the Apple homepage in the state that occurs after a 3-stage fade-in of site elements (accomplished with Javascript). While Apple.com provides an engaging customer experience, the original preview did not extend that experience to the searcher.
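
We have no visibility into how Apple’s page is actually built, but the general pattern is easy to sketch. In the hypothetical TypeScript below, elements start out transparent and a script fades them in one by one; a renderer that never runs the script would capture the page in its mostly blank, pre-fade state.

    // Hypothetical 3-stage fade-in: elements marked .fade-in start at opacity 0
    // (via an inline style or a stylesheet rule) and are revealed by script.
    // A screenshot taken without running this code shows the blank starting state.
    function fadeInStages(): void {
      const stages = Array.from(document.querySelectorAll<HTMLElement>(".fade-in"));
      stages.forEach((element, index) => {
        window.setTimeout(() => {
          element.style.transition = "opacity 0.5s";
          element.style.opacity = "1";
        }, index * 500); // three elements -> a three-stage fade-in
      });
    }

    window.addEventListener("load", fadeInStages);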

Rank Speculation and Recommendation

One possible explanation is that Google implemented a major technological change just days after release; perhaps the preview generation didn’t support Javascript on Day 0, but by Day 7 it did. That’s possible, but at this point we don’t have any word from Google or enough data to know. If Google announced a major new feature that didn’t fully support the technology on your site, what would you do? Wait and hope for the best?

A second possible explanation is that Apple wasn’t happy with the empty preview and adapted their site implementation to render an accurate preview. If that’s what happened, it’s a good example of being responsive to major changes that affect customer experience, and a reminder that businesses that rely on Google for traffic (and customers) need to be ready to adapt quickly.
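
Again, we don’t know what (if anything) Apple changed; the snippet below is just one hedged guess at how a site could adapt. The idea is to make the fully laid-out page the default and let the script opt in to the animation, so a renderer that never runs JavaScript still captures the finished state.

    // Hypothetical adaptation: content is fully visible in plain HTML/CSS.
    // Only when JavaScript runs do we hide the elements and animate them in,
    // so a non-JavaScript renderer always sees the finished layout.
    function progressiveFadeIn(): void {
      const stages = Array.from(document.querySelectorAll<HTMLElement>(".fade-in"));
      stages.forEach((element, index) => {
        element.style.opacity = "0"; // hide only now that script is running
        window.setTimeout(() => {
          element.style.transition = "opacity 0.5s";
          element.style.opacity = "1";
        }, index * 500);
      });
    }

    window.addEventListener("DOMContentLoaded", progressiveFadeIn);

Hiding the elements only after the script runs is what keeps the no-JavaScript rendering (and, presumably, the preview) intact.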

A third possible explanation is that the first preview was created by the GoogleBot, which doesn’t understand Javascript, and that the second was created by the Agent, which does. If that is the case, it highlights the fact that the technical details of how this feature works could impact your business. Are you ready for that?
