Is A/B Testing allowed by Google?
Am I going to get penalized in search results?
These questions seem to come up often, so I thought I’d sum up the latest information, both from Google’s own blog and the Internet’s collective wisdom.
The short answer to this question is yes, A/B Testing is perfectly acceptable to Google, as long as you keep a few things in mind.
The Risks – Duplicate Content and Cloaking
When thinking about A/B Testing, people are generally concerned about these two things.
The Duplicate Content Penalty
Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar.
The thinking here is that Google does not want to reward spammy sites that replicate content in an effort to boost rankings. That being said, Matt Cutts (Google’s seer of search) has said that duplicate content is actually rarely a source of penalty, and is often simply ignored by the engine. It is, however, an understandable concern for A/B testing, given that you may very well be splitting traffic between two versions of a landing page with very similar content. One of the things to keep in mind is that this is only relevant for pages on different URLs (which may not be the case when A/B testing – more on that later).
Cloaking
Cloaking is a potentially thornier issue for Google. Again from the Webmaster Tools Blog:
Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.
This underhanded technique is usually achieved by sniffing the ‘User-Agent’ header in the request to identify the search crawler bot, and serving different content accordingly. Rather menacingly, Google suggests that if they detect cloaking on your site you may be removed entirely from the Google index. Rand Fishkin rightly points out, however, that this isn’t a pure black-or-white distinction but has many shades of gray, and there are in fact some legitimate and useful reasons for cloaking.
The potential to incur a cloaking penalty makes some people nervous when A/B testing, because you’re frequently serving up different content to different users via JavaScript.
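To make the distinction concrete, here is a minimal sketch of the user-agent sniffing pattern that cloaking relies on (the helper name is mine; it simply checks for the "Googlebot" token that Google’s crawler sends). A safe A/B test never branches on this:

```javascript
// Illustration of the pattern behind cloaking -- NOT something to do.
// Googlebot identifies itself with a "Googlebot" token in its User-Agent.
function looksLikeGooglebot(userAgent) {
  return /Googlebot/i.test(userAgent || '');
}

// Cloaking would branch on it, serving the crawler special content:
// if (looksLikeGooglebot(req.headers['user-agent'])) serveOptimizedPage();
// else serveNormalPage();
```

As long as your traffic allocation treats crawlers and humans the same way, this concern does not apply to your test.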
Google’s Position on A/B Testing
Google has clearly fielded this question a lot, and has officially expressed their position regarding A/B, website and multivariate testing. To summarize, they recognize testing as a legitimate (non-spammy) practice, but do recommend some guidelines to stay safe in the search rankings:
- Use rel="canonical"
- Use 302s for redirects
- Only run the experiment “as long as necessary”
Playing it safe
Which of Google’s recommendations is relevant depends very much on the type of test you are running.
Single Page Tests
In a single page test you are running an experiment on a single page, where all variations are served from a single URL. Therefore duplicate content is not an issue. Furthermore, cloaking is not an issue because (hopefully) you’re not basing your traffic allocation on a search-engine-versus-human distinction.
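A single page test might look like the following sketch: the variation is chosen per visitor and the content is swapped in the browser, so the URL never changes. The hashing helper, element id, and cookie helper are all hypothetical names for illustration:

```javascript
// Deterministic bucketing: hash the visitor id so the same visitor
// always sees the same variation on repeat visits.
function assignVariant(visitorId, nVariants) {
  let h = 0;
  for (const ch of String(visitorId)) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0; // keep as unsigned 32-bit int
  }
  return h % nVariants;
}

// In the browser you would then swap content in place, e.g.:
// if (assignVariant(readVisitorCookie(), 2) === 1) {
//   document.getElementById('hero-headline').textContent = 'Variant B copy';
// }
```

Because both humans and crawlers get the same URL and the same allocation logic, neither duplicate content nor cloaking comes into play.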
Multi-Page Tests
With a multi-page test you’re running variations on separate URLs (for example if you’re testing completely different layouts of a landing page). So traffic is being split between www.example.com/landing-page-1 and www.example.com/landing-page-2. In this case the recommendation from Google is to use a 302 (temporary) or JavaScript-based redirect for the traffic allocation, rather than a 301 (permanent) redirect.
Additionally, Google recommends using the rel="canonical" attribute on all of the alternate URLs, indicating that the original is the “canonical” URL to be indexed by the search crawler. Note that this is not the same as noindex, and can be accomplished by adding the following in the <head> section of variation pages (landing-page-2 in our example):
<link rel="canonical" href="http://www.example.com/landing-page-1"/>
What does “as long as necessary” mean?
The last recommendation from Google is more ambiguous – “Only run the experiment as long as necessary”. The curious among you may be wondering what this vague statement actually means. In all honesty I’m not really sure myself. Personally I only run a test if there’s sufficient traffic to justify it in the first place, and either shut it off if it fails to reach a conclusion within my estimated timeframe or immediately after it does. If anyone has more specific guidance I’d love to hear about it in the comments.
So there you have it. A/B Testing remains an important part of conversion rate optimization, and if you follow the simple guidelines above you should get great results with Google’s full blessing.
Till next time, happy (and safe) optimizing.