
SEO Best Practices for Personalised and IP-Based Content


A client recently asked us whether they should show personalised content recommendations to users based on their region. For example, due to licensing restrictions, a video may not be available to users in one country. Presenting a user with a video they can’t actually watch isn’t a great user experience, so you might hide the video or replace it with another.

We said yes to their request, but naturally there was some apprehension around whether it could lead to a penalty from Google, because it may be considered ‘cloaking’. However, when personalisation and IP-based content are delivered properly, the opposite is true, and the move can have a positive effect on the user experience of your site. Before we get stuck in, here are our main takeaways on the subject.

Key Points

  • Content should not be hidden from users and shown to crawlers, or vice versa; this is cloaking and can be penalised.
  • Slight variations in your content based on the user’s IP that improve the user experience without drastically changing fundamental elements of the page (e.g. header structure, the majority of on-page copy, search intent) are not cloaking.
  • Google has confirmed that some IP-based content on a page, when applied correctly, is not cloaking and can improve user experience.
  • Elements of a page may vary between regions; so long as the content is the same for both users and crawlers within each region, this does not break Google’s guidelines.
  • Googlebot crawls the web from the US and will, in most cases, experience your content as a US user would. If it is unable to (e.g. due to IP blocking), it will crawl from another region. Bear this in mind when personalising elements of your content.

What is Cloaking? 

Cloaking refers to a ‘black hat’ SEO technique that involves showing a web crawler a URL, page, or section of content that is different to what an actual visitor sees when they visit a website. The technique is used to deceive the search engine into thinking the site includes relevant content and keywords in order to gain a higher rank in the search engine results pages (SERPs). 

There are a few different ways people use cloaking to improve their rankings:

  • IP-based cloaking – serving a significantly or entirely different version of a web page depending on the user’s IP address
  • User-agent cloaking – detecting the visitor’s user-agent string (browser, device, or bot) and serving different content and appearance accordingly
  • Referrer cloaking – hiding a visitor’s origin by altering the HTTP referer header
  • JavaScript cloaking – hiding a website’s source code from crawlers to give users and crawlers different versions of content
  • HTTP accept-language header cloaking – serving different website versions to different users based on the language preferences of their web browser

How Do Search Crawlers Work? 

Google isn’t automatically informed when a new page appears. Instead, a web crawler constantly scours the web to identify new pages or updated content. It does this by following links from known pages, or by reading a sitemap (a list of pages) that a website owner submits. Once a URL is discovered, Googlebot uses algorithmic processes to determine whether or not to crawl the page. If the page is crawled, the bot then analyses the content and, if it is deemed valuable, ranks it appropriately.
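As an illustration of the sitemap route mentioned above, a minimal sitemap is a short XML file listing your URLs. The URL and date below are placeholders, not real values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want Google to discover. -->
    <loc>https://example.com/some-page</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```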

It’s important to note that Googlebot generally crawls from the US, specifically California. This means Googlebot almost always views content as a US user would. As Google’s documentation on locale-adaptive pages explains:

‘If your site has locale-adaptive pages (that is, your site returns different content based on the perceived country or preferred language of the visitor), Google might not crawl, index, or rank all your content for different locales. This is because the default IP addresses of the Googlebot crawler appear to be based in the USA. In addition, the crawler sends HTTP requests without setting Accept-Language in the request header.’ 

To a web server, crawlers look much like human users, other than being automated. To identify themselves, they add a token such as ‘Googlebot’ to the request’s User-Agent header. However, nothing forces a visitor to announce itself honestly, so to catch sites that use a technique such as cloaking, search engines can also crawl without identifying themselves, appearing as ordinary users. When suspicious behaviour is identified, sites can be penalised or banned completely.
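The flip side of this is that anyone can put ‘Googlebot’ in a User-Agent header, so a site that varies content by visitor should verify crawlers rather than trust the header. As a minimal sketch of Google’s documented approach (a reverse DNS lookup on the requesting IP, checked against `googlebot.com`/`google.com`, then a forward lookup to confirm), with function names of our own invention:

```python
import re
import socket

# Google's crawler hostnames end in googlebot.com or google.com.
GOOGLE_HOST_PATTERN = re.compile(r"\.(googlebot|google)\.com$")

def is_google_hostname(hostname: str) -> bool:
    """Return True if a reverse-DNS hostname belongs to Google's crawlers."""
    return bool(GOOGLE_HOST_PATTERN.search(hostname.rstrip(".")))

def verify_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-resolve
    to confirm the hostname maps back to the same IP (this guards
    against spoofed PTR records)."""
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
        if not is_google_hostname(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False
```

The forward-confirmation step matters: a spoofer controls the reverse record for their own IP, but cannot make `googlebot.com` resolve back to it.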

How Do I Make Sure that Providing Personalised Content isn’t Seen as Cloaking? 

Whilst cloaking is a practice that goes against search engine guidelines, personalisation is a beneficial addition that will enhance a visitor’s experience on your site – provided you do it correctly. Here’s how to do just that… 

Don’t Change the Entire Page or URL 

As previously mentioned, bots generally view content as a California user would, so if an entire page’s content is based on an IP address, the crawler will only see content related to California and will be unaware that you offer content on a multi-regional basis. This can affect rankings in other locations, as you will struggle to rank anywhere other than California.

During a live stream, John Mueller from Google Search Central explained how you can avoid this issue: 

Some sites try to recognise where the user is and display something additional that’s specific to that location. From our point of view that’s perfectly fine…the general advice is to have a significant part of your site that’s general, that’s valid for users everywhere and then some part of your site that is specific to the user’s location.

His advice is to ensure there is a significant portion of content on the page that is relevant to users regardless of their location, as well as offering a personalised element based on the user’s IP address. In doing this, you can show the crawler that your content caters to users across different regions. 

The same advice goes for personalised content. If you’re showing specific content based on a user’s last visit or recent purchases, be sure to include a large portion of general content on the page as well. 

Avoid Geo-Location Redirects

When set up, a geo-location redirect will automatically redirect a user based on their location through an IP address, cookies, or browser language setting. 

The main reason this is an issue is that a redirect also works on a crawler. As the bots typically crawl from the US, a geo-location redirect will send them to the US version every time, making it difficult for Googlebot to crawl the non-US versions of your site.

If the bot is redirected, it may be unable to crawl and index pages or versions of your pages – leading to reduced visibility in the search results. Not only this, but without following the right measures for canonical and hreflang tags, you can end up with duplicate content which can affect your SEO. 
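Rather than redirecting, a common alternative is to keep every regional URL crawlable and annotate the alternatives with hreflang tags, including an x-default fallback for visitors who match no region. A sketch, with example.com URLs standing in for real ones:

```html
<!-- In the <head> of every regional version, list all alternatives,
     including the page itself, plus an x-default fallback. -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/page" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/page" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page" />
```

Each regional page then stands on its own for indexing, and the search engine, not a redirect, matches users to the right version.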

Successful IP/Regional Based Personalisation  

Netflix 

There are plenty of great examples of websites and services that utilise personalisation – Netflix being a notable one as it uses both personalisation and IP-based content perfectly. 

Netflix Originals aside, Netflix doesn’t hold global rights to all of its titles, so instead of presenting you with a show you can’t actually watch, it restricts which titles it shows you. Not only does this improve user experience, but it also ensures Netflix avoids complications due to regional licensing restrictions.

Alongside this, Netflix also personalises a good chunk of your feed to bring you content that you are likely to enjoy based on your history and watchlist. 

Asos

Ecommerce websites typically benefit from elements of personalisation as you can tailor the content to suit your user. In doing this, you can put the right products in front of the right audience. Websites like Asos use features such as ‘recommended filters’. These display products you’re most likely to be interested in based on your browsing or purchasing history. 

Should You Consider Personalised or IP-Based Content?

Whether or not you should use personalised content on your site comes down to whether it genuinely benefits your users.

In his live stream, John Mueller commented “If you feel providing more local information to users adds significant value to your pages, then by all means, do that.”

If the addition of localised or personalised content is beneficial to your users, then it’s a great way to improve user experience and further appease Google. If it’s not going to add much value to your site or for your users, then there’s little point. 

Want a Hand Adding Personalised or IP-Based Content to Your Site?

If you think personalised content would be useful for your site, get in touch with one of our experts today to discuss how we can help.

Author

  • Marcus has spent his career growing the organic search visibility of both large organisations and SMEs. He specialises in technical SEO but he’s obsessed with curating strategies that leverage expertise and unlock potential.
