Do you have a local SEO project that involves creating multiple NAP listings in Google My Business or local landing pages for localised organic search? If so, you will want to know which keywords are potentially the most profitable to target. That’s exactly what you’ll learn by reading this blog post.
It is important to target densely populated areas to gain the most reach for the least amount of work. In the UK there are 69 cities, so that’s potentially up to 69 local pages that would need to be created. But it’s well worth it if you consider that, according to the Guardian, more than 91% of the population will live in cities by 2020.
But if you’re thinking of creating 69 pages to target one keyword, you will want to be very certain that it is for a keyword that lots of people actually search for, and that the keyword triggers localisation.
You can get search volumes from Keyword Planner, but how can you find out if they trigger localisation? Only certain keywords trigger localisation in the search results, so it is important to start with as wide a selection of relevant keywords as possible. Once you have bulk-checked all your keywords and identified the ones that trigger localisation, you can then prioritise your local targeting around those with the best search volume.
For demonstration purposes, we’ll pretend we are doing SEO for a chain of hospitals.
We have noticed that the search term ‘cosmetic surgery’ triggers localisation, while ‘knee surgery’ does not. As this shows, it is not always obvious in advance which keywords will trigger localisation.
There is no foolproof way to predict localisation 100%, particularly for high-volume keywords that do not mention the location name. This is because much of the local algorithm is down to user metrics. Basically, Google monitors how users behave on a query-by-query basis and uses this behaviour data to decide what users want in each case.
Ultimately, what you need is a big list of keywords, such as ‘cosmetic surgery’, that trigger localisation, so that you can build local landing pages and GMB profiles to capture the traffic from these search terms.
Let’s get started
The method for finding local keywords starts with a keyword list. There are numerous ways to get a keyword list:
- Use your keyword research (normally the first job done by SEOs)
- Export keywords from SEM Rush or similar tool (although doing this will result in lots of junk keywords as well as some real gems)
- Crawl your website and grab keywords from the H1, URL slug or part of the page title (you’ll probably need to use excel’s Text to Columns function to separate keywords from within URLs and/or page titles)
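If you take the crawl route, the slug-to-keyword step can also be scripted rather than done with Text to Columns. Here is a minimal Python sketch (the example URLs and helper name are hypothetical) that pulls the last path segment out of each URL and turns hyphens and underscores back into spaces:

```python
from urllib.parse import urlparse

def slug_to_keyword(url):
    """Take the last path segment of a URL and convert it to a keyword."""
    slug = urlparse(url).path.rstrip("/").split("/")[-1]
    return slug.replace("-", " ").replace("_", " ").strip()

# Hypothetical crawled URLs for illustration
urls = [
    "https://example.com/services/cosmetic-surgery/",
    "https://example.com/treatments/knee_surgery",
]
keywords = [slug_to_keyword(u) for u in urls]
print(keywords)  # ['cosmetic surgery', 'knee surgery']
```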
Now that you’ve got a huge keyword list, it’s time to bulk check which ones trigger localisation.
The first step in the process is to turn your keywords into Google search URLs.
- Put all your keywords into Excel in a big list in column B. Do a ‘find and replace’ to replace all blank spaces (between the words) with plus ‘+’ symbols.
- To the left of each keyword, in column A put: https://www.google.co.uk/search?q=
- To the right of each keyword in column C put: &ie=utf-8&oe=utf-8&client=firefox-b&gfe_rd=cr&gws_rd=cr
- Finally in column D concatenate the three components together. For example, =CONCATENATE(A2,B2,C2)
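If you prefer to script this step, the same three-part concatenation can be done in a few lines of Python (the keyword list here is just a placeholder, but the prefix and suffix match the Excel columns above):

```python
# The same components used in columns A and C above
PREFIX = "https://www.google.co.uk/search?q="
SUFFIX = "&ie=utf-8&oe=utf-8&client=firefox-b&gfe_rd=cr&gws_rd=cr"

def keyword_to_url(keyword):
    """Replace spaces with '+' and wrap the keyword in the Google search URL."""
    return PREFIX + keyword.strip().replace(" ", "+") + SUFFIX

keywords = ["cosmetic surgery", "knee surgery"]  # your keyword list here
urls = [keyword_to_url(k) for k in keywords]
print(urls[0])
# https://www.google.co.uk/search?q=cosmetic+surgery&ie=utf-8&oe=utf-8&client=firefox-b&gfe_rd=cr&gws_rd=cr
```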
You should now have a full list of Google search URLs, one for each keyword in your list. Next, we need some software to visit each URL and check whether it triggers localisation.

To do this, we run the URLs through Screaming Frog to identify which ones trigger a map.
Firstly, set the Screaming Frog user agent string to Firefox so that it matches the ‘client’ specified in our column C URL. This can be achieved by setting a custom user agent in Screaming Frog and pasting in the following: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1
A custom user agent can be set in Configuration>HTTP Header>User-Agent
The next thing to do is to set up a Search within Screaming Frog. You can do this by going to: Configuration>Custom>Search
In this case, what we want to search for is a web page feature that only appears when a map is present.
During the process, I chose to search for the ‘More places’ link that appears at the bottom left of the local pack box
So I set my search for: >More places
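Under the hood, the check Screaming Frog performs here amounts to fetching each URL with the Firefox user agent and looking for that marker string in the returned HTML. A hedged Python sketch of the same idea (not a replacement for the crawl, and the same rate-limit caution applies):

```python
import urllib.request

# The marker and user agent from the steps above
LOCAL_PACK_MARKER = ">More places"
FIREFOX_UA = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1"

def has_local_pack(html):
    """True if the page HTML contains the local-pack marker."""
    return LOCAL_PACK_MARKER in html

def fetch(url):
    """Fetch a URL with the Firefox user agent, returning the HTML as text."""
    req = urllib.request.Request(url, headers={"User-Agent": FIREFOX_UA})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Example usage (network call, so run sparingly):
# print(has_local_pack(fetch("https://www.google.co.uk/search?q=cosmetic+surgery")))
```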
It’s worth noting that Google sometimes changes things on their page. If the above search value doesn’t work, you can use inspect element to find some other boilerplate element to search for.
Now we should set the Screaming Frog crawl speed quite modestly so it doesn’t trigger a captcha. I set mine to 4 threads and 0.2 URI/s and it worked very well. If you have a lot of URLs, you might want to leave your crawl running overnight. I am not sure what the threshold is for the captcha to be triggered.
Now you should select List Mode in Screaming Frog and paste all your URLs from column D into Screaming Frog, to be crawled.
You should be able to see all the URLs returning a 200 response code as they are being crawled. If they are not, you may need to make some adjustments to the Google URL.
While it is crawling, spot-check your custom tab to see that what is being flagged does, in fact, have a map present. If it doesn’t, you may need to make some adjustment to the search value in Screaming Frog.
Once the crawl is complete, you can export the custom tab. This is the list of all the URLs that trigger localisation. Now you can VLOOKUP them against your initial list, or do a ‘find and replace’ to turn the Google URLs back into keywords.
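That ‘find and replace’ can also be scripted; a small sketch, assuming the URLs were built with the exact prefix and suffix used earlier:

```python
# The fixed components the URLs were built from
PREFIX = "https://www.google.co.uk/search?q="
SUFFIX = "&ie=utf-8&oe=utf-8&client=firefox-b&gfe_rd=cr&gws_rd=cr"

def url_to_keyword(url):
    """Strip the fixed prefix/suffix and turn '+' back into spaces."""
    core = url.replace(PREFIX, "").replace(SUFFIX, "")
    return core.replace("+", " ")

print(url_to_keyword(PREFIX + "cosmetic+surgery" + SUFFIX))  # cosmetic surgery
```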
You can dump this keyword list into Keyword Keg or Keyword Planner to get search volumes and then add these into your Excel sheet to help you prioritise which local pages are the most valuable. Alternatively, you can do some additional keyword research around the keyword list to ensure that your pages will be targeting the most popular version of each keyword and to help maximise the reach of the pages.