Google uses over 200 factors to rank web pages within search engine results pages (SERPs). As machine learning (RankBrain) is progressively added into Google’s algorithm, organic search is increasingly becoming a ‘black-box’ model, which leads us to investigate whether organic click-through rate (CTR) is a potential ranking factor or not.
Does having a higher CTR boost SEO?
This article will cover all of those points and many more.
Organic click-through rate is the ratio of clicks on any of your Google results (excluding Google Ads) to the number of times those results were shown. The primary method for analysing your organic CTRs across pages and queries is Google Search Console.
Generally, a web page with an average position closer to 1 will have a higher CTR because people are more likely to click on the top search results.
The question is, do these click-throughs consolidate or strengthen the page’s ranking? Do higher CTRs help results further down the SERP to jostle for better rankings?
Click-through rate (CTR) is usually expressed as a simple percentage out of 100. The formula for calculating it is:

CTR = (clicks ÷ impressions) × 100
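As a minimal sketch, the calculation can be expressed in Python (the clicks/impressions inputs mirror the columns Google Search Console exports; the function name is illustrative, not an official API):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Organic click-through rate as a percentage (0-100)."""
    if impressions == 0:
        return 0.0  # no impressions: CTR is undefined, treat as 0
    return clicks / impressions * 100

# A result shown 2,500 times in the SERPs and clicked 180 times:
print(ctr(180, 2500))  # 7.2
```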
CTR can be taken as an average per page, or it can be segmented by each phrase or keyword that the page ranks for. Average page CTRs provide an aggregated result and are generally less useful when you’re looking to evaluate the performance of a specific keyword or group of keywords.
There’s no straightforward answer to this, as it depends on your ranking position and the search query. Position 1 results may have CTRs of ~40% or higher, but as you can see from the previous diagram, CTR quickly declines to under 5% beyond page 1.
An ideal method for judging whether your page’s CTR is ‘good’ is to compare it to other pages ranking in a similar position that contain a comparable number of SERP features.
Assume that click-through rate data follows a normal statistical distribution. For example, if we were to look at 1,000 web pages at a similar position of 5, we could plot these pages on a graph like so:

We can then find all of the poorly performing pages at position 5 with a simple calculation: flag any page whose CTR falls below the mean minus two standard deviations (µ − 2σ).
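A hedged sketch of that screen in Python, using NumPy and a simulated (made-up) sample of CTRs for pages ranking around position 5:

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated CTRs (%) for 1,000 pages ranking around position 5
ctrs = rng.normal(loc=6.0, scale=1.5, size=1000)

mu, sigma = ctrs.mean(), ctrs.std()
threshold = mu - 2 * sigma  # the µ − 2σ cut-off

# Pages whose CTR falls below the cut-off are flagged as under-performers
poor_performers = ctrs[ctrs < threshold]
print(f"Cut-off: {threshold:.2f}% -> {poor_performers.size} under-performing pages")
```

Under a normal distribution, roughly 2.3% of pages fall below µ − 2σ, so you would expect around 23 flags from a sample of 1,000.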
Although this might be a good start, we still need to take into account that organic click-through rate can be complicated and challenging to model, as Google’s results often contain knowledge panels, paid advertisements, People Also Ask boxes and many more on-page SERP elements.
So what are some of the common SERP features that can influence CTR?
Google’s universal search is a way for Google to blend search results from vertical search engines such as Google Images or Google News into web search listings. As these universal results vary from SERP to SERP it can cause CTR rates to fluctuate with the appearance or disappearance of universal results.
As featured snippets and People Also Ask boxes generally display above the first organic result, in ‘position 0’, then unless you own these features they may be nudging your result down the page and diverting your traffic.
Branded keywords often have a higher click-through rate than non-branded terms. This is to be expected, since the user is looking for a particular brand, and these searches enable brands to easily claim first position for their own name.
In contrast, unbranded search terms’ click-through rate curves are generally flatter across all of the rank positions.
Search intent plays a critical part in influencing your organic click through rate and provides an additional reason for segmenting your keywords into different intent categories according to the marketing funnel.
Informational Search: informational or comparison searches have a higher probability of triggering a knowledge graph panel or featured snippet to provide searchers with immediate on-SERP answers.
Transactional Search: user journeys for commercial-intent searches typically involve a lot of comparison shopping or research so that the user can choose between several products/brands. These types of searches are therefore more likely to show traditional links.
This is a chicken-and-egg problem: which came first, CTR or rankings? The two variables are likely to be codependent.
CTR is primarily influenced by rank position, whether or not users believe your page will answer their query and how your page result looks in the SERP. Optimising every section of your SERP result is a crucial, iterative process.
<title> Beautiful Bananas by Elizabeth Laird - Fantastic Fiction</title>
Main page titles used to be restricted by character count, but now they’re restricted by pixel length. You’re allowed approximately 70 characters, though this may be fewer if your title includes many wide letters such as ‘m’ or ‘w’.
<meta name="description" content="Beatrice is carrying a beautiful bunch of bananas on her head to take to her granddad. She sets off on the jungle path but unfortunately a giraffe accidentally" />
Meta descriptions are more detailed segments of text positioned below the title. They can be any length, but to avoid them being cut off or truncated it’s best to keep them below 160 characters.
Meta descriptions should be snappy and descriptive: they need to provide as many clues as possible that the answer to the user’s query is on this page.
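As a rough pre-flight check on those limits (character counts are only a proxy, since Google truncates titles by pixel width rather than characters), a sketch like the following could flag risky snippets — the function name and limits are illustrative assumptions:

```python
def check_snippet(title: str, description: str,
                  title_limit: int = 70, desc_limit: int = 160) -> list[str]:
    """Return warnings for a title/description pair that risks SERP truncation."""
    warnings = []
    if len(title) > title_limit:
        warnings.append(f"Title is {len(title)} chars; may be truncated")
    if len(description) > desc_limit:
        warnings.append(f"Description is {len(description)} chars; may be truncated")
    return warnings

# The example title from above is well within limits, so no warnings:
print(check_snippet(
    "Beautiful Bananas by Elizabeth Laird - Fantastic Fiction",
    "Beatrice is carrying a beautiful bunch of bananas on her head.",
))  # []
```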
Rich snippets are a relatively new addition to the SERP and can include images, which make a search result look far more appealing, thus increasing CTR. You can prompt Google’s SERP algorithm to create rich snippets for your page by using JSON-LD schema markup. This structured data explicitly provides Google with the right information, in the right format, to build rich snippets from your page.
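For example, an article page might embed a JSON-LD block like the following (all values here are placeholders; see schema.org for the properties Google actually reads for each rich result type):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Is Organic CTR a Ranking Factor?",
  "image": "https://example.com/images/ctr-chart.png",
  "datePublished": "2020-01-15",
  "dateModified": "2020-06-01",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```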
Google can display links to other pages from a site in a SERP. This can help you claim more SERP real estate, especially for your brand queries.
You can help prompt Google’s algorithm to display sitelinks by using a table-of-contents plugin and then linking to different portions of that page.
Here are 5 marketing psychology hacks that you can use to boost your page’s organic CTR.
Firstly, the freshness reference: this refers to how recently your page was updated. If the page mentions that it was last updated this year, users may be more inclined to click on it because the content appears ‘fresher’ in comparison to posts that were last updated in 2014.
The tao-schedule-update plugin allows you to make incremental, scheduled changes to your posts or pages so that Google keeps applying a ‘freshness’ factor to them.

Additionally, it will modify the date-modified text in your SERP result, showing users that you take the time to keep your content updated.
The low price reference: if you know that you’re a price leader in a market, or you’re competing for that position, it could be worth mentioning the price within the title tag.
If you’re able to offer a product or service cheaper than anyone else, and customers are shopping on price alone, this could be a way for you to earn the click on your title tag instead of somebody else’s result.
The volume reference showcases how much time and investment you put into a piece of content. For example, ‘105 tips on how to boost your SEO rankings’ is potentially a much more in-depth and well-researched content piece than ‘7 growth hacks to boost your SEO rankings’.
However, it’s important to note that although content volume matters, you need to keep waffle and irrelevant points to a minimum. #saylessmeanmore.
The speed reference shows that you’re interested in providing fast, actionable solutions for potential searchers. For example, this could relate to a fast 1–2 day shipping offer, or, in the case of loans, the fact that you might give a fully approved loan within 2 to 3 working days.
Finally, the brand reference: if you have a recognised brand, including your company’s name in the title tag can often drive more clicks, as brand-loyal consumers will pick your result over a different title tag due to brand affinity.
If CTR were a ranking factor, it would be incredibly easy to fake. All you’d need to do is generate clicks from Google through to your site, which could easily be carried out by bots or even by humans.
Bots are now viewed as limited in their capability to influence any Google ranking metric, as Google strongly encourages users to be signed in to Google accounts when browsing the web. This, alongside other techniques that enable Google to figure out who is human and who isn’t, means the ranking bots sold on blackhat forums are now generally considered a redundant tactic.
The potential problems with this collected data might include:
At Sempioneer, we’re looking to build a large Google Search Console data warehouse.
Partly this will be for A/B testing and other data-science SEO tools. However, by combining a significant number of GSC endpoints with web scraping of Google SERPs (to account for rich snippet features), we would be able to predict where we should expect to see decreased CTRs.
Imagine being able to know if your web page had an un-optimised CTR (%) given that:
All of these data features can already be accumulated via a mixture of Google Search Console data and DataForSEO.com, and fed as inputs directly into machine learning / deep learning models.
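As a hedged illustration of that modelling step (the feature names and training values below are invented; a real pipeline would pull them from GSC and the SERP scrape described above), a simple least-squares regression can relate SERP features to expected CTR:

```python
import numpy as np

# Hypothetical training rows: [rank_position, has_featured_snippet, n_ads]
X = np.array([
    [1, 0, 0],
    [1, 1, 2],
    [3, 0, 1],
    [5, 1, 3],
    [8, 0, 0],
], dtype=float)
y = np.array([42.0, 18.0, 11.0, 3.5, 2.0])  # observed CTR (%) per row

# Ordinary least squares with an intercept column appended
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_ctr(rank: float, snippet: int, ads: int) -> float:
    """Model's expected CTR (%) for a page with these SERP features."""
    return float(np.array([rank, snippet, ads, 1.0]) @ coef)

print(f"Expected CTR at position 3, no snippet, 1 ad: {predict_ctr(3, 0, 1):.1f}%")
```

Pages whose observed CTR sits well below the model’s expectation are the ones worth re-optimising first.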