How many of your top-ranked keywords are underperforming? If you are like most companies, you're not tracking it, so you may not know.
The goal of most SEOs is to achieve top rankings and high conversions. Unfortunately, once we get the top rankings, little is done to maximize them or even make sure the correct page is ranking.
Many SEOs have stopped doing ranking reports due to personalization or the inability to show any real performance improvement from them. I can understand both of these concerns, which is why I developed a report in the application called “Top Ranking Keywords that Suck” – well, that is the working title until we can figure out what we want to call it. We originally started with the tongue twister you see below: “Top 5 Ranking Keywords with Less than 5% Share of Clicks”.
When I have done SEO for clients, I wanted to focus on the immediate low-hanging fruit that drives incremental revenue. I have talked at conferences in the past about keywords that were ranking where simply fixing the snippet resulted not only in increased traffic and revenue but, in one case, a PPC click cost reduction of $26k in one month.
The following is the use case that I created for this function in the application.
Tool Use Case: Identify keywords that are currently ranking well (top 3 or 5 positions) but getting a low share of clicks (less than 5% of the total search volume or less than the average click rate for its position). This would allow the Search Marketer to quickly identify underperforming words to review snippets for relevance as well as messages and offers.
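As a rough sketch of the filter this use case describes – note the field names (`rank`, `clicks`, `search_demand`) are illustrative assumptions, not the tool's actual schema:

```python
# Sketch: flag keywords ranking in the top positions but capturing
# under 5% of total search volume. Field names are assumptions.

def click_share(clicks, search_demand):
    """Share of total search volume this keyword's listing captured."""
    return clicks / search_demand if search_demand else 0.0

def underperformers(keywords, max_rank=3, min_share=0.05):
    """Return keywords that rank well but get a low share of clicks."""
    return [
        kw for kw in keywords
        if kw["rank"] <= max_rank
        and click_share(kw["clicks"], kw["search_demand"]) < min_share
    ]

# Illustrative data (figures echo the demo case later in this post)
keywords = [
    {"phrase": "cheap tickets", "rank": 2, "clicks": 6492,  "search_demand": 249700},
    {"phrase": "flight deals",  "rank": 1, "clicks": 15000, "search_demand": 200000},
]
flagged = underperformers(keywords)
# "cheap tickets" is flagged: 6,492 / 249,700 ≈ 2.6% share at rank #2
```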
In our first generation we simply called this out on the dashboard and allowed you to click in to see the details and sort the data. The default view is performance sorted by highest CPC, but you can sort as below – by highest Google demand.
The new version of this report we are calling, as mentioned above, “Top Ranked Keywords that Suck” – in this example, this client has a fairly experienced and robust SEO team. In our analysis across a few beta clients we are finding similar patterns:
In this example, the client has 1.4 million keywords in the database. Of those, 101 currently rank in the top 3 positions of Google but are getting less than a 5% click rate. If they could increase the clicks from Google for those keywords to 5%, that would be an increase of 103,678 visits a week.
[Why 5%? We use a simple rationale: 10 organic and 10 paid listings = 20 chances of being clicked. If a searcher clicks just one listing, we have a 5% chance of being the one clicked. No complex formula of paid and organic, national brand, message, or universal search results – just simple probability.]
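The opportunity math is equally simple: for each flagged keyword, the weekly uplift is what a 5% share would yield minus what it gets today. A sketch, using illustrative numbers rather than the client's actual data:

```python
# Sketch: weekly visit uplift if an underperformer reached a 5% click share.
# Demand and click figures below are illustrative.

TARGET_SHARE = 0.05  # 1 click in 20 listings (10 organic + 10 paid)

def weekly_uplift(clicks, search_demand, target=TARGET_SHARE):
    """Additional visits per week if the keyword hit the target share."""
    return max(0, int(search_demand * target) - clicks)

# e.g. a keyword at rank #2 with 6,492 clicks against ~249,700 searches
print(weekly_uplift(6492, 249700))  # 5,993 extra visits for this one keyword
```

Summing this uplift across all 101 flagged keywords is where a figure like the 103,678 weekly visits above comes from.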
Actionability of the Data
I have to say that whenever I do a prototype for a new function, I am haunted by the voice of Avinash Kaushik – “don't just puke data – offer actionable insights” – from his “Actionable Dashboards” presentations. This was no different. How can someone process 101 results? We are still working on ways to mine just the uber nuggets of data – see “Crowdsource Questions” below.
In our tool we can sort these by Google Search Demand, Revenue Per Visit, Conversion Rate, or any other metric to identify which of the 101 we want to fix first. Typically the user chooses those with the highest demand and the lowest click rate. The typical user of the tool is trying to identify 5 or 10 keywords a week to improve.
Action 1: Sort by highest Google Search Demand – this helps us find the keywords with the greatest opportunity overall.
Action 2: Sort by revenue or Revenue Per Visit (RPV) to identify keywords that generate the most revenue per visit. This is a good variable since the goal of this view is to find words not getting their share of visits.
The following are key insights when we do this:
.acme is a brand-misspelling keyword, and there are clearly not 695k searches for it, but Google associates it with the brand name. We now deal with this by filtering out keywords with keyword type = Brand Misspelling.
Cheap Tickets – in our demo case, they are ranking #2 in Google and only getting 6,492 clicks, or a 2.6% share. A 5% share would be 12,500 clicks, or 52% of our goal opportunity. Even a 1% increase in share would add roughly 2,500 visits.
Note: In this case, if we average this client's actual click rate for keywords with an organic position of 1, 2, or 3 and no paid search influence, it is 4.32% – not far off the 5%. We can assume the actual average would be higher still if we could count the 103k clicks we should be getting at 5%.
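The 4.32% benchmark in the note is just a mean over a filtered set of keywords. A sketch of that calculation, assuming hypothetical fields `rank`, `has_paid_ad`, `clicks`, and `search_demand`:

```python
# Sketch: average actual click rate for keywords at organic positions 1-3
# with no paid search running. Field names and data are illustrative.

def avg_click_rate(keywords, positions=(1, 2, 3)):
    """Mean click share for organic-only keywords at the given positions."""
    rates = [
        kw["clicks"] / kw["search_demand"]
        for kw in keywords
        if kw["rank"] in positions
        and not kw["has_paid_ad"]
        and kw["search_demand"]
    ]
    return sum(rates) / len(rates) if rates else 0.0

sample = [
    {"rank": 1, "clicks": 500, "search_demand": 10000, "has_paid_ad": False},  # 5%
    {"rank": 2, "clicks": 300, "search_demand": 10000, "has_paid_ad": False},  # 3%
    {"rank": 3, "clicks": 900, "search_demand": 10000, "has_paid_ad": True},   # excluded
]
print(avg_click_rate(sample))  # mean of 5% and 3%
```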
In the next generation of the application, for each keyword phrase we will return the snippets for the top 5 positions so the user can review them and apply a “reason for non-performance.” In doing beta tests we started to understand why some of the words are performing and others are not.
As with everything in search, there are never absolutes. In working through the unique click percentages for a company like IBM, we may need more robust scoring moving forward. The following were interesting findings from an exercise with Lee Moore, Global Search & Syndication Manager at IBM. He wants the ability to flag keywords that are not performing for one of the following reasons and re-weight the variables.
- Context of the listing – for example, there are x searches for SOA, and IBM, while ranking #4, is only getting about 1% of those searchers. Looking closely, we can assume it is due to “context.”
- #1 ranking listing is “Society of Actuaries” which has been around a while
- #2 is Wikipedia for the technical term Service Oriented Architecture
- #3 is the popular TV show “Sons of Anarchy” which is abbreviated SOA and most of the cast and followers in social media refer to the show that way.
- #4 IBM Service Oriented Architecture (SOA) solution
- #5 is Oracle which is a competitive placement.
So we can assume a large share of the searchers' intent and clicks is related to “Actuaries” and Sons of Anarchy and is not expected to go to IBM. Since the term is also generic, a site like Wikipedia may give a better “What is SOA” answer than IBM or Oracle for searchers looking for the technical meaning of the acronym.
- Message – not related to this IBM case, but if the query and the searcher's intent are price-based and the price is out of line, this will cause a reduction in clicks. If the message is branding-focused and they want to buy, that may also reduce clicks.
- Paid Search – whether or not there is a paid search ad for the company may impact click rate. Many studies show that having both a paid and an organic listing leads to increased brand awareness, or an assumption that the company is bigger, resulting in a click on one or both.
- Branded Keyword – branded keywords tend to get clicked more than non-branded ones. For example, Dell Laptops vs. Laptop Computers. In this case, listings 1 and 3 are branded for “other” brands and not for Service Oriented Architecture. It can be a battle of brands between IBM and Oracle.
- Gibberish Snippet – already addressed, but added for completeness: if the snippet is bad for IBM or any listing, the potential for clicks drops exponentially.
Based on these factors, we need to build in some sort of automatic or manual override of the target percentage, based on an analysis of the listings, so that it is not counted against the company going forward. We can also use the same analysis to help identify the need for and priority of corrective action.
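One way to sketch that flag-and-reweight idea: each reviewer-assigned reason scales the 5% target down for that keyword. The reason codes and discount factors below are pure assumptions for illustration, not the tool's actual model:

```python
# Sketch: discount a keyword's target click share when a reviewer flags a
# reason for non-performance. Reason codes and discounts are assumptions.

DISCOUNTS = {
    "context": 0.5,        # e.g. SOA: most intent belongs to other meanings
    "branded_other": 0.7,  # competing listings are other brands
    "message": 0.9,        # offer or price out of line with intent
    "gibberish_snippet": 1.0,  # fixable by us, so no discount to the target
}

def adjusted_target_share(base=0.05, reasons=()):
    """Scale the 5% target down by each flagged reason's discount."""
    target = base
    for reason in reasons:
        target *= DISCOUNTS.get(reason, 1.0)
    return target

# IBM's SOA example: mostly "context" plus other-brand listings
print(round(adjusted_target_share(reasons=["context", "branded_other"]), 4))
```

With an adjusted target, a keyword like SOA stops being scored as a failure against the flat 5% while still surfacing real snippet problems.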
Crowdsource Questions: Post any suggestions in the comments section below:
- Should we restrict the rank to top 3 or top 5 positions?
- Is a search demand of 500 the right cap? If we lower it, how do we filter out the low-demand keywords en masse?
- Since these keywords rank high, they will draw clicks away from paid search. Due to bad snippets, there should be increased clicks on paid. I am not sure how to calculate it yet, but I am thinking of a “Cost of Not Clicking” calculation that would show the delta to 5% being made up with paid clicks.
- Should we offer facets to allow the user to choose which position, search demand, and click rate to use?
- Should these filter variables be user-editable in an admin area?