Keyword Prominence – or why your on-page SEO software is lying to you!
If you’ve never heard of keyword prominence before, it wouldn’t surprise me, because I just invented the term. You’ll be hearing a lot more about it though, because keyword prominence is a more modern take on the old idea of keyword density, the mainstay of on-page search engine optimisation. SEO, as the cool kids call it, can refer to off-page optimisation too, but in this article, we’re only interested in optimisation that happens on-page, the process of tweaking website page content in an attempt to make it more attractive to search engines, such as Google. Keyword prominence is vital to on-page SEO, because it ties together all the strands that people have traditionally treated as independent, creating an overall metric that encompasses all important aspects of page-level optimisation.
Back in the day, SEO ‘experts’ tried to understand how to programmatically determine whether or not a page was of high quality, because they assumed that was how Google and the rest ranked the entries in their indexes. If they could figure out the details of that mechanism, they’d be able to manipulate the rankings and steal vast amounts of unearned free traffic for their websites. To that end, the SEO community spent a lot of collective time and energy trying to ‘reverse engineer’ the ranking algorithms in an attempt to pin down the things Google and Bing actually liked. That proved impossible, because there were simply too many constantly shifting unknowns, but what they did discover were some fairly basic things all the engines absolutely hate. The first and most important of these was that ‘keyword stuffing’ is a really bad idea, which is not exactly a surprise.

Having figured out this apparently fundamental law of the Internet, the SEO crowd then tried to formalise it with the concept of ‘keyword density’, which is simply the number of times a particular word or phrase occurs in a piece of text, divided by the total number of words in that text. A 100-word article where 50 of those words were ‘credit’ would have a density of 50% for the word ‘credit’, for example. It’s not exactly rocket science, but it does provide a foundation for the concept of keyword prominence. It became widely accepted that the maximum density which wouldn’t instantly trigger index penalties was somewhere south of 5%; any more than that, and experiments showed it was almost impossible to get a page to rank for the ‘stuffed’ phrase. In a similar way, the minimum density needed for a search engine to detect the topic confidently was generally agreed to be somewhere north of 2% or so.
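To make the arithmetic concrete, here’s a minimal sketch of the classic density calculation. The function name and the punctuation-stripping details are mine, not from any particular SEO tool:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` divided by total word count, as a percentage."""
    words = [w.strip(".,!?;:'\"").lower() for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# The 100-word example from the text: 50 of the words are 'credit'.
article = "credit " * 50 + "word " * 50
print(keyword_density(article, "credit"))  # 50.0
```

By this measure, the folk wisdom above says you’d want the result to land somewhere between roughly 2% and 5% for your target phrase.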
Other conclusions began to be drawn too; self-proclaimed leaders in the SEO community declared that the engines liked to see key words in bold, or italic, or in header tags. The list of things you had to do and must never do on a page grew to the point where expensive little bits of amusingly simplistic software like ‘SEOpressor’ popped up in order to make the process easier. It’s fair to say that if you stuff a page with keywords, the search engines are unlikely to take you seriously, so this approach has merit in the sense of alerting you to a problem that would prevent you getting into the indexes in the first place. However, as you will have noticed if you’ve been at this for any length of time, it no longer seems to work reliably. Even well-written content with sensible keyword densities can be almost impossible to rank. So what is going wrong? What has changed?
Keyword Prominence is the answer!
Search engines want, above all, to please their own customers. A happy surfer comes back to surf again, and each page view is an opportunity to make money from adverts. What makes a happy surfer? You have to give her some high-quality content that is a good fit with what she is looking for. But how do you determine mechanically whether a piece of content is high quality or not? It’s subjective, right? The most obvious way is to let your own surfers effectively tell you. You track the bounce rates, in other words. So the question really becomes, ‘how does a human decide a page is good?’
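For anyone unfamiliar with the metric, the bounce rate is just the share of visits that view a single page and leave. A quick sketch (the function name is mine, and real analytics suites compute this with more nuance):

```python
def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Share of sessions that viewed only one page, as a percentage."""
    if total_sessions == 0:
        return 0.0
    return 100.0 * single_page_sessions / total_sessions

# 340 of 500 visitors hit the back button without clicking further.
print(bounce_rate(340, 500))  # 68.0
```

A high bounce rate is the surfer’s way of saying ‘this page didn’t match what I was looking for’, which is exactly the signal a search engine can harvest at scale.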
That’s a ‘how long is a piece of string’ question, unfortunately, but there are some things we can say with confidence simply by examining our own surfing habits. Perhaps the best way to describe what happens when you start to read a page is that you scan for what I’ve dubbed ‘keyword prominence’. The topic of the page must become clear quickly in order to match your expectations, but it mustn’t try too hard. While that may sound like keyword density again, it’s a little more complicated, because humans don’t read a page the way bits of software analyse it. We scan it sequentially, and the topic reveals itself over time as that process unfolds, as does the actual quality of the content.
Keyword density, then, is far too simplistic to be an adequate way of detecting instant quality fails; to a human, even a low-density word suffering from a leery overabundance of decoration can be just as bad, because high-quality natural content doesn’t work that way. Overuse of the bold, italic and heading decorations so beloved of SEO experts can actually raise the page’s keyword prominence, which can be just as detrimental as page stuffing.

It gets even worse: current SEO software doesn’t read a page the way a human would. We process text linearly, and as we read, we build up a model of what we are reading, so related or lexically similar words can have the same effect as overusing a specific key phrase. A 100-word article in which 50 of the words are “cat”, “cat’s”, “cats” and “cattery” is just as bad as a keyword density of 50% for the single word ‘cat’. Those related terms essentially increase the keyword prominence by drawing attention to the topic, and can result in a page that looks ‘spammy’ to a human visitor. That process is a little like a local version of off-page back-linking: the variants are passing weight to the words they are associated with, in your mind at least. What else can we say? We read linearly, so if the topic doesn’t reveal itself fairly quickly, we’ll assume we’re on the wrong page, hit the back button, and try somewhere else. The position of words affects keyword prominence too, because well-written articles tend to feature the key phrases throughout, not just clustered at the start or end of the post.
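The variant-counting idea can be sketched in a few lines. This is my own illustration, not the Lexx plugin’s algorithm: the crude suffix-stripping ‘stemmer’ below exists only to group the “cat”/“cat’s”/“cats”/“cattery” family from the example, and a real tool would use proper lemmatisation:

```python
def naive_stem(word: str) -> str:
    """Crude stemmer for illustration: strips a few common English suffixes."""
    for suffix in ("'s", "tery", "ies", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def variant_density(text: str, keyword: str) -> float:
    """Density that counts morphological variants of `keyword` together."""
    # Keep apostrophes so possessives like "cat's" reach the stemmer intact.
    words = [w.strip(".,!?;:\"").lower() for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    target = naive_stem(keyword.lower())
    hits = sum(1 for w in words if naive_stem(w) == target)
    return 100.0 * hits / len(words)

# Four surface forms, one underlying topic: all count towards 'cat'.
print(variant_density("cat cat's cats cattery dog dog dog dog", "cat"))  # 50.0
```

A per-word count would score each variant at a harmless 12.5% here, while the variant-aware measure reveals the 50% topical saturation a human reader would actually feel. A fuller prominence score would also weight where on the page those hits fall, per the positional point above.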
The Lexx Plugin for WordPress uses these observations to ensure you don’t accidentally fall over your own feet, metaphorically speaking. The marvellous thing is that Lexx takes all the work off your hands. No longer do you have to figure out independently what topic your page is targeting; Lexx’s keyword prominence scores tell you that automatically, recalculating instantly whenever you save or update a post. Keyword prominence, combined with a few other ever-present metrics (such as readability, text length and so on), adds up to an overall quality score for your writing. Get that score above 70%, and you can be confident you haven’t made any of the beginner boo-boos that generally cause a post to be binned by the search engines. The keyword prominence tool itself is about as easy to use as it comes. There are no parameters or settings; it simply tells you what’s what, and offers helpful suggestions to make sure your post gets over that first hurdle and can begin to earn the traffic it deserves. Keyword prominence, although discovered by TigerStep, is open source, because we personally feel a better web with less spammy content is the way forward. To that end, here’s…
By: Dr Daria E. H. Chene