
The Perfect Search Engine

Dreaming about a perfect search engine is one of the things I like doing during those rare hours when I'm free from the rest of my duties. We all know that Google is far from perfect. It is also beyond any doubt that Yahoo! and MSN are even further from ideal. What if we could ask a wizard to wave his magic wand and create the perfect search engine for us – what would it look like?

The Obvious

Of course, for most SEOs the best engine would be the one that gave them the top ranking for every keyword they target, and quickly. Who doesn't want good rankings? The problem is that there is only one top spot in each SERP, so it's impossible to keep everyone happy.

So, let's look at the perfect search engine from the user's point of view.

The best search engine is, of course, the one that brings searchers what they're looking for. Does that always mean the site with the most carefully optimised keyword density, or the site with the greatest number of links? Of course not.

If the surfer is looking for information, then the best site is the one that gives the most comprehensive review of the desired topic. If the user's purpose is to find a shop selling shoes, the engine should come up with a shop that provides the information needed to choose what best fits their needs and size, plus a friendly interface that won't confuse them when they're ready to visit the "Checkout" page. If a respectable business owner is searching online to find, let's say, the best sub-contractor, wouldn't he expect to find the most reliable one, with a clean record and years of admirable, honest work?

The current engines aren't capable of such an in-depth assessment of the quality of the billions of web resources scattered around the web. Will the engines of the future be smart enough for this? Quite likely, they will. I firmly believe that the engines of the not too distant future will have to become, at the very least, semantic – or fail miserably and disappear from the scene.

Do all links vote FOR the site?

The search engines of today calculate the authority of each web resource by looking at the number of links that point to it from the rest of the internet. The more, the better, they think. Apart from being easily manipulated, this method has another important flaw: links often point to a site for the sole purpose of showing surfers how bad that resource is. The "miserable failure" Google bombing campaign against George W. Bush's official biography page has already become a classic example; there are many others of the same sort. Is it right to count such links towards the overall link authority of the resource? From a human's point of view, probably not. But can the engines tell good from evil? Not yet, and they never will unless they become truly semantic.
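As a thought experiment, here is a minimal sketch of the difference in Python. Everything in it is an assumption for illustration: the Link structure, and especially classify_sentiment, a hypothetical stand-in for the semantic judgement the engines don't yet have.

```python
from dataclasses import dataclass

@dataclass
class Link:
    anchor_text: str   # the clickable text of the link
    context: str       # surrounding copy on the linking page

def naive_authority(inbound_links):
    """Today's approach: every inbound link is a vote in favour."""
    return len(inbound_links)

def semantic_authority(inbound_links, classify_sentiment):
    """Count only links whose surrounding text actually endorses the target.

    classify_sentiment is hypothetical -- a stand-in for the semantic
    judgement the engines would need. It should return "positive",
    "negative" or "neutral" for a given anchor text and context.
    """
    score = 0
    for link in inbound_links:
        verdict = classify_sentiment(link.anchor_text, link.context)
        if verdict == "positive":
            score += 1
        elif verdict == "negative":
            score -= 1  # a "miserable failure" link votes AGAINST the site
    return score
```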

The end to SEO copywriting

When the engines become capable of reading into the actual meaning of web text, there will be no more need to keep those boring keywords in mind when writing copy for sites. Our only goals will be to keep it informative, persuasive, readable and pleasing. The engines will analyse synonyms, derivatives, grammar and, most importantly, the quality of the writing style, and choose the number one page based on all these factors, not on keyword density in the copy or in the title tag. There will be no more doorways, and web copywriting will become a much harder task than it is now. It is not hard at all to saturate copy with keywords; it is much harder to make it truly perfect – as perfect as the Perfect Engine will require it to be.
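It is worth seeing just how crude the current signal is. Keyword density takes a few lines to compute, which is exactly why it takes only minutes to manipulate. A rough sketch, not any engine's actual formula:

```python
import re

def keyword_density(copy: str, keyword: str) -> float:
    """Share of words in the copy that match the keyword exactly."""
    words = re.findall(r"[a-z']+", copy.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

# Saturating copy takes seconds; judging whether it is informative,
# persuasive and well written is the hard part the engines still lack.
print(keyword_density("cheap shoes, cheap shoes, buy cheap shoes today", "cheap"))
# -> 0.375 (3 of 8 words)
```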

The search engine engineers of today are already playing with semantics packages, and the latest results definitely show that the engines already prefer better-written copy. But doorway creators still flourish, and their machine-generated rubbish "pages", overloaded with thousands of low-competition search terms, still bring them traffic and, sadly, some revenue. This means the engines still have a long way to go before they win this endless battle and become really perfect.

True Link Authority

The current algorithms for calculating link authority leave something to be desired as well. The principle is simple: acquire as many links as you can, your authority will grow tremendously, and soon everyone will want a link from your site, thinking it will add value to their own work. It's a logical principle, but a bit too obvious and a bit too primitive. It can be (and is) easily manipulated.

These days, apparently, the engines are starting to realise that to be able to pass authority, a site has to have authority, not just tons of inbound links pointing to it from all over the online Universe. They now look at relevancy more closely than before, and give new links a certain "trial period" before those links start actually affecting the rankings of the target site. They are also starting to develop very complicated algorithms that analyse natural linking patterns. But these first steps in the right direction are timid and unsure; the engines tread these new paths very, very carefully.
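To make the "trial period" idea concrete, here is one way it could be modelled. Nothing here is documented by any engine; the 90-day ramp and the 0-to-1 relevancy scale are assumptions invented purely for illustration.

```python
from datetime import date

TRIAL_DAYS = 90  # assumed trial length; the engines publish no such number

def link_weight(acquired: date, today: date, relevancy: float) -> float:
    """How much authority a single link passes under a trial-period model.

    A brand-new link passes nothing; its influence ramps up linearly as it
    survives the trial, scaled by how relevant the linking page is to the
    target (relevancy in the range 0.0 to 1.0).
    """
    age_days = (today - acquired).days
    if age_days <= 0:
        return 0.0
    ramp = min(age_days / TRIAL_DAYS, 1.0)
    return ramp * relevancy

# A fresh, highly relevant link still counts for little...
print(link_weight(date(2006, 6, 1), date(2006, 6, 8), relevancy=0.9))   # ~0.07
# ...while an established one passes its full relevancy-scaled weight.
print(link_weight(date(2004, 1, 1), date(2006, 6, 8), relevancy=0.9))   # 0.9
```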

In an ideal world, what other factors should affect a site's authority, apart from the sheer number of inbound links? Obviously, age is one of them. Another is history: sites that kept the same owner throughout their lifetime should be more authoritative than those that were bought and sold now and again. Sites that were never banned or penalised are bound to be more respectable than those caught using dirty tricks, especially more than once. Many other things come to mind, such as the number of pages containing quality content (remember – semantic engines will know what that is), the freshness of the latest added pages, the loyalty of visitors through the years and the business reputation of the owner. An engine capable of incorporating all these (and many other) factors into its ranking algorithms simply cannot fail to succeed (perhaps at the cost of leaving us SEOs unemployed).
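A deliberately simplistic sketch of what such a composite score might look like follows. Every signal and every weight is my own assumption, with each input normalised to the 0-1 range; the point is only that link counts become one input among many.

```python
def site_authority(signals: dict) -> float:
    """Blend the factors discussed above into one score (all inputs 0.0-1.0)."""
    weights = {
        "inbound_links":    0.25,  # still counted, but no longer dominant
        "domain_age":       0.10,
        "stable_ownership": 0.10,  # 1.0 if never bought and sold
        "clean_record":     0.15,  # 1.0 if never banned or penalised
        "quality_content":  0.20,  # pages a semantic engine judges worthwhile
        "freshness":        0.10,
        "visitor_loyalty":  0.05,
        "owner_reputation": 0.05,
    }
    return sum(weight * signals.get(name, 0.0) for name, weight in weights.items())

# An old, clean, well-written site outscores a link-heavy newcomer:
print(site_authority({"inbound_links": 0.3, "domain_age": 0.9,
                      "stable_ownership": 1.0, "clean_record": 1.0,
                      "quality_content": 0.8, "freshness": 0.6,
                      "visitor_loyalty": 0.7, "owner_reputation": 0.8}))  # -> 0.71
```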

The awakening

When I suddenly woke up, the Perfect Search Engine disappeared, leaving me with Google and other engines that looked very familiar. I sighed and went back to my everyday routine: keyword research, SEO copywriting and, of course, links, links, links! I added a few keywords to a page, and it soon jumped from the 3rd page in Google to the 1st; the client was happy and placed another order.

It made everyone happy: my colleagues, the boss and, of course, myself. It was a cool, exciting sensation known to all SEOs, one that never loses its charm, no matter how many rankings you achieve.

Who cares about dreams, wizards and magic wands when life is so real?

Irina Ponomareva joined Magic Web Solutions Ltd. (UK) in March 2003. She has been acting as a webmaster, a developer and an SEO specialist ever since. Irina later launched Spider Friendly - www.spiderfriendly.co.uk - the autonomous SEO branch of Magic Web Solutions (UK), which provides SEO/SEM services.
