
The Future of SEO (part 4)

SEO is unlikely ever to be an exact science, since the techniques used to promote web pages are constantly evolving as the search engines refine their algorithms to deliver more accurate results whilst combating spam. The efforts of marketers to understand search engine algorithm secrets have resulted in something of a cat-and-mouse game between SEOs and engine developers, as the former seek to exploit loopholes in the algorithms.

Many SEOs examine and probe engine algorithms in meticulous detail, testing different coding techniques to identify optimal markup; conversely, others swear that content is king – meaning the intrinsic value of a page, and its resultant success in terms of SERPs visibility, is largely dependent on good copywriting and tight relevance to search terms. When key players at Search Engine Strategies conferences around the world disagree vehemently on the best approach to SEO, yet still deliver formidable results for their respective clients, it clearly illustrates that there is as yet no perfect optimization strategy, simply best practice.

This is made more complex by the engines themselves, who maintain tightly guarded secrecy over their algorithms and processing techniques. Until recently, web marketers targeted and saw most traffic from Google; matters have changed now that Yahoo! and especially MSN have joined the fray. As the battle for search supremacy became a three-horse race, search traffic has become more evenly distributed between the major players.

Tighter Algorithms

SEO experts – who may once have used dubious, perhaps unethical, techniques to identify and serve up different pages to meet engine-specific spiders – must now be far more careful in their approach lest they fall foul of complex engine spam filters and find their clients’ sites barred from the engines, or suddenly no longer featuring in the SERPs.

Google caused much anguish and outcry when it made fundamental changes to its algorithms in November 2003. The ‘Florida’ update saw thousands of once high-ranking sites all but vanish from the SERPs, and the SEO world was thrown into turmoil as people tried to figure out what had happened. Arguments still abound, but it is most likely that Google placed filters against anchor link text – keyphrases used in and around links to other sites – so that repeated use of a single phrase saw the link’s relevance devalued. Sites did not disappear altogether but failed to appear for the specified search phrase.

A side effect of the update was the proliferation of directory sites and link farms (sites built purely to generate outbound links – often for a fee). Google gradually corrected the flaw and today delivers remarkably accurate results. There have been further updates since, but none more shocking to the SEO world than Florida. A lesson was learned: don’t rely on dubious techniques to deceive the engines.

The engines themselves issue guidelines for SEOs; it is an unwise developer who ignores them.

Emergent Technologies

It is through the growth of the Web and a better understanding of its nature that both academics and businesses have come to shape the Internet. As the transmission and reception of information worldwide becomes more efficient, affordable and widely available, so the need to better categorise, store and retrieve knowledge has pushed storage and retrieval mechanisms beyond simple indices.

The rise in computing power has enabled more accurate search engine results because the heuristics and later algorithms responsible for indexing can process and deliver the information faster. Emergent technologies like the Mobile Web with its opportunities for instant, on-demand search results and information retrieval, shape website development standards and afford further website marketing opportunities as the skills to exploit new markets are learned and adopted.

Other technologies like RSS (Really Simple Syndication / Rich Site Summary) and XML feeds permit the transfer of summarised website content to third-party aggregators like FeedDemon, where fresh content is automatically flagged for viewing. Such dynamic, proactive initiatives are embraced by marketers.
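To make the mechanism concrete, here is a minimal sketch of what an aggregator does with an RSS 2.0 feed: fetch the XML, parse it, and pull out each item’s title and link so fresh content can be flagged. The feed below is a hypothetical example for illustration, not any real site’s feed.

```python
# Minimal RSS parsing sketch using Python's standard library.
# The feed content and URLs here are hypothetical examples.
import xml.etree.ElementTree as ET

feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Site</title>
    <link>http://www.example.com/</link>
    <description>Summarised site content</description>
    <item>
      <title>Fresh Article</title>
      <link>http://www.example.com/fresh-article</link>
      <description>A short summary for aggregators.</description>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(feed)
# Each <item> element is one piece of syndicated content.
for item in root.iter("item"):
    print(item.findtext("title"), "-", item.findtext("link"))
```

An aggregator would simply repeat this for each subscribed feed on a schedule, comparing item links against those already seen to decide what to flag as new.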

SEO will constantly adapt and respond to the quest for ever more reliable and relevant search results, irrespective of the platform or medium on which they are displayed; and as the search engines refine their processes, the opportunities to tease out a marketing edge through algorithm exploitation will gradually disappear. May the best page win.

Article, March 2005, by Sonet Digital, an Internet marketing company that specialises in SEO and PPC advertising. They are a contributor to Search and Go, a special features portal and directory that provides up-to-the-minute information on every subject imaginable. If you need info… Search and Go!
