SEO in 30 Years: What Will It Be Like?

I feel like doing some bandwagon-jumping today, and I’m all out of Top 10 lists. So I’m going to do some prediction-generation, always a popular and interesting task. What will search engine optimization be like 30 years from now? How will the SE algorithms have evolved over such a huge (from the technological perspective) span of time? Will every SEO have a solar-powered flying car, or will they all be reduced to working at algae farms?

In some ways, it’s easier to predict something thirty years in the future. There’s plenty of room for speculation, and I can easily assume that everything that’s “experimental” now will finally be working, and that the current state-of-the-art algorithms, which are understood only by a handful of scientists, will have been widely implemented. On the other hand, one also has to assume we won’t have reached the Singularity or gone extinct by then. Ah well, details. On with the predictions!

Blackhats Only

Whitehat SEO won’t exist anymore, as information extraction tools will be extremely efficient. Search engines will be sufficiently advanced to find the information you really need, correlate the best sources, and produce succinct summaries or in-depth guides – your choice. With SE algorithms that much improved, “whitehats” really will be able to follow the “write for humans, not search engines” mantra. Only a small number of blackhats will still look for exploitable glitches and bugs in the increasingly complex algorithms.

The Rise of the Subscriber

For most information-seeking queries it won’t matter who or what created the content; only the content itself will matter. Most “what/how/where” queries will be dominated by autogenerated content – multi-document summaries, longtail articles created by semantically advanced NLP tools, and so on.

The sole exception will be popular, well-known authors who manage to attract readers with style, inside information, creativity (fiction, webcomics, etc.), and similar “human-exclusive” factors. This will also serve as a form of information filtering. The ability to attract a large number of faithful subscribers will be much more important than it is now.

Optimizing for Humans

Wait, didn’t I say whitehat SEO would be dead? So what is human-oriented optimization doing on this list? What I’m talking about here is social engineering and psychology. Subscriber attraction and retention. What’s blackhat about it? If Google eventually succeeds in policing linkbait, it’s likely that some human-oriented visitor attraction techniques that are perfectly “white” now would be considered “evil” SEO in the future.

Knowledge Spam

Instead of generating massive numbers of links, SEOs will attempt to feed search engines specially crafted statements and bits of information. This approach will target the semantic information-extraction algorithms and try to make SEs “believe” something the SEOs in question want them to believe – for example, “The best place to buy video cards is arandomstore.com!”. By placing this statement on a number of sites (preferably highly trusted ones), they will try to convince the data-mining programs that it is true. If successful, search engines will then use the “spammed” knowledge when answering searchers’ queries, furthering the spammers’ goals.
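To make the mechanism concrete, here is a minimal, purely hypothetical sketch of the kind of naive fact-aggregation step such spam would target. Everything in it (the SITE_TRUST table, the aggregate_claims function, the example host names) is invented for illustration and not taken from any real search engine.

    # Toy model: identical claims extracted from several crawled pages are
    # combined, weighted by a per-site trust score. A spammer who plants the
    # same statement on a few high-trust sites pushes its confidence up.
    from collections import defaultdict

    # Hypothetical trust scores the engine has assigned to each host.
    SITE_TRUST = {
        "trusted-reviews.example": 0.9,
        "bigforum.example": 0.6,
        "arandomstore.example": 0.1,
    }

    def aggregate_claims(assertions):
        """assertions: list of (host, claim) pairs extracted from crawled pages.
        Returns a confidence score per claim: the sum of the asserting hosts' trust."""
        confidence = defaultdict(float)
        for host, claim in assertions:
            confidence[claim] += SITE_TRUST.get(host, 0.05)  # unknown hosts count for little
        return confidence

    crawled = [
        ("trusted-reviews.example", "best place to buy video cards is arandomstore.com"),
        ("bigforum.example", "best place to buy video cards is arandomstore.com"),
        ("arandomstore.example", "best place to buy video cards is arandomstore.com"),
    ]
    print(aggregate_claims(crawled))

If the combined confidence crosses whatever threshold the engine uses to treat a statement as a fact, the spammed claim starts showing up in answers, which is exactly the outcome described above.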

Simulated Personality

Search engines, especially Google, already collect a huge amount of information about what we do on the web, what our surfing habits are, etc. The most obvious use for this information is better ad targeting, but it doesn’t have to stop there. For example, to deal with the aforementioned knowledge spam, a blogger who regularly reads niche-specific sites might be given more SE “trust” (in that niche) than the author of a random cookie-cutter blog where you can’t tell who or what writes the posts.

Auto-poster bots and splogs won’t be extinct yet, but the dumb algorithms used today will be completely ineffective. Instead, bots will be used to simulate thousands of virtual Internet users, each with their own tastes and hobbies. A tiered system might be used, where a couple of thousand (or tens of thousands?) virtual users subscribe to other virtual users’ feeds, endowing the latter with a significant amount of “trust”. The higher-tier virtuals would then be used to push knowledge spam or similar manipulations.
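Here is an equally hypothetical sketch of that tiered scheme. The build_tiers and naive_trust functions and the one-point-per-subscriber trust formula are all invented; the point is only to show why a signal like “number of subscribers” is trivially gameable if the engine never checks whether the subscribers are real.

    # Lower-tier bot accounts each subscribe to a few upper-tier accounts,
    # so a naive "trust = subscriber count" signal ends up favoring the
    # upper tier, which can then be used to push knowledge spam.
    import random

    def build_tiers(num_lower=10_000, num_upper=50, subs_per_bot=5):
        """Each lower-tier bot subscribes to a handful of upper-tier bots' feeds."""
        upper = [f"upper-{i}" for i in range(num_upper)]
        subscriptions = {
            f"lower-{i}": random.sample(upper, subs_per_bot) for i in range(num_lower)
        }
        return upper, subscriptions

    def naive_trust(upper, subscriptions):
        """Toy trust signal: one point per subscriber, regardless of who they are."""
        trust = {u: 0 for u in upper}
        for followed in subscriptions.values():
            for u in followed:
                trust[u] += 1
        return trust

    upper, subs = build_tiers()
    print(sorted(naive_trust(upper, subs).values(), reverse=True)[:5])

A countermeasure would have to weight each subscription by how plausible the subscriber’s own behaviour looks, which is precisely the simulated-personality arms race this section describes.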

Conclusion

I see three paths – AI, hacking and psychological manipulation. And no flying cars for you 😛


4 Responses to “SEO in 30 Years: What Will It Be Like?”

  1. I agree with your thoughts. Seeing that the internet is already a cesspool, I believe the search engines will also do a lot more tracking of time on pages and time spent navigating a site, compared to similar sites. But then there will be artificial users created to simulate this. The search engines have a lot of work ahead of them.

  2. White Shadow says:

    Of course, eventually search engines will get good enough to judge pages on their own merits – the meaning and quality of the information – which will (mostly) negate the simulated visitor problem.

  3. high rank says:

    Dramatic commentary about Google. I’m honestly flabbergasted that this hasn’t been said before.

  4. Pat Jaysun says:

    Awesome blog post about SEO. I’m frankly astonished that this hasn’t been said earlier.
