Sit down; let us tell you a little tale about the Wild West…
And by Wild West, we mean the early days of marketing on the Internet.
From the mid-1990s to the mid-2000s, the Internet was just coming into its own. More people than ever had home computers, and businesses slowly began capitalizing on this by creating websites and online ads aimed at home users.
Because search engines and websites were fairly simple at the time, so was SEO. It started with having a domain name directly related to whatever you were marketing. Something like fun-toys-for-cats-from-hypothetical-brand.com would naturally rank higher than, say, hypothetical-brand.com.
Next came the oversaturation of keywords. Rather than writing high-quality content, writers simply had to create ANY content at all and stuff it with as many relevant keywords as possible. This included both base keywords and their variants.
Because most search engines were only capable of cross-matching words (known as exact match), this approach worked.
It was also incredibly spammy.
The First Evolution (Or How the West Was Won)
By around 2000, it was becoming very obvious just how much of a problem this approach caused. Results were rarely useful, and even when they were, you often had to go to the third or fourth page to find what you needed. Searching felt frustrating, annoying, and exhausting.
By about 2005, Google had had enough. Search use was dropping off because of the low quality of results, hampering the company’s ability to sell ads. Frustrated users were turning to competing engines instead.
So they decided to make a major change: keywording would still be relevant, but not in the same way. Instead, Google would focus on reasonable use of keywords, including where and how keywords were used (e.g., meta tags, H1 tags, and within the body of text).
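To make those on-page signals concrete, here is a skeletal HTML page showing the three placements mentioned above – the title and meta tags, the H1, and the body copy. The brand, keyword, and text are invented purely for illustration:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Title tag: the primary keyword appears once, near the front -->
  <title>Cat Toys | Hypothetical Brand</title>
  <!-- Meta description: shown in search snippets; keywords should read naturally here -->
  <meta name="description" content="Durable, fun cat toys for every budget.">
</head>
<body>
  <!-- H1 tag: one per page, echoing (not stuffing) the main keyword -->
  <h1>Fun and Durable Cat Toys</h1>
  <p>Our cat toys are tested for safety and built to last.</p>
</body>
</html>
```

The point of the shift was that a keyword used once in each of these places now counted for more than the same keyword repeated fifty times in the body.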
Writers who engaged in deceptive keywording tactics (like hiding text or keyword stuffing) in an attempt to manipulate results were penalized. Often, Google removed their websites from search engine results altogether.
Around the same time, Google also began to put more emphasis on linking strategies. The reasoning was that pages with more backlinks must be publishing higher-quality content. After all, why would people link to bad pages?
This was the first and most impactful change in the SEO and content writing industry. Rather than paying for cheap, churned-out, spun content, businesses had to scramble to find skilled writers who could write without angering the “gods.”
The Final Frontier
Search engine results did improve after these changes. Overall, results became better targeted to searchers’ needs, with fewer spammy links and irrelevant results. But there was still a problem: people were still gaming the system, especially with keywords and links.
When Google first placed more emphasis on linking, it underestimated how readily people would abuse those links. Marketers began creating link directories and link farms – pages that existed solely to host links back to their own websites and inflate backlink volume.
Obviously, this was a problem. In some cases, websites had 20,000 or 30,000 backlinks pointing to them, giving Google the impression that they were immensely popular. In reality, those links were fake or purchased from link farms.
The Start of the Future
In 2011, Google rolled out its Panda update. Panda used more contextual analysis to build a better picture of what really qualified as, well, quality. It also penalized sites for manipulative linking – including purchased links and links from directories and farms. Almost overnight, a significant portion of the marketing industry found itself sandboxed.
It gets worse.
After Panda, many websites that weren’t even using paid backlinks or link farms suddenly found themselves de-listed.
As it turns out, competitors were abusing Google’s system to get rivals de-listed, and there wasn’t much the victims could do about it.
Enter Google’s Penguin update. Penguin made the penalties for abusing keywords or links even harsher, but it also gave webmasters the ability to disavow links permanently, which made this kind of sabotage much harder to pull off.
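For reference, disavowing works through a plain text file uploaded via Google’s webmaster tooling: one URL or domain per line, with a `domain:` prefix to reject every link from an entire site. A minimal sketch (the domains below are invented examples):

```text
# Disavow file uploaded through Google Search Console.
# Lines starting with "#" are comments and are ignored.

# Disavow a single spammy page:
http://spam-directory.example/links/page1.html

# Disavow every link from an entire domain:
domain:link-farm.example
```

In practice, whole-domain lines are the common case, since link farms rarely limit themselves to a single page.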
By 2016, algorithms were smarter than ever. Keyword matching became much less effective; instead, Google started using a strategy called intent matching. How intent matching works is fairly complex, but essentially, it focuses not only on what is being searched, but also on why – the intent behind the query.
The Mobile Revolution
It was around the same time that Google enacted a massive crackdown on bad mobile website practices. Websites that refused to optimize for mobile saw a drastic drop in ranking. This was the first time we saw the search engine really factor in website design as well as keywords and contextual search information.
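As a concrete example of what “optimizing for mobile” meant at the most basic level: a responsive page starts with a viewport meta tag, so browsers scale content to the device width. The tag below is the standard form; everything beyond it (flexible layouts, touch-friendly targets) is up to the designer:

```html
<!-- Without this tag, mobile browsers render the page at desktop
     width and shrink it down, producing tiny, unreadable text. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Sites missing even this one line were among the most obvious casualties of the crackdown.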
Shortly afterward came another mobile-related ranking change. Pages that used mobile pop-ups (especially deceptive ones, or ones that were difficult to close) would also find themselves falling in rank or de-listed altogether.
In the search engine’s official announcement, they drew parallels between pop-ups and poor user experience, especially in accessibility. “Pages that show intrusive interstitials provide a poorer experience to users than other pages where content is immediately accessible,” they explained.
Fortunately, everyone’s favorite search engine also gave webmasters access to a handy tool to help them acclimatize. The mobile optimization test (found here) remains online even now. It’s a fantastic way to get a quick peek at your website’s mobile friendliness, even if it is an admittedly surface-level view.
Looking Forward to a Changing Future
That brings us neatly up to just about present day. In 2018, we’re seeing more change than ever on the horizon as search engines respond to increasingly complex technological demands.
Mobile, for example, will likely be a much larger focus as people make the shift from big, bulky computers to smaller handheld devices. Given that over 50 percent of all users view websites from a tablet or smartphone, that just makes sense.
Newer technologies that don’t currently factor into Google’s ranking system may also play a bigger role in coming years. These include voice search, chatbot-driven search assistance (Siri, Alexa), and social search signals. All three are at the forefront of search algorithm research, but most haven’t been developed or integrated enough yet to be useful to Google.
Content will remain key, but the way we digest and use it may change. We’re seeing a shift away from average-length content (500 to 800 words) and a bigger focus on short, snappy tidbits (under 400 words) and longform content (1000 to 4000 words).
You should also expect other forms of media, including photos, gifs, graphics, video, and even mobile games, to qualify as “quality content” in the coming years.
Ultimately, what this means for you is an even bigger focus on content quality, diversification, and skill. This is a good thing (even if it does make our job a little bit more challenging) because it encourages a higher level of expertise than ever before. And that makes it harder for shifty SEO “specialists” who rely on client deception to succeed against the real, hard workers in the industry.