SEO has existed in some form or another since the very beginning of search engines. As long as there has been a way to display websites in sequential order, there have been people trying to figure out how to get their sites to the top of the list. It is an endless, cyclical struggle wherein people develop strategies to game the system, and the system fights back. There have been ups and downs throughout the last 20 years, but over the last decade, much has stabilized. How has SEO changed, and how might it continue to change in the future?
For a long time, the search engines would essentially break down content into keywords and index it in that manner. If your page didn’t have a given keyword, then it wouldn’t show up for a search on that keyword. This led to webmasters stuffing every available field on their sites with keywords to rank for as many queries as possible. The search engines quickly put a stop to that practice, but in doing so, they diminished the utility of keywords in general. Even just a few short years ago, keyword density mattered a great deal. You would need, for example, exactly three instances of a given long-tail keyword on a page with 600 words of content for that page to rank for that query. Four uses read as spam, while only one meant your site simply wouldn’t show up.
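As a rough illustration, keyword density was just occurrences of a phrase divided by the page's word count. A minimal sketch (the function name and counting rules here are illustrative, not any search engine's actual formula):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Occurrences of `phrase` per 100 words of `text` (case-insensitive)."""
    words = re.findall(r"\w+", text.lower())
    hits = text.lower().count(phrase.lower())
    return 100.0 * hits / len(words) if words else 0.0

# 3 uses of the phrase across 20 words of copy
page = ("Our running shoes guide covers running shoes for trails, "
        "roads and racing, with our running shoes picks for every budget.")
print(keyword_density(page, "running shoes"))  # → 15.0
```

By this kind of arithmetic, the hypothetical page above would have read as over-optimized; the era the article describes was one of tuning such numbers by hand.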
With the rise of semantic search, the power of the exact-match keyword has waned. You still need keywords, and keyword research is still important, but you no longer need to agonize over keyword density.
Links are how the search engines discover new content. They act as votes of confidence from one site to another. Their anchor text is a powerful location for keywords. All of this was true in the past, and it remains true, but the power has diminished. Links are important, but they won’t single-handedly make or break a site unless that site is in a very precarious position.
There are even hints coming from Google that links may be on the way out. They will never cease to matter, but they may be supplemented by the implied link that comes with an unlinked brand mention. It’s a distinct kind of link power, one that builds on what already exists and lets Google refine how links work.
The early web was full of search engines. Some ran specialized searches, some used algorithms that let webmasters rank highly when they couldn’t on other engines, and some simply struggled to compete in a world with the big three. Since then, dozens of search engines have died, been acquired by the larger companies or fallen into disuse. Google dominates the field, with Bing as the runner-up, and few people even know the others ever existed.
A major part of Google’s push in recent years has been to cut off every possible black hat technique at the root and focus on providing quality content to users. One way Google has done that is by emphasizing content quality above nearly everything else. That demand for a steady supply of unique content led to a period when webmasters used text-replacement tools to spin one article into dozens that looked unique but provided no unique value. Content spinning was a huge industry for a few years, but it has since become harder and harder to pull off. Google is wise to the game and knows how spinning works; anything technology can do, technology can reverse-engineer and detect.
As keyword emphasis declined and spinning was cut off, the emphasis shifted to pure, high-quality content. Sites can now ride a steady stream of high-quality content to a position at the top. Links, keywords, social media: it’s all incidental if the content is good enough. Granted, some niches are so competitive that content alone won’t do it, but it’s a far cry from the way the Internet worked just a few short years ago.
2004 was the year that Facebook was founded, and since then it has dominated the world of social media. Likewise, social media has come to dominate the fields of SEO and Internet Marketing.
Once again, it comes back to Google and the desire to serve users the content that best fits their needs. To promote a brand as one that provides the content users need, social media became a valuable tool. A brand can establish a reputation on social media and leverage that trust to bring in users. Those users find valuable content and promote the site in turn, which earns higher rankings based on its traffic and the value of its content.
Long ago, when keyword stuffing was still relevant, meta tags came into prominent use. When Google penalized sites for stuffing keywords in the content, webmasters moved those keywords to the otherwise-hidden meta title and description. Google penalized those as well, and meta tags lost much of their utility.
Later, meta tags regained importance as they became the foundation for the title and snippet served up in search results. The ability to customize those fields became invaluable, and meta tags reached new heights of value.
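The fields in question are ordinary markup in a page’s head. A minimal sketch (the title and description values here are made up for illustration):

```html
<head>
  <!-- Typically shown as the clickable headline in search results -->
  <title>Example Widgets – Handmade Widgets Since 2004</title>
  <!-- Often used as the descriptive snippet beneath the headline -->
  <meta name="description" content="Handmade widgets, shipped worldwide.">
</head>
```

Because these two tags feed the listing a searcher actually sees, customizing them per page is where their modern value lies.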
In addition to building trust and reputation as a brand, many people have discovered that the same can be done as an author. The producer of content can build their own online reputation and grow an audience that will follow them from site to site, blog to blog and social profile to profile. Google even encouraged this with the Authorship system, which is proving to be more and more valuable as time goes on. Even with the recent removal of an author picture, it’s more valuable than not having authorship.
In 2004, one of the best-selling mobile devices was the Motorola RAZR. It has been less than a decade since the first smartphones hit the market, and they have since exploded into widespread use. Voice search, semantic queries and responsive design have all sprung up since then; every mobile visitor arrives thanks to something born of the rise of mobile computing. It’s not stopping, either; mobile traffic is steadily overtaking desktop traffic, and wearable tech promises to be the next innovation.
Google released Google Analytics in the mid-2000s and the world of SEO has been vastly improved ever since. The ready and easy accessibility of the information presented by analytics suites has made it easier than ever for businesses to track everything from who visits to how long they stay and what they click.
Of course, all of the penalties and neutered strategies still exist. Some have even become tools for black hat SEOs to use against legitimate sites. If paying for 10,000 links will penalize your site, paying for 10,000 links to a competitor’s site should in theory penalize their site as well. Of course, there are caveats to everything; negative SEO only works on new, fragile and untrusted sites. Still, the core concept of negative SEO is a new development based on the strict policies implemented by Google in the last decade.