When I first started working in SEO, it was easy to find success.
In the days before Google, all you had to do was look at the pages already ranking for your target term and copy their on-page tactics.
Mere imitation pretty much assured you a decent ranking in the SERPs.
In fact, I remember more than a few instances where inappropriate results showed up for innocent queries because nefarious practitioners used cloaking to serve up copies of high-ranking pages. They fooled the search engines of the day into showing porn for queries where one would not expect to find it.
Things have evolved since those days.
I don’t need to tell you that today’s search algorithms are complex.
The ever-evolving factors that determine how a site ranks are in many ways unknowable.
This does not mean that success is unattainable, though; quite the opposite.
Success in search engine optimization is completely within reach of anyone who is willing to put in the work.
But having complete confidence in the exact factors that cause a site to rank is not usually possible – nor is that knowledge necessary.
For years, SEO professionals complained about peers who "didn't do their own work" and weren't running their own tests.
If you look at articles from the past with advice on how to be an SEO pro, there is a good chance that you will find some form of “test everything for yourself.”
In some ways, this advice is still valid for new SEO professionals.
Thinking through the complexities of testing a theory in SEO will help you to become better at the craft.
Implementing a test can improve both your analytical and implementation skills.
But the days of having significant confidence in repeatable and sustainable results based on what you learn from self-testing are in the past.
Testing for anything but the simplest tactics is too complex for most organizations, and even those with significant resources must take learnings from in-house tests with a grain of salt.
The relevancy of test results depends on a number of factors.
It’s important to understand how the test was set up and if there are any other variables, besides the one being tested, that might influence the results.
Fluctuations in the search engines’ algorithms can affect the results of a test.
Current events that relate to the domain can affect the results.
Actions by competitors can affect the results.
You get the idea – it’s hard to create a test in a vacuum, so in most cases, the results may not be caused by what you think.
I’ve talked about Google’s “baby algorithms” in the past.
In short, the baby algorithm theory holds that Google applies slightly different variations of weighted factors to sites in different verticals.
In other words, what works for one site may not work for another.
The variation in how Google weighs specific factors for different queries makes universal testing results very difficult to achieve.
Back in the early days of SEO, we typically looked at the top-ranking sites and tried to emulate their formulas. (There are several tools on the market that still attempt to work in this way.)
The problem is that it’s impossible to know if replicating the efforts of a high-ranking competitor will even work for your site.
SEO professionals who get bogged down in minutiae like word counts, keyword density, title tag length, and a myriad of other trivial details frequently waste a ton of time working on items that don’t matter.
And when we start looking at various SEO experiments, we find that any single test is typically not enough to justify resource-intensive and time-consuming efforts.
The story goes that Google’s link-based algorithm was based on the world of academic bibliographies.
Google’s founders realized that if a particular study or paper was frequently cited in academic papers, that particular piece of research was more authoritative than a research paper with no citations.
So Larry and Sergey adapted that concept in the PageRank algorithm, using links as a substitute for citations.
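The links-as-citations idea can be sketched in a few lines of code. This is a toy illustration of the original PageRank concept, not Google's actual implementation (which has long since evolved far beyond it); the page names and link graph here are invented for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank via power iteration.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> score, where scores sum to 1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outbound link acts like a citation, passing along
                # a share of the linking page's authority.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page site: "home" receives the most inbound links.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
    "orphan": ["home"],  # links out, but nothing links to it
}
ranks = pagerank(graph)
```

Run on this graph, "home" scores highest because three pages link to it, while "orphan" scores lowest because nothing cites it, mirroring how a frequently cited paper carries more authority than one with no citations.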
In the world of SEO, we can use the power of our SEO village to separate the wheat from the chaff when it comes to experiments and tests.
But it takes more than just chatter to verify that tactics cited in SEO experiments are valid.
If the results of a test can be repeated by several other SEO pros, those results are worth noting and most likely worth implementing on your own site.
This underscores the importance of being involved in the overall SEO community.
Participation in the community is one of the best ways to understand the tactics that work — and the ones that don’t.
Keeping up with the latest search engine optimization techniques is a daunting task.
It’s especially difficult to know what tactics will work for the site you are working on.
And anyone who has spent much time in SEO knows that bad SEO advice seems more common than good.
Despite the ever-changing nuances of the field, the basics of search engine optimization haven’t changed in a long time.
Most experiments and tests are, at heart, looking for loopholes or magic bullets to rank a site quickly.
These tactics are typically short-lived, if they work at all.
For most sites, mastering the basics of SEO is daunting enough without trying to implement esoteric tactics that may or may not work.
The basics include clean site code, relevant and authoritative content, and quality backlinks.
If you’ve mastered the basics, then you can look into some “fancier” stuff.
But for most sites, the basics are enough to show success.
Creating relevant search engine optimization tests that actually provide actionable findings is very difficult.
The search engines don’t tell us what their algorithms actually consist of, which makes it very difficult to establish causation in any test.
And in many cases, correlation isn’t enough to justify devoting significant resources to questionable tactics.
Most sites would be best served by mastering the basics of SEO.
And for those tasks beyond the basics, rely on the village. Get involved in the search engine marketing community.