When did SEO become so complicated?
It seems like only yesterday that a brand new company could launch a site, bake some bolded and italicized keywords into each page, prop up a few cookie-cutter articles on free blog platforms for instant backlinks, and watch the organic search traffic start rolling in. Back when SEO was this simple, the average site owner could read four hours’ worth of blog posts on the subject and feel reasonably equipped to do a decent job at it herself. Accordingly, the benefit she could expect to reap from turning her SEO over to a professional agency or consultant spoke more to a difference of degree than to one of kind.
Then, we started hearing about a bunch of new Google algorithm updates named after animals that were rendering these old tactics ineffective. Around the same time, we also started hearing about Schema.org and rich snippets, and the Knowledge Graph, and “semantic search”, and “entity search”, and the SEO impact of social media marketing, and how Content Was King… suddenly, and seemingly all at once, SEO wasn’t simple at all anymore.
What big picture is drawn by the convergence of all these small changes? Essentially, it’s this: SEO has become more complicated — and, frankly, harder — because it has been forced to grow into a more honest enterprise. Doing what’s right is always tougher than doing what’s easy.
Beginning in 2011 and continuing right up to the present day, Google has aggressively toughened its quality guidelines for site content and backlinks, conducted human polling to enhance its understanding of what distinguishes a good webpage from a bad one, enriched the appearance and substance of its search results pages, and invested heavily in curating a massive, still-growing database called the Knowledge Vault that has vastly increased its algorithm’s power to understand meaning. By working along all these axes of search at once, Google has not only forced the entire practice of SEO to conform to newer and loftier expectations — effectively slamming the door on the traditional tricks of the trade — but has also greatly expanded the range of strategies and tactics that comprise the practice of SEO, which, to continue the metaphor, amounts to opening some windows.
The upshot for searchers of all these Google renovations was plain to see: you likely noticed around this time that the search experience was improving across the board. Junk pages started getting downranked, query refinements started mattering less, more information of value started appearing directly on your search results page and saving you a click… in short, you started getting better answers to your questions, and getting them faster.
But the upshot for site owners is that proper SEO today is arguably a bigger and more complicated job than ever before. At this point, SEO comprises so many tasks (a large percentage of which must be performed on a continual basis) that hiring a professional might feel like the only way to be sure that you’re even keeping track of everything required of you, let alone executing on it all successfully. The only truly surefire strategy now is the complete one, the one where you advance along every conceivable avenue simultaneously. But with avenues this numerous, and Google still redrawing the map a few times every year, where do you begin?
Let me start by arguing, transparently, in favor of hiring a professional. I cheerfully disclose that I am one of those SEO professionals, and I say “cheerfully” because I think this stuff is neat and that SEO is worlds more interesting in the current search landscape than it has ever been before. It used to feel like we were simply trying to trick a machine that was plainly dumber than we were. Now, it feels like we’re having honest, candid exchanges with a machine whose intelligence (at least in practical, task-oriented terms) is rapidly catching up to ours, and we’re both teaching it and learning from it as we go. But in keeping with the above, there really is so much to proper SEO these days that deputizing somebody to do it — or at least portions of it — on a more-or-less full-time basis is, to my mind, the best way to go.
But say you don’t have the budget to hire a full-time agency or consultant, or you don’t want to grant access to your site data to anybody beyond your walls, or the DIY spirit runs deep in you and you just feel like doing it yourself and learning something as you go. What exactly does it take these days to optimize your brand for organic search? Let’s break it down into nuts and bolts.
The On-Page Factors
A list of on-page SEO considerations is naturally going to begin with content optimization, which in and of itself isn’t all that hard. It has the added benefit of falling entirely within a writer’s purview, which means you could do most of the work yourself and wouldn’t need to consult your web developer until the time came to push it live.
In a nutshell, content optimization is mostly about how you research, select, and ultimately place your keywords. The essence of this task is determining the potential search queries that correspond most closely to your desired audience, and convincing Google of your site’s relevancy to those keywords on a page-by-page basis. See this helpful eBook for a thorough tactical rundown of content optimization best practices.
The remaining on-page (or, more broadly, “on-site”) factors are all significantly more science than art, and many require specialized knowledge. These are the tasks that comprise what is usually referred to as “technical SEO”. Let’s go over some of the most significant ones.
First on any technical SEO agenda is securing your site’s pages against the risk of duplication. The danger of page duplication is simple: if a page on your site can be accessed at more than one URL, then the various URLs will circulate on the web independently and acquire links independently. This produces two negative results. First, the link equity that the page acquires will be divided among the various URLs at which it can be accessed, which means that the page will not be able to get full credit in search for its value. Second, each of its multiple URLs will be indexed by search engines individually, which means that the same page will be forced to compete against itself for rankings on the same search queries.
Scenario: The same page (Page A) exists at three different URLs — say, http://example.com/page-a, http://example.com/page-a/, and http://example.com/Page-A (notice how small the changes are… a forward slash, a capitalized letter… easy mistakes to make).
- Page A has received 100 links from unique, high-quality external pages, but these 100 links have been spread among the three versions.
- Page B exists at only one URL, and has received 60 links from unique, high-quality external pages.
Search engines view Page B as having the superior link profile, even though Page A has earned more links overall.
You can best protect against accidental duplication by writing rules in your .htaccess file that force all URLs to resolve in a single format (e.g. all in lowercase, all with or without a trailing slash, and all with or without the WWW prefix), and by ensuring that your development or staging site doesn’t remain live and indexable after launch. Also, if your site serves more than one country and/or serves content in a range of languages, an additional category of multilingual/multiregional SEO considerations comes into play. These considerations will influence the way you choose to structure your URLs, and will also demand that you make use of “hreflang” tags to help Google understand when two pages are region-appropriate translations of one another rather than duplicates.
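As a rough sketch of what such canonicalization rules might look like on an Apache server (the domain and specific conventions here are illustrative assumptions, not a drop-in configuration — adapt them to your own site and server setup):

```apache
# Illustrative .htaccess rules for forcing a single URL format (Apache mod_rewrite).
# Hypothetical example: forces non-WWW and no trailing slash; your conventions may differ.
RewriteEngine On

# Redirect www.example.com/* to example.com/* with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

# Strip trailing slashes from URLs that aren't real directories
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

Note that forcing lowercase URLs generally requires a RewriteMap defined in the main server configuration rather than in .htaccess alone. For the multilingual case, an hreflang annotation is a simple link element in each page’s head — for instance, `<link rel="alternate" hreflang="en-gb" href="http://example.com/uk/" />` (URL hypothetical) — telling Google that the referenced page is the British English version of the current one.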
Search Engine Accessibility
There are certain features of a website that serve no purpose outside the scope of SEO, but that are critical for providing search engine crawlers a means of orienting themselves, as well as instructions regarding the value of each page to the organic search audience. Truthfully, a list of every such feature currently known would run very long indeed, but the two most critical are:
- a dynamic XML sitemap — a clean and precise XML-coded list of every URL on the domain, containing timestamps of each page’s most recent update and a webmaster-designated “priority” score indicating that page’s relative importance to the site, and
- a robots.txt file — a simple text file that, if properly formatted, specifies pages that are off-limits to search engine crawlers (admin pages, or pages behind the paywall, typically), and also indicates the location of the XML sitemap.
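As a minimal, hypothetical illustration — the URLs, dates, and priority values below are placeholders, and in practice the sitemap should be generated dynamically by your CMS — the two files might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2015-03-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://example.com/about</loc>
    <lastmod>2015-01-15</lastmod>
    <priority>0.5</priority>
  </url>
</urlset>
```

And a matching robots.txt, which blocks crawlers from the private areas of the site and points them at the sitemap:

```text
# Hypothetical robots.txt: paths are examples only
User-agent: *
Disallow: /admin/
Disallow: /members/

Sitemap: http://example.com/sitemap.xml
```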
At the very least, you’re going to want to make sure your site is armed with these two documents. Moreover, if admin access to the site is shared by a large number of different people, then part of your job, as the site’s webmaster and SEO manager, will be to check in periodically on the proper functioning of these files, to be certain that nobody else has accidentally made changes to them that might disrupt the site’s search visibility.
My note above regarding the human polling that Google has been doing has its most direct, specific consequence in the arena of page speed. It has already been nearly five years since Google became so thoroughly convinced of the value of fast page loads to a positive user experience that it not only made page speed a ranking factor, but publicly announced as much. Google so seldom explicitly names a ranking factor that we in the SEO community have no choice but to listen with the utmost attentiveness when it does. Make no mistake about it: you must maximize your page load speed in the interest of your organic search visibility.
This is verily just the tip of the on-page SEO iceberg, and that’s not even the scary part. The scary part is that on-page SEO is, now more than ever, not the most important kind of SEO. This is not to say it isn’t indispensable — you simply aren’t doing SEO properly if you aren’t maximizing your site’s ability to harness the authority it has earned and bring it to bear on the keywords it’s chosen to orient its content around — but what about the work that goes into earning that authority in the first place? That’s the domain of off-page SEO, which is significantly harder to effect change in, for all of the following reasons:
- it depends on things outside your direct control (most notably, the actions of other sites);
- the factors that make it up grow in number and diversity all the time, and never seem to shrink;
- it is never in any sense “done”, and requires a constancy of effort in monitoring, outreach, and production of new content;
- the creative output that it demands means that to do it properly requires that a company’s SEO team be joined at the hip with its product marketing, design, advertising, and PR teams, at a minimum.
The Off-Page Factors
The individual activities that could be argued to fall under the umbrella of off-page SEO are too numerous ever to have hope of listing, so what I’m going to do instead is describe the overall aim of off-page SEO as thoroughly as I can. That aim can be expressed in one single phrase:
Authority-building.
No, not “link-building”. That term betrays an outdated view of SEO, from a time when search engines’ assessment of a webpage’s quality depended entirely on how many backlinks it had earned. Now, search engines are not only tougher than ever on link quality, but are also considering a range of factors outside the link graph in assessing a given page’s worth (or trustworthiness, or “authority”). Your job is to keep up with these factors, and execute on them: all of them, all the time.

The search engines had a good reason to ascribe such weight to backlinks, as they did from the dawn of the PageRank algorithm until very recently: for all that time, there was nothing else to go on. Until roughly the early 00s, if you really liked a webpage and wanted to tell people about it on the web itself, your only course of action was to mention the page on your own site and link to it. If you didn’t have a website, you could either decide it was worth creating one just to tell people about this page you liked, or you could simply spread the word by old-fashioned word-of-mouth, which is what the vast majority of people did back then. Then, a profusion of new, simple, user-friendly blog platforms in the early 00s lowered the web’s barrier to entry considerably, democratizing the site quality conversation in the process.
But another, bigger revolution followed that one a few years later: the rise of social media. This is how people express positive sentiment for a webpage now: by spending a half-second firing off a link to it on Twitter, or giving it a “Like” on Facebook, or (admittedly, more rarely) a +1 on Google+. Search engine algorithms have kept up with these changes, and it started showing most notably around the time of the Google+ launch in the summer of 2011, when it became clear that +1s on Google+ led directly to greater visibility in personalized search. Certain voices in the SEO community have surmised that the true purpose of Google+, from the beginning, may have been to collect a vast trove of personal data that could be used to improve search results.
Not long thereafter came the Google Penguin algorithm update of April 2012 (one of the animal-themed ones I alluded to before), with the result that a positively huge chunk of the links on the web were devalued overnight, and the sites on the destination end of those links got downranked, often severely. This was a proactive effort on Google’s part: a declaration of war on link farms, link exchange schemes, and all the other too-easy means of procuring backlinks that had been allowing poor-quality sites to buy their way to the top of search results pages. So, quite suddenly, not all links were created equal. Google was better than ever at distinguishing a good link from a bad one, and was actively rewarding the former while punishing the latter. Link quality began to matter as much as quantity.
Accordingly, where we used to have a simple task called “link-building” — trying to get as many links as possible to point to your page, by hook or by crook — we now have “authority-building”, a multifaceted slate of tasks that converge to inspire confidence, on the part of users and, by extension, search engines, in your page’s ability to speak with authority on the topic of its choosing. What this requires of SEO professionals now is that we work not to construct, but to attract high-quality inbound links and social media recommendations.
And my oh my, that is not all there is to it. Authority-building work begins with link attraction, but also includes:
- securing opportunities to post relevant and substantive content on well-liked third-party sites (“guest posting”);
- monitoring your earned media to maximize the online buzz you can wring out of it (including getting links added to it through honest outreach);
- building active, loyal online communities where your brand is openly discussed in a larger topical context by real human beings.
These tasks are truly never-ending: when you’re not actively engaged in one, you’re pursuing the leads that will give rise to the next one. That seems like a full-time job in itself. Considering these authority-building measures in addition to the on-page work listed above paints a picture of SEO as a terrifically complex, fast-moving, and strenuous line of work, which is exactly how I view it. If you’re going to take your site’s SEO into your own hands, you must be aware of the fullness of what comprehensive SEO requires of a brand in today’s landscape and be prepared to prioritize according to the limits on your time. Alternatively, hiring a professional agency or consultant will ensure that you have an expert (and likely an enthusiast) dedicated to the cause of your organic search visibility full-time.