
No Fear SEO

Unlike, say, gravity, nuclear fusion and thermodynamics, SEO is not a law of nature. It follows no absolute guidelines that hold true for all time. When I am asked how much traffic my optimization methods could generate, I offer an answer that will most likely be proven wrong: I will either be wildly successful – or not.

Luckily, I’ve mostly erred on the side of wildly successful, blowing most estimates completely out of the water… but there is always the fear of not succeeding as much as I think I should have.

I’ve encountered situations where I quadrupled traffic, and felt like a loser, because I expected twice that. My error was not in optimizing the website: my error was in the estimate.

In SEO, there are no guarantees.

Google, the 800 lb gorilla of search engines, defines its own laws of search engine physics. And those laws can change at any time. “Cloaking” – serving the search engines content that may differ from what the user sees – was just an SEO technique that worked, until it didn’t.

The shot fired across the bow of the SEO world came in 2006, when BMW was banned from Google results for cloaking. There was a collective “Whoa!” heard across the SEO world. A major website was suddenly unfindable. It also revealed the power that Google wielded. Like, who would actually type in “BMW.DE”? Not even a dotcom domain. That’s right: nobody. Or maybe just internal BMW folks. Can we just round that down to zero?

That marks the moment SEO officially became hard. All the black-hat techniques were now suspect, and using anything short of squeaky-clean white-hat SEO put websites, and clients, at risk.

In SEO, everything is fair game – until it’s not. Thin content was just an SEO technique, employed by the eHows of the world – until it was not. Until Google decided it was not. The rules of Google are mutable and mysterious. Which grants SEO professionals the mantle of “Artist,” as in “SEO is the Art of optimizing for search engines.”

I don’t think that’s particularly correct. I think it’s better to be a scientist about it: create a hypothesis, then test rigorously. Don’t trust what Google puts out in press releases and blog posts. For example, their post about being able to crawl Ajax websites was not correct. I personally experienced this on a client website. Developers were telling me that Google had stated its search engine bot could now read Ajax, and that many articles said the same. Emphatically, I responded:

‘That is incorrect. I know you read that, but in practice that is incorrect.’
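To be clear on what “test rigorously” means here, this is my own sketch, not anything Google publishes: fetch the page without executing any JavaScript and see whether the key content is in the raw HTML, which roughly approximates what a non-rendering crawler sees. It assumes Python with the requests library; the URL and marker phrase are placeholders.

```python
# Sketch: does key content exist in the raw HTML, or only after JavaScript
# runs? A plain HTTP fetch approximates a non-rendering crawler.
# (The URL and marker phrase below are hypothetical placeholders.)
import requests

URL = "https://example.com/some-ajax-page"   # hypothetical page
MARKER = "Acme Blue Widgets"                 # phrase the rendered page shows

resp = requests.get(URL, timeout=10)
resp.raise_for_status()

if MARKER in resp.text:
    print("Marker found in raw HTML: content is server-rendered.")
else:
    print("Marker missing from raw HTML: a non-JS crawler may never see it.")
```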

The same goes for human-readable words in URLs. I know from experience that human-readable URLs are a primary ranking factor. That strings of incomprehensible IDs just don’t fly, McFly. I’ve been challenged by developers on this point as well. They would cite reading ‘…numerous documents state this…’ But if you think about it in terms of human factors, it makes sense.

Google has no problem reading URLs that are incomprehensible gibberish to the rest of us – why should it? It’s simply code. But humans need words to communicate. Words describe – and a URL that is easy to say, easy to spell, and reads like a sentence has human factors that encourage interaction.
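And if a CMS insists on opaque IDs, putting words in the path is a solved problem. A minimal sketch, standard library only – the title and the ID-style URL in the comment are made-up examples:

```python
# Sketch: turn a page title into a human-readable URL slug,
# instead of exposing an opaque database ID.
import re
import unicodedata

def slugify(title: str) -> str:
    """Lowercase, strip accents and apostrophes, join words with hyphens."""
    normalized = unicodedata.normalize("NFKD", title)
    ascii_only = normalized.encode("ascii", "ignore").decode("ascii")
    ascii_only = ascii_only.replace("'", "")  # keep "buyers", not "buyer-s"
    words = re.findall(r"[a-z0-9]+", ascii_only.lower())
    return "-".join(words)

print(slugify("Blue Widgets: The Complete Buyer's Guide"))
# blue-widgets-the-complete-buyers-guide  (vs. /product?id=83f9c2)
```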

Google is like Pinocchio: a puppet who wants to be a real boy. In our case, Google is a machine trying to think like a human being – trying to figure out what matters most to human beings. An SEO strategist, on the other hand, is a human being trying to think like a machine that’s trying to think like a human being. What metrics would translate to human factors? Time on page implies content valuable enough to take the time to scroll, read and digest. Page views imply content depth – interesting enough to encourage a clickthrough. A bounce implies thin content, a lack of depth not worth further viewing. Playing to those metrics then plays to optimization techniques.
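To make those signals concrete, here’s a toy illustration of how they might be computed from session records – every number and field name below is invented for the example, not any analytics API:

```python
# Toy illustration: the engagement signals described above, computed
# from hypothetical session records. All data below is made up.
sessions = [
    {"pages_viewed": 1, "seconds_on_site": 4},    # bounce, barely read
    {"pages_viewed": 5, "seconds_on_site": 340},  # deep, engaged visit
    {"pages_viewed": 2, "seconds_on_site": 95},
]

bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
bounce_rate = bounces / len(sessions)
avg_depth = sum(s["pages_viewed"] for s in sessions) / len(sessions)
avg_time = sum(s["seconds_on_site"] for s in sessions) / len(sessions)

print(f"bounce rate: {bounce_rate:.0%}, "
      f"avg pages/visit: {avg_depth:.1f}, avg time: {avg_time:.0f}s")
```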

You have to read between the lines of recommendations. For example:

Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

Why? Why text links? Why not JavaScript hide/show links, or Flash links, or… whatever?

One reason is to serve the blind and visually impaired. We are not in a completely “abled” world. Screen readers for the blind stumble on JavaScript-only links. So that super-advanced website, all Ajaxy and shit, might be invisible to screen readers. Great, you’ve just made the blind more blind. How does that feel? There are other reasons besides ADA adherence, but being a decent website steward should be enough; ranking lower is just the stick to the ranking-higher carrot.
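You can check this one empirically. Here’s a rough sketch, standard library only with a placeholder URL, that lists the plain <a href> links in a page’s raw HTML – roughly the set a text-only crawler, or a screen reader’s links list, can actually follow. Links injected later by JavaScript won’t appear:

```python
# Sketch: list the plain <a href> links in a page's raw HTML -- roughly what
# a text-only crawler (or a screen reader's link list) can actually follow.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Keep only real hyperlinks, not javascript: pseudo-links.
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith("javascript:"):
                self.links.append(href)

URL = "https://example.com/"  # hypothetical site
html = urlopen(URL, timeout=10).read().decode("utf-8", errors="replace")
collector = LinkCollector()
collector.feed(html)
print(f"{len(collector.links)} static text links found")
```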

The goal of Google can be stated as: “No more crappy website results.” If you steer clear of a “traffic-at-all-costs” mentality, you may find that there are plenty of so-called “techniques” available to rank higher.

The problem is: Google doesn’t tell you the rules of its game. It’s not in their best interests to do so. Once they let it be known that links were key to their ranking algorithm, Search Engine Optimizers all of a sudden got into linking “techniques,” later known as “schemes.” SEOs are nothing if not opportunists. We don’t ask philosophical questions like: why are they called apartments when they’re all stuck together? We just live in them.

We are pragmatists: we use optimization techniques, all the while trying to stay on the right side of the law of search engine algorithms. We try to future-proof websites so that they do not fall prey when Google decides that what was once a “legal” technique is now “illegal.” At its best, our pragmatism extends to preventing damage as much as accumulating traffic.

In that sense, I guess it is an art: tiptoeing the knife-edge of the thin grey line separating decent search results from sucky ones. Google is the caretaker of results, with the power to withhold traffic, reserving the right to taketh away what it has so generously giveth.

For the SEO, the end goal is to eliminate the fear: proceed with effective strategies, and let go of the dread of traffic loss.

When a client asks what kind of traffic they can expect, I give them my best estimate based on past experience, weighing the quality of their website against my mental list of comparable websites, the competitive environment, and the state of Google at the moment. I state it knowing, in the end, that I will be wildly wrong, one way or the other.


SEO is Dead (Long Live SEO)

SEO stands for Search Engine Optimization. I get that question less and less, and am surprised now when I hear ‘what does that stand for?’ So, there you go.

I get this every so often. A content manager, digital director, or production manager du jour – once they find out that I specialize in SEO, they inevitably end up saying some version of: ‘I read that SEO doesn’t work anymore…’

Dead stare, thoughts rolling through my head: where to start, exactly? Should I start with how I just increased traffic by 900% in 4 months at my current consultancy? How I quadrupled traffic at a site that thought it had peaked at 3 million UVs?

That SEO should just be called Digital Optimization – or something. That it now includes content curation, site architecture, technical SEO and SEO usability. That SEO makes Social Media its bitch.

That after doing good work at one consulting job, when during salary negotiations I spat out what to me sounded like a ridiculous hourly rate, they responded: ‘What else do you want?’ Hmmm… daily desk massage? Endangered animal under glass? Et cetera, ad infinitum, ad absurdum.

I’ve since branched out to App Store Optimization (ASO), local SEO, reputation management, PPC and AdSense optimization, but SEO has always been, and will almost certainly remain, a long arrow in my quiver – at least as long as Google exists.

Let me give it to you straight: SEO still exists, and it’s still relevant. It’s especially relevant to the company whose traffic dropped over 90% (and which I recovered in the aforementioned 4 months). It’s still relevant to companies that want to dominate a segment that throws off tens of millions of visitors.

‘But we don’t want just ANY visitor, we want relevant visitors…’ they say, when they are just getting traffic in the hundreds per month.

Well, it’s not like I can optimize their site for just anything.

Let me reassure you: you can only rank for “blue widget” when your site is actually about “blue widget.” And ‘too many visitors’ is what I call a high-value problem. Most businesses call me in once they realize that they’ve exhausted all they know, and all they’ve been able to Google with their mad Googling skillz. After they’ve applied title tags, keyword density and XML sitemaps…
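For the record, those basics are not black magic. A minimal sketch of the last one – an XML sitemap built with nothing but Python’s standard library; the URLs are placeholders:

```python
# Sketch: a minimal XML sitemap built with the standard library.
# The URLs below are placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for loc in ["https://example.com/", "https://example.com/blue-widgets"]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```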

I’ve talked to a Marketing Director going on about how they’d applied schema, as if it were a silver bullet for ALL their website issues. As in: ‘Hey, I don’t know if you’ve heard, but we’re applying schema throughout our entire site. It’s going to be huge!’

I just nodded and said, ‘Sounds like you have everything well in hand,’ while thinking: ‘That’s it? What else you got?’

I know, it sounds arrogant. It probably is. But it’s arrogance bred from expertise. From trying to convince the corporate officer du jour why SEO is important, trying to find the spot where their knowledge ends and mine starts, and the best way to communicate across it.

I have a particular set of SEO skillz

All I really have to say is: SEO is irrelevant until it becomes relevant. Until the pain of philosophical questions like ‘If a website gets created and no one shows up, does it still exist?’ haunts their waking lives.

Word gets around. The best in many industries do not toot their own horn. But people find them. The person who needs the guy may not know the guy, but through social media they can meet the guy that knows the guy. And for SEO, that guy is sometimes me.

SEO is dead. Long live SEO.