Why SEOs Should Get Sites Banned

When Google announced the roll-out of Penguin 2.0 last week, it triggered a string of events nearly as reliable as the instinctive flocking of the salmon of Capistrano.

Thus began the time-honored tradition of SEOs blogging and tweeting about who got penalized…

What caused the penalty…

https://twitter.com/danhortonseo/status/337506989410230272

And how to avoid or recover from the update (preferably by buying whatever they’re selling)…

It’s so expected it’s almost a Pavlovian response at this point. Even more astounding is that each and every one of these things happens within the first 24 hours of every update!

Each of these reactions is troubling in its own right but for the most part, I get it. We’re marketers at heart, and it can be tough to pass up a chance to peddle your product.

Plus, we’ve been told for years that you have to blog and provide near instant commentary to become a “thought leader” in this space.

But there’s one portion of the post-update ritual that I really don’t understand…

Bragging About Ignorance

No matter what the penalty or update is, there will always be an SEO there to brag about how they’ve never had a site penalized – as if that’s something to be proud of.

Pushing Boundaries is How You Learn

The way I see it, getting a site (or even multiple sites) penalized or banned is an important part of an SEO’s education. If you’ve never tested Google’s boundaries to their breaking point (and even beyond), how do you know where they actually are?

If you only use the “whitest of white hat” SEO tactics approved by Google, how can you know whether things like buying or spamming links, spinning or scraping content, cloaking, etc. are actually dangerous?

Sure, you can watch a site rocket through the rankings on the back of spammy links and flame out a few weeks later, but watching from the outside doesn’t give you the kind of intimate details you get from being at the helm of that doomed site.

Watching Matt Cutts’ videos or reading about how content marketing is the wave of the future, you might get the impression that Google is getting good at detecting and devaluing paid links.

But if you actively tried to get a site penalized or banned using paid links, you might change your mind.

Most SEOs think link exchanges went the way of the dodo bird, but in fact there are thousands of sites in very valuable niches ranking right now on the back of exchanged links.

And while helping a site recover from a penalty or a harmful update can be incredibly enlightening, you won’t learn to spot the early warning signs that can accompany manual penalties.

Unchain Your Elephant

There’s an old proverb that claims if you chain an elephant to a stake in the ground when it’s young and weak, it grows so accustomed to the restraint that it will never test it when it’s older and more than capable of breaking free.

While that story may not be true when it comes to actual elephant behavior, it’s proven on a daily basis in the SEO community.

Google developed a set of constraints to prevent marketers from exploiting an algorithm, and thus conditioned many SEOs to stay in those chains forever. In fact, many SEOs embrace the Google shackles and deem breaking them to be unethical.

Risk vs. Reward

Now don’t get me wrong. I’m not saying that you should have client sites getting swatted by Google on a regular basis. But there’s nothing wrong with using aggressive tactics if the client is aware of the risks and has prepared appropriately for them.

Better yet, develop your own SEO testing environment far away from your client sites (different domain, web host, registrar, etc.) and include riskier “gray hat” techniques in your experimentation. (Obviously, avoid tactics like phishing or hacking that break the actual law, not just Google’s guidelines.)
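To make “far away” concrete, here’s a minimal sketch (Python, with entirely hypothetical domain names) of one basic footprint check you might run before experimenting: confirming that a test site doesn’t share hosting IPs with any client site. It’s only one signal among many, but it illustrates the keep-the-infrastructure-separate idea.

import socket

# Entirely hypothetical domains – substitute your own.
CLIENT_DOMAINS = ["client-one.example", "client-two.example"]
TEST_DOMAIN = "seo-sandbox.example"

def resolve_ips(domain):
    """Return the set of IPv4 addresses a domain resolves to (empty on failure)."""
    try:
        return {info[4][0] for info in socket.getaddrinfo(domain, 80, socket.AF_INET)}
    except socket.gaierror:
        return set()

client_ips = {ip for d in CLIENT_DOMAINS for ip in resolve_ips(d)}
shared = resolve_ips(TEST_DOMAIN) & client_ips

if shared:
    print(f"WARNING: {TEST_DOMAIN} shares hosting IPs with a client site: {sorted(shared)}")
else:
    print(f"OK: {TEST_DOMAIN} is on separate infrastructure (at the IP level, anyway).")

A clean IP check proves nothing by itself – registrar records, tracking codes, and themes can still tie sites together – but it’s a cheap first gate before any risky experiment.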

SEO at its core is simply a balancing act – finding the equilibrium between risk and reward that works best for you or your clients.

While you may ultimately decide to stay well clear of the self-serving and arbitrary line that Google draws in the sand, you’ll never know precisely where that line is unless you step over it once or twice.

So get to steppin’!

  • http://www.ryanmjones.com Ryan Jones

    I’m a huge proponent of always testing. When Penguin hit the first time, I was at SMX Tor, and while speaking I had ScrapeBox building links to a newly established spammy site I’d just thrown up – trying to see what it would take to get banned. (Note: it’s still ranking.)

    When the ads-above-the-fold update launched, I pushed it with another site, constantly tweaking the layout until it got hit, slowly changing one thing at a time to see how much I could get away with.

    EMDs? I’ve got a few of those I’m testing too.

    I don’t test with my huge money sites, and sometimes the spam sites become money sites, but the point is always testing.

    I trust SEOs who are able to achieve success for their clients AND for themselves a lot more than those who don’t have any personal involvement.

  • http://www.highrankings.com Jill Whalen

    Getting a client website penalized or banned is not only irresponsible, it’s probably something you could get sued over if your client has a good lawyer. If you want to play with Google, you should only be doing it with your own sites.

    That said, some of us don’t need to push Google’s envelope to see what works and what doesn’t. We have to help clean up the web spam messes that irresponsible “SEOs” (and I use the term lightly) have made, messes that have hurt their clients’ livelihoods.

    Nice way to treat a client, indeed.

    • http://directmatchmedia.com/ Ben Cook

      Jill,
      You can get sued for just about anything if your client is litigious. However, if you document what you’ll be doing, what the risks are, and what COULD potentially happen, you probably won’t get sued because your client will know exactly what’s going on.

      I agree, it’s BEST to push Google’s boundaries with your own sites. That’s why I recommended developing your own testing environment.

      As to your final claim that you don’t need to push Google’s envelope to see what works and what doesn’t, I would argue you’re wrong. Cleaning up after the fact doesn’t give you the same experience.

      It’s a bit like investigating a murder. You can come in after the fact, gather a lot of clues, and probably figure out the cause of death, but no amount of investigating will show you exactly what happened. Thankfully, we’re talking about spamming Google (something that ISN’T illegal) and that you can (and SHOULD) do without a guilty conscience.

      And just because you get a sob story from the site owner (who you perceive as the victim) doesn’t mean you actually have a clue who did what or why. Many of those clients’ livelihoods may have been harmed by their own actions, which they are now blaming on the SEOs.

      And finally, why do so many SEOs assume that anyone who breaks Google’s guidelines is treating their client poorly? I can assure you there is no shortage of clients who would be extremely happy for someone to spam Google if it meant that they could rank well. They’d even be happy if it meant they only ranked well for a little while. So let’s stop with this nonsense that associates breaking Google’s guidelines with treating a client poorly, shall we?

  • Butler

    Or go one better: never have a site penalised, but play a central role in the recovery of sites that other SEOs have.

    Best of both worlds?

    • http://directmatchmedia.com/ Ben Cook

      Butler, that isn’t one better at all. There is value in the experience of having a site penalized or banned. For example, only cleaning up sites that have been penalized or banned might give you the idea that Google is actually quite good at detecting spam and that it is very dangerous to use any “grey hat” tactics. However, if you’re the one actually doing the spamming, you’ll get an idea of thresholds, effectiveness, Google’s response times, etc.

      • Butler

        That is to assume that you need to personally pass the threshold in order to appreciate it, which is, I’m afraid, entirely false.

        I haven’t, but I do appreciate it. And better still, it’s demonstrable.

  • http://raventools.com Jon Henshaw

    Several years ago Google representatives started attending conferences that were geared towards SEOs. Two things happened.

    1. SEOs got better access to Google to voice their concerns and problems. Matt Cutts in particular has been extremely generous with his time and has helped many of us in the industry.
    2. Google was able to get a better handle on the exploits SEOs were taking advantage of, thus helping them know what to look for in order to improve their algorithm.

    With the exception of practices that are truly illegal or deceptive, like fake reviews and phishing, I think a paradoxical relationship exists between the Google spam team and SEOs. The more SEOs test and take advantage of ranking opportunities, the better Google can tweak and optimize its algorithm to work the way it wants.

    Fact: The Google algorithm and its results would not be as good as they are today without SEOs pushing the boundaries.

    The trade-off is that these tests, exploits, or whatever you want to call them are often short-lived. Several still remain, and if an SEO is profiting off of them, they would be an idiot to share them with anyone else. While I’ve never gone so far as to try anything that would get a site banned, I’ve certainly tested all kinds of techniques just to see what does and doesn’t work (nothing too risky or shady).

    If there’s any message to get from this, it’s to test everything, and don’t take what Google and other experts say at face value. Also, if you are testing some more risky stuff, tread lightly, because Google is very good at making associations.

  • http://www.Johnon.com john andrews

    I hate to break it to you Ben, but there are liars out there. Some of those who claim “none of my sites has ever been punished” and “clients who follow my advice never have trouble” are simply liars. Sometimes they are in denial (lying to themselves), and sometimes they are ignorant (they don’t watch the sites that get trashed, so they don’t know). One thing about humans – past behavior is indicative of future behavior. No one can escape that.

    And Google does use that as a tool, which is why I don’t like your simplified “test” advice. Setting up a test environment is very complex, and possibly impossible. One reason is Google’s seemingly heavy dependence upon humans and quick decisions that err on the side of caution.

    Ryan has sites that continue to win, despite spammy tactics. They’ve simply never been highlighted, or don’t represent a risk. Another person may publish a little site that is barely beyond the guidelines and get slapped. She may get her other sites punished as well, just as a safety measure. How does testing help you there?

    It’s almost like “seo testing” is the new Social Media… an unaccountable, potentially expensive/lucrative service offering that SEOs can sell without worry. After all, if testing didn’t reveal any problems, how could anyone have known?

    And by the way, check again on violating Google’s Guidelines not being illegal. That’s a grey area right now… with the recent computer fraud legislative actions it’s possible that actions which influence Google’s “algorithm” in ways that interfere with Google could be prosecuted as illegal. Maybe not yet… but who wants to go first?

    Also see the recent brass balls attitude of the FTC people… they think they can shape markets with threats of punishment, supported with “example prosecutions” (perhaps better called persecutions?). They then stretch those into whatever policy they think should be law (even when it isn’t).

    • http://directmatchmedia.com/ Ben Cook

      John,
      If Google begins to ban all sites that someone owns for breaking their guidelines that may well change things. However, I’ve not seen any reports of that and certainly haven’t experienced anything along those lines.

      Are you arguing that one shouldn’t do anything that violates Google’s guidelines?

      • http://www.johnon.com john andrews

        Go test some “shady techniques” and get a site banned. It may be that your demonstration of technique costs you trust, and that trust hit impacts your other activities. If the “testing” was in the same vertical as your other sites, I believe you’re more likely to have trouble down the road. If it’s in a different vertical, it wasn’t a test of the vertical’s tolerance for aggressive tactics.

        Show me an individual who can operate without a Google footprint, and I’ll reconsider my point that “testing” is far more of a bugaboo than it sounds. I believe a lot of the testing done is a complete waste of time and effort, PLUS it brings more risk.

        An exploit is an exploit; it is not a “technique”. Either you make use of it, or you don’t, but you don’t turn true exploits into practice in today’s Google.

        • http://directmatchmedia.com/ Ben Cook

          John, your comment has created an interesting train of thought… I wonder what it would take to operate without a Google footprint.

          I generally pay attention to things like domain registration information (which you have to protect from the point of registration), script/tracking-code similarities (don’t use the same tracking codes on all your sites, etc.), and using a variety of themes. However, most of that is intended to keep other SEOs from tracking what I’m doing, not so much to keep Google unaware. From what I’ve seen, the bigger risk is another SEO “outing” you rather than Google taking a “scorched earth” policy with individual site owners.
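          For what it’s worth, that kind of overlap is easy to audit mechanically. Here’s a minimal sketch (Python, with hypothetical URLs – adapt it to whatever you actually run) that fetches each homepage and flags any classic Google Analytics property ID appearing on more than one of your sites:

          import re
          import urllib.request
          from collections import defaultdict

          # Hypothetical site list – substitute your own.
          SITES = ["http://test-site-one.example", "http://test-site-two.example"]

          # Classic Google Analytics property IDs look like UA-12345678-1.
          GA_ID = re.compile(r"UA-\d{4,10}-\d+")

          ids_to_sites = defaultdict(set)
          for site in SITES:
              try:
                  html = urllib.request.urlopen(site, timeout=10).read().decode("utf-8", "ignore")
              except OSError:
                  continue  # skip sites that don't respond
              for ga_id in GA_ID.findall(html):
                  ids_to_sites[ga_id].add(site)

          for ga_id, sites in sorted(ids_to_sites.items()):
              if len(sites) > 1:
                  print(f"Shared footprint: {ga_id} appears on {sorted(sites)}")

          The same pattern extends to AdSense publisher IDs, affiliate codes, theme names, and anything else that leaves a telltale string in your HTML.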

          But I’m curious, what other things do you think Google uses to identify site owners?

          • http://www.ryanmjones.com Ryan Jones

            I’ve had sites get penalized and banned – and I’ve had them on the same server, same web host, and same IP, using the same GA and same AdSense as other sites that flourished. Anecdotal, I know – but I haven’t yet seen a case of a good site in full compliance with the webmaster guidelines getting penalized simply because the person who runs it also runs a spam site.

            That may change, but I haven’t seen or heard of it happening yet. (Sure, people have claimed it, but 5 minutes of investigating shows they were usually doing shady stuff on all of them)

          • john

            Google knows Ben Cook’s activity stream… from GA and G+ and other tracking on web sites in the Google realm, plus data that is available for purchase or trade. A glance at “the logs” would easily reveal Ben’s association with various web sites. Ask a “quality team” member about how difficult it is for them to get access to the logs… it’s not easy, but they can get access if evidence warrants it (to protect Google from risk).

            There is history of Google tracking web masters, looking at sites they manage, and questioning the trust granted to other sites they “own” when one is suspected.

            Ryan provided another example of his personal experience… and I will point out that Ryan is “known” to Google and Matt’s team, Ryan is with an Agency, Ryan works with big brands, etc.

            The point is not that Ryan doesn’t fit the model… the point is that there is no model, because Google makes human decisions and does so on a reaction basis… reacting to algorithmically detected signals, reacting to spam reports, reacting to public bragging, reacting to findings from inspections, or acting on future plans where it wants to shape webmaster behavior. Most of those are (vertical) query-specific, which makes for pretty tough “testing”.

  • http://myblogguest.com/blog/ Ann Smarty

    I personally think there are no “white-hat” SEO practices. I had an unnatural backlink warning come to a 100% clean website that never did anything wrong, and looking at its backlinks I said, “Really?” That’s when I stopped saying I’m proud of being a white-hat :)

  • http://twitter.com/n8ngrimm Nathan Grimm

    I agree and disagree.

    I agree that you learn a lot about Google when you push boundaries and get penalized. It’s really interesting to see how your site is affected and what stats correlate with the penalty and what it takes to recover. I’ve worked on several sites that were penalized for multiple reasons. I’ve also led several recoveries.

    While pushing the envelope makes you smarter, I don’t think it’s economically beneficial to push the envelope on a site that’s supposed to be valuable in the long term. The penalties I’ve experienced set back our work by one or two years. If we’d invested in a purely “white-hat” promotional strategy the entire time, I think we would have had a much better ROI over five years.

    If you don’t care about making money from a site five years from now, by all means, push the envelope.

    • http://directmatchmedia.com/ Ben Cook

      Nathan, perhaps I wasn’t clear enough. If you’re working on a site that you can’t afford to have banned, don’t get too close to the line. However, I would also argue that it’s already very risky to have a site that you can’t afford to have banned.

  • http://firestarterseo.com/ Skyler

    Ben, what a great post and great points… I think this is spot on. I have seen many people write about what to do when a site has been penalized who have absolutely no experience actually doing what they’re saying. Often they’re just regurgitating what they’ve read elsewhere to create content.

  • http://jamesnorquay.com James Norquay

    Nice article. The best SEOs I know are the ones who are heavily into affiliate marketing or high-risk areas such as gambling. Too many SEOs who talk the talk don’t actually do the work, which is worrying.

    • Butler

      That depends very much on your view of what comprises a ‘best SEO’.

      The porn, poker and pills link spamming stereotype’s best days are most certainly over.

  • Alan Bleiweiss

    hahahahaha This has got to be the funniest post I have seen in a very long time Ben. Good job making up a complete piece of crap story and claiming it’s true.

    • http://directmatchmedia.com/ Ben Cook

      Alan,
      I assume you’ve got your hackles up because I used your tweet as an example, but if you’re going to accuse me of making things up, please clarify what you think I’ve fabricated. If you disagree with my opinion, that’s fine. But please don’t imply that I’ve been untruthful about something.

  • http://www.greenlaneseo.com Bill Sebald

    I agree with testing – to John’s note above, I’d rather have some semblance of directional intel than a guess and a fear of Sherlock Holmes-style detective work. While I think it’s about time Google got more manual (again), it’s a big Internet. I don’t wear the tinfoil hat – but I’m probably just shy of it. I also don’t test recklessly.

    So I test. A lot. On test sites. I always have, at no risk to my big brand clients or any online business activity. The only things I would ever implement on my clients’ sites are things I’ve actually tested first. I truly believe in that (and I’m actually a somewhat skeptical person).

    Now with that said, I think you don’t have to push the envelope to learn if you have a trusted group of peers with practical experience. If you don’t have that authentic support group (and most blogs just ain’t it), I also recommend testing to find your own line.

    We can’t get swept up in fairy tales. We gotta go with what we really know, as well as we can know it, whether by online Socratic Method or actual experiments. People don’t question things enough anymore in my opinion.

  • http://www.mattevans.me Matt Evans

    Really interesting, and it clearly sparks a debate within the industry.

    Personally, I partially agree: you should always be running projects and your own websites to test what you can and can’t do, then use this to educate your team and eventually the client.

    I also believe you should offer the best performance for your client, so if they don’t care about the risks and want an off-site performance package with bought links – and if that still works – then give it to them!

    But I wouldn’t flirt with Google’s boundaries on a client’s website unless they specifically asked (and I’d alerted them to the risks).

    Definitely agree with testing your limits though, cool post man!

  • http://www.skiusainc.com Cady Haren

    Interesting post, Ben Cook. The problem, though, is that most of us don’t have the time or the resources to do so. I agree that the majority of us treat Google as the bible, but then you are only bound to lose if you choose to follow a path other than that guide.

    This may be applicable to data scientists who can invest time and resources unlearning and learning, but for a marketer like me, doing something like this is a strict no-no.

  • http://www.ryanmjones.com Ryan Jones

    I think the heart of the issue here goes deeper. I think most SEOs have a good understanding of what to do and what not to do in certain situations, but a lot of them don’t understand “why” they are or aren’t doing it.

    A common example was press releases. Everybody said to do them because when one gets picked up you get tons of links from the articles. SEOs took the “what” as “do press releases” instead of the “why” of “get authoritative news mentions,” and suddenly tons of free press release sites nobody reads sprang up.

    Testing, getting penalties, etc. helps SEOs understand more of the “why” instead of just the “what.”

    • http://www.greenlaneseo.com Bill Sebald

      I like the eloquence of this comment over mine. Well put.

    • john andrews

      No question that “practice” is a good thing. And every SEO should practice his craft, including the majority of SEO techniques his clients are not requiring.

      But practicing is different from testing in this discussion. You don’t (normally) have to “test” known things that might need to be optimized… like content and structure. Just do them. Optimize.

      You should test unknowns, and things Google says are not a good idea. And you should engage in SEO research. Conducting trials is part of research, but it’s not “testing.”

      So what about over-optimization? What about when Google allows something, just not THAT much? That’s where Google oversteps its authority, stifles innovation, and causes damage to the web ecosystem. There is no way to define a ceiling to optimization, unless you normalize it… hold a practice accountable to some standard reference. And if you normalize it, you are interfering with the market. Google does that… and it’s bad.

      So do you test that? Test how far is too far with known optimizations? You’d like to… but it’s reactive behavior from Google. Test it and you might just bring attention to it, and get it throttled. What did your test prove?

      I think the discussion is silly… if you find exploits that work, you have to work them. Period. If you want to call research & development “testing” then fine… but it’s a different discussion. The truth is, competing in the serps is the pure form of “testing” because competition pushes the top sites higher, until they fall (perhaps from such BS as “over optimization” penalties).

      But that’s not winning… winning is riding that edge and staying at the top.