This last month was all about Google Penguin. Because of that, I thought of asking some of the brightest minds in the SEO industry to share their opinions on what “Penguin” is and how it affected link building and SEO.
It is important to know that none of the interviewees knew who the other participants were. This makes the interview unique: each link builder shares his or her individual opinion on the Google “Penguin” update.
This is an in-depth interview made up of 7 questions answered by 13 link building experts. It covers almost everything regarding the “Penguin” update.
Here are the questions our link building experts answered:
- How would you compare the “Penguin” Update to the “Panda” Update?
- What does a “Low Quality Link” mean to you?
- Do you think Anchor Text was devalued, and if so … how do you think it is treated now?
- Have you witnessed any False Positives, regarding Google “Penguin”?
- How do you see Link Building 1 year from now?
- Do you think Google can handle the “Negative SEO” issue correctly?
- Why do you think Google named it “Penguin” ?
Ok. So now let’s meet our proud link builders, in the order they have answered the questions.
Note: For a better reading experience, click the name of the link builder you want to check, to highlight his/her answers.
Question 1: How would you compare the “Penguin” Update to the “Panda” Update?
Panda was much more of a widely felt update. It affected far more of the internet. That isn’t to belittle those who were hit by Penguin. Those who were hit by Penguin were hit hard.
But more sites were impacted by Panda than Penguin.
Panda seemed to be more about on-site issues – it looked like it interpreted things like bounce rate, thin content, etc. Basically, on-site spam. Penguin seems to be about off-site spam – bad, unnatural link profiles and so on.
To me, Panda was more of a knock on content, and Penguin was more of a knock on links. When Panda first rolled out the main issue everyone was talking about was duplicate content; with Penguin, it was low quality links. They both have similarities and they’re both more than just those simple issues, but it’s my way of summarizing it.
It seemed to me as if Google actually had three updates over the space of a week. They talked about the middle one as “Penguin” but were deathly quiet on the other two. This means that they covered quite a bit of ground and different people are affected in different ways. In summary – this update was all about addressing unnatural link profiles designed to manipulate PageRank and Google SERPs. It affected paid links, link networks, footer links and probably plenty more.
This is a really interesting question as the lines between the two updates are blurring. On a basic level Penguin is designed to be predominantly an off-page webspam policing filter focused on spam as defined by an army of human spam raters Google employed last year. The filter was then written to mimic their findings and enable the search engine to scale it globally.
Panda is a content based filter designed to weed out thin, valueless content created to ‘manipulate’ search engines. It also looks for duplication etc.
Penguin seems like the natural next step after Panda. Panda, as we all learned, was a low-quality algorithm change/filter that targeted on-page factors. Penguin seems to be a low-quality algorithm as well, to an extent, except it is now targeting the off-site SEO of websites. Part of this is the over-optimization section that we’ve all heard was coming (or interpreted Matt to be saying), which seems to be targeting overly aggressive anchor text, but part of it also seems to be the deindexation of low quality directories.
I think Penguin is interesting because unlike Panda this update appears to be targeting bad SEO practices. Panda was just trying to sort Websites into “high quality” versus “low quality” whereas Penguin is trying to improve Google’s enforcement of its spam policies. Panda went after Websites that were “Not Spam” in nature — just providing the bare minimum of value to visitors. Penguin went after Websites that were (perhaps in some cases unintentionally or unwittingly) crossing the line and putting too much emphasis on doing things to please the search algorithm.
With both updates there are many people complaining about how their sites were unfairly punished. I don’t think that’s really the case. I’ve had Websites affected by both updates and I can mostly agree with the downgrades, even if I cannot be absolutely sure my most recent downgrade was due to Penguin (it happened a few days earlier). Two of the three Websites where I have taken hits depend on content provided by other people. While that’s a small sampling, it suggests to me that you really have to take your role as Publisher and Editor-in-Chief for your Websites seriously.
Both of these updates were designed to demote Websites that depend on corner-cutting techniques for producing large amounts of content AND to influence search results placement. Penguin, especially, does a really good job of targeting links that were clearly and obviously created only to help Websites outrank other Websites in spite of what the ranking algorithms were trying to do (promote really good, authoritative content).
People easily find exceptions and get angry, but Google never promised these algorithms would clean up 100% of the mess or be 100% accurate, so people are criticizing Google for failing to deliver on promises it never made. Panda improved the quality of search results, and Penguin stripped many sites of the unfair advantages they were using to compete in the search results.
Panda was probably more successful than Penguin in achieving its goals, but if — as many of us suspect — there will be more Penguin iterations with improvements, then I think Penguin may become every bit as successful as Panda.
Firstly, I want to make it clear that a lot of what we know about the Penguin Update is based on educated guessing. “Educated” because there are some sparse hints and tips coming from Google reps, and “guesses” because we can only look at the profiles of penalized sites and try to guess what got them penalized. As future iterations of Penguin go live, our guesses will become more educated, as we will learn which actions help in getting out of a penalty and which make no difference.
That said, Penguin seems far more clear cut than Panda. Looking at the number of sites that were hit across different industries, and hearing from other people who are looking at even more affected sites, Penguin seems to be targeting low quality link profiles, while Panda is aiming for low quality on-site content. The fact that Penguin seems far more clear cut probably stems from it being much easier to say with certainty what a spammy link is, as opposed to saying what low quality content is. If I refer to the original list of questions published by the Google Search Quality team, which are supposed to serve as guidelines for assessing content quality, the answer to a question like “Would you feel comfortable giving your credit card information to this site?” can differ from person to person. A spammy link profile, on the other hand, is much easier to spot, especially if you are an SEO – people usually do not link to websites using exact anchor keywords (and they put those exact match anchored links into sidebars or footers even less frequently), so anchor profiles that look like keyword research reports are usually a good sign of manipulation. Due to these differences, I think Penguin penalties will be much easier to diagnose but harder to clean up, whilst Panda-penalized sites will be hard to diagnose but easier to fix.
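The “anchor profiles that look like keyword research reports” heuristic above can be sketched in a few lines of code. This is purely illustrative: the 40% threshold, the data, and the function names are all invented here – nobody outside Google knows what cutoff, if any, the real filter uses.

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the total backlink profile."""
    counts = Counter(anchors)
    total = sum(counts.values())
    return {text: count / total for text, count in counts.items()}

def looks_manipulated(anchors, exact_match_terms, threshold=0.4):
    """Flag a profile dominated by exact-match commercial anchors.

    The 0.4 threshold is a made-up stand-in; a natural profile tends to be
    dominated by brand names and bare URLs, not money keywords.
    """
    dist = anchor_distribution(anchors)
    exact_share = sum(share for text, share in dist.items()
                      if text in exact_match_terms)
    return exact_share > threshold

# Hypothetical profile: mostly exact-match money keywords, few brand anchors
profile = ["cheap widgets"] * 60 + ["Acme Inc"] * 25 + ["http://acme.example"] * 15
print(looks_manipulated(profile, {"cheap widgets"}))  # True
```

A profile where branded anchors dominate would pass the same check, which matches the interviewees’ repeated point about diluting exact-match anchors with brand names.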
We all know that Panda is an on-page update and that Penguin backs it up on the off-page side. First Google raised the content problem and questioned its quality. The same thing happened for links too.
These are just balancing updates; they sustain each other with one single purpose: cleaning the SERPs. There were rumors that Google is trying to scare SEO clients toward PPC, and many other conspiratorial theories. They might be true, but there is no hard evidence on this matter.
At the beginning, Google Panda was pretty “correct” about the sites it evaluated. There is a lot of junk content on the web – many misleading sites that make searching annoying. Hitting poor content sites, thin affiliate content, and scraped content was a good thing to do.
But then raising brands in the SERPs just because they are brands doesn’t seem like a good idea. Not all brands have valuable content or are useful sites (bad user experience, lots of self-promotion). So that was not really the proper answer to cleaning the SERPs.
The next “cleaning measure” was to evaluate the sites that are sustaining junk sites in the SERPs. We now have a new concept of link building and what links should be, but the same old problem: evaluating and separating the good from the bad.
Devaluing web directories, poor content sites, and scraper sites seems normal, and it should have been done a long time ago.
But the next problem coming out of all these measures is this: does it actually make spamming less easy? Doesn’t it mean that buying links will now be more important? Will spamming social media for links be properly handled by Google? Will corrupting newspaper editors and journalists for links be harder?
So SEOs should think about equilibrium and proportions; things done within the limits of Google’s way are profitable. We heard that before, and we will hear it again. That’s the game…
That’s fairly easy actually. The Panda update was more about usability. As such it focused on elements such as thin content and real estate (ads above the fold are a bad idea for example). Conversely the Penguin update was as advertised; a web spam update. That means they’re looking at on-site spam as well as off-site (link building). Many folks seem to get myopic over trying to define Penguin as one or the other. I recently covered potential elements in a post on SEW (http://searchenginewatch.com/article/2174997/Life-After-Google-Penguin-Going-Beyond-the-Name)
I would say that Panda and Penguin are more similar than maybe many think:
• Both are “filters” on top of the main algorithm, not updates of the Google algorithm itself;
• Penguin, as Panda, seems to be a learning update. As Panda refined itself with every new version, I suspect that Penguin will do the same in recognizing the patterns of the “bad link profiles”;
• Both seem to be a serious answer to an old question: what does Google really do against search spam?
I’m not quite sure they are comparable, as one deals with thin content and the other deals with bad links. Panda was great when I worked directly with big brands myself because it got them to really start moving on the recommendations I’d made. Penguin hasn’t negatively affected any of our clients at iAcquire, so I guess I should say thanks Google?
It looks like the Google Spam Team finally got around to changing some variables on how they handle links with the Penguin update. With Panda we definitely saw the low quality and thin content cause issues so at least they were nice enough to separate the two updates…I think.
Question 2: What does a “Low Quality Link” mean to you?
A link that was not given by the writer to help their readers find related content or more information on what the author was writing about.
A link that doesn’t make sense in the context of its location. Why would a link be here? Does it make sense in this exact spot, page, and with this anchor text? If not, it’s probably a low quality link.
Ones that are created by the hundreds or thousands at the click of a button.
To ME a link defines a relationship between two URLs – which means there needs to be a reflective relationship between the two businesses associated with each URL. Some relationships are strong – others weak. I think Google has just introduced the concept of divorce into the equation. If there is no relationship of substance between the two pages, then the link is probably not of very good quality.
Again, this is difficult to define, hence why there has been so much collateral damage as part of this rollout. Low quality, for me, is a link from a site that was either created specifically for the purpose of manipulating PageRank and passing link juice, or that offers no extra value to the visitor or site.
A low quality link, in my opinion, is a manipulative link that does not add value to the Internet. I wouldn’t classify an anchor link as always manipulative either, if it adds value to the article or website where it is placed. Hence, I don’t think sites like DMOZ or BOTW are necessarily bad. But all of the directory networks with a DA >20 that are just meant for SEO (and yes, they have worked, because Google let them work) – I would classify these as low quality.
Just about any type of link used “for SEO” strikes me as being low quality. People have put so much weight on links that they have pretty much abandoned any effort to create real value. Hence, every time I see an SEO blog article listing places to get links I write those links off as doomed to become toxic. As soon as enough people abuse those resources the search engines will devalue them. And if people insist on ignoring the search engines’ devaluations they can expect more link-based penalties for their sites.
To me, a “Low Quality Link” is a link that will get you penalized in Google if those kinds of links make up the majority of your profile. I know this definition does not really mean anything, but that is because “Low Quality” is a value judgement, and these things differ from beholder to beholder. If you are competing in a competitive niche in which Google has for years rewarded sites using a certain kind of link, deciding that those links were “low quality” and forgoing them would have been negligent. Everybody has 20-20 hindsight now that Google has dropped the hammer on certain techniques and can say “oh, well these are obviously low quality,” but hundreds, if not thousands, of sites have made a lot of money over the years by building these links, and the smart ones were prepared for this kind of change in the algorithmic definition of what is considered quality.
Obviously optimized sites become low quality sites. Is it obvious that the content is optimized for the no. 1 spot in the SERPs? OK, then it might be tagged as low quality content, and links coming from it can be marked as negative. Until now those kinds of sites were harmless, but now they are poison. How much poison can you swallow before you end up dead?
Next, is your site getting links with keyword anchor texts? If yes, then those links might be tagged as low quality. The new keywords are branded keywords. You are telling Google that you are building a brand and not (obvious) links. Dilute any low quality links with trusted branded links.
Keep in mind Panda and review sites that are giving you links. Are they useful to users? Is there any good user experience? Any proof of social activity?
Also, how many outgoing links does the site have? Is there any evidence of link selling? Having links on sites that sell lots of links may harm your rankings.
There likely needs to be a bit of a distinction between a low quality link and a spammy one. A perfectly legit link can pass very little equity (PageRank, in Google terms); that, to me, is a low quality link. Again, Penguin is more about spam, right? So we want to think in terms of:
- links from spammy locales (splogs etc.)
Most people know what they are. If it’s feeling ‘too easy’ there’s a good chance it could be classified by the search engines as being spammy.
IMO, “low quality links” are all those that rely exclusively on exact match commercial anchor text, that are placed in clearly non-editorial areas (footers, sidebars, link pages, link farms, SEO directories), or that – because of their over-optimized nature – seem to contradict their apparently editorial context (i.e. forum and comment marketing, blogrolls…)
From a programmer’s standpoint, a low quality link is one that is not contextually relevant to the site being linked to. If I were Google, I would devalue these completely across the board. I would measure the semantic value of pages and extract entities using Natural Language Processing, and if the combination of those results didn’t hit a certain threshold, I would not count the link at all.
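A toy version of the thresholding idea described above might compare bag-of-words vectors for the linking page and the target page and discard the link when topical overlap is too low. This is a sketch under stated assumptions: the 0.2 cutoff is arbitrary, and a real system would use entity extraction and far richer NLP, not raw word counts.

```python
import math
import re
from collections import Counter

def term_vector(text):
    """Crude bag-of-words vector; a real system would extract entities instead."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine_similarity(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def link_counts(linking_page, target_page, threshold=0.2):
    """Devalue the link entirely when topical overlap falls below the threshold.

    The 0.2 cutoff is an invented stand-in for whatever Google might use.
    """
    return cosine_similarity(term_vector(linking_page),
                             term_vector(target_page)) >= threshold

# A gardening page linking to a gardening guide: topically related, link counts
print(link_counts("growing tomatoes in garden soil",
                  "garden soil and tomatoes guide"))   # True
# The same page linking to an off-topic money site: link discarded
print(link_counts("growing tomatoes in garden soil",
                  "payday loans fast cash approval"))  # False
```

The all-or-nothing return value mirrors the speaker’s “I would not count the link at all” stance, as opposed to merely dampening low-relevance links.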
From an SEO standpoint it’s more or less the same except I base it on standard SEO metrics in addition to the quality of the site.
Low quality to me makes me think of SEO forums and Fiverr gigs where you’re getting link wheels, blog comment spam, profile spam and the like. Just because you’ve built a site on “buy whatever” doesn’t mean you should be able to inflate your own rankings to the top. I may get thrown under a bus for saying this, but if the majority of your link building methods don’t earn you any editorial/naturally given links, then your business isn’t going to survive in the organic search results.
Question 3: Do you think Anchor Text was devalued, and if so … how do you think it is treated now?
I am not sure.
I’m not sure it was devalued – depends on how you think about it. I think ABUSE was devalued, I think it probably is still super effective to a certain threshold, just turns out that threshold is now lower. Of course, this is my guess, and not based on fact.
Maybe a little bit, but all Penguin did was hurt those who were doing things they already knew were wrong. There’s nothing wrong with exact anchors, and there never will be, but moderation is the most essential part of pursuing them. 99% of the time when I do outreach, I never ask for anchor text; I take whatever is given to me. And that’s why it’s much easier to look natural by letting others do it for you.
Devalued in some instances, yes – but only where the link pattern started to look unnaturally manipulative through anchor text. I think in some ways, anchor text is now more important than ever – except that now, there may be a negative impact if you overdo it. Further – I think that the penalty for overdoing it is much more extreme than previously.
Absolutely. We know Google turned off a key link signal a few weeks ago and Google is NOT about to do that without iterating and improving their core algo. In testing we have carried out to date our feeling is that Relevance of Page has been ‘turned up’ as a rank signal and that makes a lot of sense as part of their wider plans to promote useful, respected web entities and sites within their own respected niches.
I don’t think anchor text was devalued. I think over-done anchor text has been devalued, but anchor text used in balance has not been devalued. I say this because I have seen many examples of specific keywords tanking on sites that had way too many exact anchors for these terms, but other terms on the site are ranking just fine.
Anchor text is still necessary for ranking, but it must be used in conjunction with branded and other random anchors in order to rank well.
No. Absolutely not. I have been asking Google to ignore anchor text for years but I don’t think they’ll ever do that. Anchor text plays just as important a role in Google’s algorithms today as it ever has (although the SEO community has way overvalued that role). What Google has done is simply divide the Web into “links we’ll trust” and “links we won’t trust”.
No, I don’t think anchors were devalued; I think certain “obstacles” were placed so that anchor manipulation would not be as profitable as it was before. As far as I can tell, the anchor is still the major (if not the exclusive) way of passing relevancy through a hyperlink; however, Google is now looking with more care into what your anchor profile looks like. I like to compare it to the accelerator in a car – pushing on the accelerator pedal is the best way of making the car go faster. It also happens to be the best way of getting into a car crash or getting a speeding ticket, but it doesn’t mean that in order to accelerate there is anything to do other than push on that pedal. The fact that there are more cops and that the speed limit was lowered (unless your car is branded with big brand stickers) doesn’t mean that the accelerator does something different than what it did before.
I do not think that anchor text is now devalued; I think Google is much more interested in proportions. Remember the equilibrium I mentioned before? It applies to anchor texts too. Until now it was simple: get backlinks with exact match keyword anchor texts and you would get high rankings. Now you have to dilute those exact keyword anchor texts with brand names.
So now it’s easier to build a brand than to struggle for exact keyword anchor texts.
Again, if we’re talking about link spam, we think in terms of thresholds. Think in terms of ratios. Is it natural to have a high percentage of exact match anchors? Not at all. Look at sites that have done no active link building for SEO and you’ll see there is a greater spread. This doesn’t seem to be the case as much so far for exact match domains, but time will tell on that one.
I don’t believe that how anchors are treated has changed greatly with the recent updates, as much as the thresholds for what is considered legitimate and what may be flagged as spam.
I believe that right now Google does not need anchor text the way it did before in order to understand the “nature” of the linked page. With the technological advances Google has made during the last few years (Caffeine and Freshness, for instance) and with Googlebot crawling the web in real time, Google really no longer needs anchor text as a way of classifying a page. Just looking at the “Content Keywords” page in Google Webmaster Tools, we can see how well Google understands what our site is about.
I don’t believe that anchor text is devalued. I don’t think Google is actually being that sophisticated. I believe they are devaluing a large amount of sites that were passing link equity and therefore a lot of sites are having their legs cut out from under them. I also think they are penalizing sites whose backlink profiles are top heavy with unbranded anchors.
I don’t think it was devalued but more so tweaked to be viewed differently. Go to any search result for any tough keyword and you’ll clearly see that websites with an anchor text distribution geared towards site/brand name “natural” anchors are holding it down quite well. With that being said, you can still easily spam yourself into the top 10 quickly with the same junk links targeting your money keyword. I just think the difference now is that when Penguin re-runs, those sites will get knocked out and the next spammer will take its place.
Question 4: Have you witnessed any False Positives, regarding Google “Penguin”?
Sure, but I don’t have permission to share. Google never said there were no false positives – that is why they made a feedback form at http://goo.gl/nt3Pz
There are definitely sites that have wacky looking link profiles but shouldn’t be hit. A good example is wpmu.org, which recently got dinged (more details here) and is a prominent use case. They are amongst a crowd of web designers/WordPress devs that create themes and SHOULD have their links in the footer based on the constructs of the vertical, but based on the conditions of this update, got creamed. It was really hard for them to know that they should be no-following their links, especially without being prominent SEO experts. Here is a quick link analysis by the cognitiveSEO tool showing 40% footer links …
Not really, because they’re hard to identify (unless you have a client that got hit that was a false positive, which I didn’t), but that doesn’t mean they’re not out there.
I have a real problem with that question. The answer is absolutely – but at the same time, the site concerned probably deserved a bit of a slap on the wrist. However, I don’t think the owners of the site know where many of their bad links came from, and most worryingly, they are banned on their trademarked brand, even though they have over 40 stores and assets around $100 million based on that word. As a result they are “enjoying” 70% CTR at 2 pence per click via AdWords on their trademark, whilst Google arguably passes off other sites as theirs on brand. Now this creates a new legal question. It’s Google’s index – so they have every right to slap any site they want… but they probably don’t have the right to then promote other pages on the web using that trademark. The reinclusion request is as yet without comment or answer.
Not to date. Our clients have so far not had any real issues (and that’s an honest answer) so our experience is via website owners contacting them to help them recover. So far that process has been interesting but as Penguin has yet to rerun (to our knowledge) we have not seen any drastic recoveries.
We have been involved in a recovery from an issue that triggered the ‘Notice of Unnatural Links’ email and that site is now where we would expect it to be given that we had to remove a lot of spammy links. The job now is a new content marketing strategy to earn that authority once again.
I did see one false positive recently in regards to anchor text. It’s an EMD (exact match domain) that actually deserves to rank for the terms, insofar as the content is of pretty good quality. They have not gone crazy on anchor text or done too much directory link building (though it is a B2B space), but they were still hit for certain terms where they had some anchor text in order to be competitive. It seems that Google went too far in this case.
None so far. But then I can only look at so many Websites. Of course, there are a couple of sites where I wonder if they should have been affected by Penguin but their owners disclose so little information that I have no way of knowing when they lost traffic or if it was connected to Penguin. People who lost traffic a week after April 25 are blaming Penguin. Quite possibly what they are seeing is a Cascade Effect where the Websites that link to them suddenly lose value and so the real problem lies 1 or 2 tiers back. These are not false positives, although they are collateral damage.
It is hard to say whether any Penguin positives are false, especially since no one has had the Penguin penalty lifted from an affected site yet. The only way to say with any degree of reasonable certainty that your site was affected by Penguin is if you know that the site was involved in link activities suspected of causing the penalty AND the hammer dropped on the 24th of April. Similarly, the only way to know that your Penguin penalty was lifted as a result of your actions would be to see your traffic restored on the date Google announces that the next round of Penguin was implemented.
I assume that some false positives will surface once these things happen, especially since some Panda (and other) updates happened close to Penguin, but we need future roll-outs to happen to know for sure.
Many people who buy links do not manage them properly. That is true at least in Romania, where I mostly work. Not knowing which links you are about to lose is common practice over here. So each time partnerships expire, clients panic and think they got penalized by Google. Knowing that Google has a new update targeting backlinks drove them into paranoia.
So the first thing they do after the panic is buy more links to get their rankings back up, instead of renewing the old partnerships that were good. Some of them ask SEOs what to do and get proper answers.
Some of them think they got penalized because they did not invest money in PPC campaigns. Of course, someone must tell them that this is nothing more than paranoid thinking…
To be honest, not really. None of our actual clients (managed campaigns) were affected. We do a fair amount of consulting to sites that were, but in most cases the link profiles were a house of cards and they were playing the ‘it works for the other guy’ (http://searchnewscentral.com/20120509293/General-SEO/seo-excuse-143-the-other-guy-is-doing-it-so-it-must-work.html) game. Meaning they had nasty anchor text ratios and a large portion of the link profiles were made up of some of the suspect types of links I mentioned earlier.
Honestly, no. What I’ve seen is that sites which objectively have bad link profiles were not penalized, or were not heavily penalized, just because the ratio between commercial anchor texts and branded anchor texts was balanced or not too strong on the commercial side.
No, I haven’t seen anything like this for our clients. I read a story in the Google Webmaster Forum, but I can’t speak to specifics since the person wouldn’t share them.
You can still go search for “buy viagra” and “buy cialis” in Google, and the top spots are hacked .edu’s and spammed-to-death social profiles. You’d think they’d have a better handle on that, but it clearly shows that link spam still works easily for a short period of time… enough time to make some bank, so I doubt we’ll see a stoppage of this kind of activity.
Question 5: How do you see Link Building 1 year from now?
Funny, because now there is un-link building going on. I still see link building to be a huge SEO component, but link builders will be much more careful.
Lots of SERPs will be pretty clean, lots of content development – with many people making a lot of crappy content because they have no idea how to execute after having done grey/black hat SEO their entire lives. Lots of businesses will take a long time to catch up, because the easy solutions aren’t easy anymore. But really, I see it being the same – you’ll just get a bigger whiff of people attempting and failing in the background.
More so for traffic, trust, and sales than SEO. The term link building will never go away, but the end goal will start to change. Ranking #1 for a keyword 3 years ago is a lot different than what it is today for some queries, because Google is starting to crowd SERPs with more ads, different media types, and their own properties. All this means is that the rewards of ranking towards the top are diminishing over time. I talk about this more here.
I see it much more aligned with more traditional marketing activity. I still see it as extremely important, though, but there needs to be more thought about the context and meaning behind the links.
We have been guest posting for around 10 months now, and I see this naturalised form of online PR replacing manipulative practices. The key now is to become a knowledge centre and work hard on promoting that fact to the key influencers in your niche.
Link building has been, and is now even more, moving towards PR. It’s content built for, and outreached to, targeted personas for links. I think tactics like broken link building etc. will still work, but those aren’t strategies – they’re tactics for having a better chance of getting a response.
The future of linkbuilding is earning the links, not just finding places to easily place your links. We’ve been beating this drum for a while, but I think it is finally coming to fruition.
Just as spammy as ever. The SEO community just does not want to let go of really bad ideas. In my experience it takes people about 2 years to stop sharing bad SEO practices. Google came out in June 2009 and said that PageRank Sculpting never did what SEOs promised it would do, and that they had changed the way “rel=’nofollow’” worked more than a year previously to diminish the harm that this bad SEO practice was inflicting on Websites. Nonetheless, even in 2011 I was still seeing people talk about the benefits of PageRank Sculpting.
And there have been other examples going farther back in time where bad SEO practices were clearly given the boot by search engines but people continued to use them for years afterward. Even today people are still talking about Keyword Density, which plays absolutely no role in Google’s ranking algorithms.
If things stay on the same course, I think there will be a much bigger focus on where the link is coming from vs. what the anchor of the link is. I can already tell you that I feel much better when the link I get has the brand name, the URL, or some long-tail variation of the head term as anchor, rather than the exact-match head term itself. This will probably be hard on small businesses with a low industry online presence, which find it much harder to create linkable content. <sarcasm>Surprisingly, this may drive these businesses towards other forms of advertising, for example paid advertising through AdWords.</sarcasm> However, I trust the community to find new and creative ways to build “natural” links and continue that evolutionary arms race between the marketers and the search engines.
Dropped dead. (joking)
The value of links will evolve from basic signals to social signals by adding social proof and behavior. In fact it’s the same kind of link building; what differs is the way links are valued. It will always count if you get links from content, but now you need to confirm that “popularity” with social signals.
Building brands is the new link building, that’s for sure. How you build it is the key.
That’s really hard to say. Again, at my firm we tend to do a great deal more organic link building in combination with some foundational types of approaches. I see our role as being builders of authority and trust, not just links. As we’ve seen, Google can turn on a dime, and simply getting links for the sake of a link doesn’t hold water for me as a long-term strategy and use of client resources.
And that’s the crux; we have no idea what’s to come. We watched Panda updates for some 18 months and tried to keep adapting. The same will likely be true for the Penguin roll outs. At the end of the day, not to sound TOO Googly, I believe strong content balanced with traditional and social promotion is something worth doing. Again, we want to do more than just get links. We want to build the client a following and build authority and trust.
I see it as a renewed activity. Surely harder, but without a doubt more interesting and pleasant to do. Link building will be more connected than ever to Content Marketing and Brand Marketing, evolving from a somewhat “isolated” activity to a more integrated tactic.
I would personally like to see a bigger move toward content strategy, but SEO needs to grow up more before we can all make that happen across the board.
I don’t think much will have changed from what’s going on now other than more people will be building junky links with “natural” anchor text variations. That and more people will be trying out “negative SEO” on their competitors because I don’t see why it wouldn’t work. If you can go buy 500 junk blog post links and get yourself in trouble, why couldn’t a competitor do the same thing?
Even if the low quality linking schemes still work for a while, I think it will force a lot of companies and brands to do more creative things to attract links naturally. This is how it should be done, but it’s more expensive, results come slower, and your bosses might be breathing down your neck for results.
Doing fun and crazy things like contests, infographics, link bait, getting interviewed, real news citations and the like will not only bring on those lovely links, but the social followers that make a well balanced “brand meal”. Combine that all with an amazing on site user experience and you’ll not have to worry about major algo tweaks so much in the future.
Question 6: Do you think Google can handle the “Negative SEO” issue correctly?
I think they believe they are. It is hard to prove otherwise without having their insight and data on who was hit for what. I believe they cannot handle the perception of the negative SEO issue in the industry.
I’m not sure. It’s a delicate subject that will take some time to work out, but if something big and public really does get crushed by it, there will be adjustments. Otherwise I see it as something that keeps coming and going from public opinion in the near future, until something monumental happens.
Not without admitting they’re wrong, and I don’t see that happening anytime soon. There aren’t many ways out, but the ones that would correct this issue (not flawlessly; there are still definitely some kinks to sort out) have already been suggested and ignored or shot down by someone from the big G.
Only by giving webmasters the chance to disavow links they do not want within Webmaster Tools. I am surprised they rolled out the “stick” without the corresponding “carrot”.
I’m not sure how easy that will be for them but they will have to look at it. As they’ve clearly made the move from devaluation to penalising ‘bad’ links it does, in theory, make negative SEO a potential issue…
I’m still not fully convinced that “negative SEO” works on real brands. I’ve seen it work on spam-blogs that have a very thin backlink profile, but legit companies? I’ve yet to see a real example, though I am completely open to it if anyone reading has some examples they would like to share.
That said, I think Google needs to be aware that this could become an issue and provide a way for webmasters to either a) cut links that they did not build (which webmasters may use to cut links that they DID build, but doesn’t this also help Google by getting rid of crap links that may still be propping up rankings?), or b) provide a way for webmasters to let Google know about these links, and show the webmasters that Google is actually actively doing something about it.
There really is no “Negative SEO” issue — not yet. If enough people talk it up I suppose it could become a PR problem for Google but the plain and simple truth is that the only real “poster boy” for recent Negative SEO (Dan Thies) was a false positive. The real reasons why his site temporarily lost some search visibility have now been explained on many Websites.
We have always had negative SEO practices. Back when Direct Hit was popular, people would point click-manipulating software at their competitors’ listings. They sometimes created fake doorway pages pointing to competitive websites, hoping to get those sites penalized in Altavista and Google. A couple of years ago someone directed tens of thousands of bad links at one of my own websites. A few years ago one of my clients called me to ask if I was responsible for 1 million links that had suddenly appeared (and Google had demoted their website in the rankings).
There is no more negative SEO capability or incentive today than there has been in the past. But right now the wrong people are talking about negative SEO like it’s something new or just became more powerful.
If spammers want to “get back” at Google by pointing toxic links at good Websites, all Google has to do is confirm with those sites that the links are hostile and not manipulative — and then the “victims” of negative SEO become honeypots for Google to track the link networks.
So anyone who wants to use “negative SEO” really needs to understand they are pointing all their cannons at their own decks and blasting holes in their own ships. That’s not a very smart strategy.
Google has a very easy way of handling negative SEO: allow webmasters to manually disqualify links in Google Webmaster Tools. This is obviously not the perfect solution; the perfect solution would be for Google to build an algorithm that distinguishes between editorial links and links that were built to manipulate rankings or harm the website, but we know they don’t have a great record in that field. It would be even easier if, instead of defining which links to discount, webmasters were able to mark the links that should count, and thus not be swamped by every bozo who decides to carpet-bomb their link profile with Chinese forum signatures, but I don’t see this happening any time soon. In addition to being a great tool for webmasters, it would also be a great tool for guys like me, who would use it to experiment with the value that links pass and to identify which links work and which don’t – and I don’t think Google would want to provide us with that level of information.
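For context, the disavow mechanism suggested above would presumably take the form of a simple file that webmasters submit, listing individual URLs or whole domains whose links should be discounted. The sketch below is hypothetical – at the time of these answers Google had not announced any such feature, and all the URLs and domains shown are made-up examples:

```text
# Hypothetical disavow file: one entry per line.
# Lines starting with "#" are comments.

# Discount a single spammy page that links to the site:
http://spammy-forum.example/thread?id=12345

# Discount every link from an entire domain:
domain:junk-blog-network.example
domain:forum-signature-farm.example
```

A domain-level entry would matter in practice: carpet-bombed link profiles tend to come from a handful of networks with thousands of pages each, so listing every URL individually would be unworkable.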
It should be easy to handle, now that Penguin can detect irrelevant and low-value links. I’m sure Google has seen them before. I’m not sure about the “correctly” part, because this is not how Google works.
“Negative SEO” is not new; we’ve seen tons of junk links before and Google did almost nothing about them. Just because of the Penguin update we expect Google to treat them correctly, but I’m not sure this is Google’s focus. “Negative SEO” is not common enough to make Google sensitive about it. OK, it will handle some of the examples that are exposed by the community, but many sites can suffer penalties without ever getting analyzed.
I think many “Negative SEO” penalties will be solved with manual reviews after the community reports them.
That one is going to be a much larger problem. I hate to keep dropping links, but I’ve recently written at length about this as well on SEW (http://searchenginewatch.com/article/2169138/Negative-SEO-Looking-for-Answers-from-Google).
I would love to see Google implement something like this.
And from what I know, they’re open to the idea. Ultimately we can’t wait upon Google to do this unfortunately. Part of the new world of SEO means watching your link profiles like a hawk. Be aware of anything suspect that comes along and make notes of it. Be aware of not only potential negative actions, but also what the client or contractors are doing.
Honestly, this is a question I cannot really answer. My hope is that Google can handle it, but the best approach is always to “help” Google by creating a strong, relevant, and trusted link profile, monitoring the link acquisition of our sites, and registering every anomaly. For instance, detecting suspicious spikes in link acquisition is surely a good practice we will all need to start.
Not until Skynet becomes self-aware. Search is algorithmic and algorithms can always be exploited once you figure out what makes it tick.
While there’s not a whole lot of solid proof it’s happening on a large scale, I think they’ll just set up filters for certain types of links and ignore them… that’s what should be happening. My belief is that all those unnatural link warnings that went out were just a scare tactic to calm everyone down and throw the link spammers into a tizzy. While everyone is complaining that the SERPs are worse now, I don’t think Google is anywhere near done making changes, and the next year should be an interesting one, to say the least.
Question 7: Why do you think Google named it “Penguin” ?
It is a friendly looking character, better than Panther.
Hard to make up a reason here without being mean to Google. So, insert mean Google joke here.
Because they’re tobogganing down a slippery slope when they make things like negative SEO and fake link removal notices possible.
Because they are starting to flap and are no longer able to fly?
Black and White animals…someone thought that might be funny?! I’d rather it be called the Webspam Update to help with clarity at this confusing time.
I think someone on the Search Quality team stumbled across this Penguin infographic and thought “Man, this is confusing as crap. Let’s name this algorithm something similar because we’re about to throw a ton of stuff at people and they’ll be as confused as I am right now!”
I suspect there may be a few Batman fans at Google — or perhaps they like the movie “Happy Feet”. They have so many project and update code names I don’t worry about why they pick one over the other.
Because it walks like a duck and quacks like a duck, but isn’t really a duck, but something else completely? I don’t know, and to be honest, this whole trend of humorously naming the updates after cute, cuddly animals is somewhat distasteful in my opinion. People are getting fired, losing their income, being forced to close businesses, and while we can argue whether that is due to Google’s ruthlessness or a lack of foresight on the side of business owners, I find it a tad in bad taste to be treating the whole issue with humor and lightness.
Google intends to create a series of updates named after black-and-white animals. We have Panda and Penguin; up next will be updates named Skunk, Zebra, Cow, Rabbit, Orca, Lemur, and Tiger. I call it the “Google Zoo” and I expect to see more of these updates.
I wonder what the Skunk update will look like.
Oh my… I have no bloody idea there. They tend to have reasons, but I don’t recall seeing that anywhere. It’s certainly better than the original ‘over optimization’ update or even the ‘web spam’ update (though that one is the more fitting one). We have been joking in my community (SEO Dojo) that it seems black and white animals are the thing, (interesting given the white-hat / black-hat nature of the biz). And really, while a killer whale or a skunk might seem more appropriate to many, the cute and cuddly approach also seems a theme.
Recently I read a post saying that Penguin was chosen because of the sense of smell these (once) cute animals have. That could be a good explanation for its name.
The same reason they are the overlords of the internet — because they can. I suggest they call the next one skunk.
I don’t know if I can find a funny reason, but penguins are social birds that live in large groups, so what if it’s a sign that Google+ and other social signals have come more into play? Links are too easily gamed these days, and it makes perfect sense that social signals like shares from the collective “leaders” in your niche should do wonders for your content. If Matt Cutts, Eric Ward, Jason Acidre and Ross Hudgens all shared my latest post, what kind of signal is Google picking up from that? Most likely a good one.
Well, all these answers will take some time to read and digest… but the time spent reading them is really worth it, as they represent the points of view of some of the best people in the SEO industry.