
How To Recover From A Google Ranking Penalty


This guide is intended as a first point of reference for site owners who have suffered a Google ranking penalty or suspect they may be victims of algorithm updates. By working through the three steps below, you will be taking the first tentative steps in restoring your site’s lost value and getting your site back to its previous ranking glory.

Step 1 – Diagnose

2012 was a tough year for website owners and SEOs, with new algorithm updates and ranking penalties dished out by Google like hot meals. Many websites that suffered a ranking penalty invested a great deal of time and money into fixing the issues to regain organic traffic, and many more simply haven’t been able to get near the rankings they once enjoyed.

Diagnose Cause of Website Ranking Drop

At this stage, consider the evidence that has led you to believe you have suffered a ranking penalty. Hopefully, you’ll be using Google Analytics for your site traffic analysis (if not, you should fix this first!).

Review your traffic over a long period of time, ideally comparing it to the same period in the previous year. This will allow you to see if your traffic drop is particularly unusual. Perhaps this time last year was also a quiet period for your business? Be sure to check traffic from organic sources only; is there a clear decline in natural search traffic specifically?

If you’re confident you have suffered a Google penalty, you need to work out which Google algorithm update may have been responsible. Unless you know why you have been punished, it is very difficult to fix the problem.

Firstly, check in Google Webmaster Tools. Again, if you’re not using this you really should be: it is the only place Google will ever give you advice on improving your site or notify you if there is a problem. It can also be used to check the technical health of your site and identify issues such as broken links and duplicate meta data.

In Webmaster Tools, look for alerts or warnings on the dashboard. If you have received an unnatural link warning then you know exactly what the problem is and stage one of the process ends here. The message would look something like this:

Unnatural Link Warning in Google Webmaster Tools

However, it isn’t always as cut and dried as this. Firstly, not all sites that suffer a ranking penalty get a notification in Webmaster Tools. In fact, most don’t, so you may have to dig a little deeper.

There is a great free tool at your disposal to help with this: Panguin by Barracuda. After logging into your Google Analytics account through the tool, you can see your site’s organic traffic with the dates of Google updates overlaid.

Barracuda Panguin Tool Sample Image

If you have been hit by a ranking penalty, there’s a good chance it will be perfectly obvious which update caused it, as the sudden drop in organic traffic will likely coincide with either a Penguin or Panda update. The tool also shows other known updates, including the Exact-Match Domain and Venice updates. Hovering over any update line will give you more information on the release.

When you know what caused the problem, you can begin trying to rectify it.

Step 2 – Resolve

When Penguins Attack – If Bad Links Are The Problem

If you believe that your issues have been caused by Penguin, whether or not you have received an unnatural link warning, you must carefully evaluate your link profile. Harmful backlinks can include:

  • Site-wide links (usually in footers, sidebars or blogrolls)
  • Low-value directory links
  • Paid links
  • Links on irrelevant or low-quality sites

Use a number of link analysis tools to evaluate your link profile. No single tool will show you every backlink you have, so use as many as possible to build a picture that is as complete as possible. I recommend the following tools. Some are free or have free options and some are paid for, but without a comprehensive analysis you cannot hope to carry out an effective link review:

  • Webmaster Tools
  • Link Detox
  • Majestic SEO
  • Open Site Explorer

Export the data from each of these tools into spreadsheets. The format of the data will be different in each, so create a uniform column and data structure before combining all the links into a single spreadsheet. You can then remove duplicates and will have a good indication of your complete backlink profile.
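To sketch how that merge might look in practice, here is a minimal Python example. It assumes CSV exports and a handful of hypothetical column names (each tool labels its URL column differently, so adjust `URL_COLUMNS` to match your actual exports):

```python
import csv

# Hypothetical column names: each tool labels its URL column
# differently, so map the variants to one canonical field.
URL_COLUMNS = ["URL", "Source URL", "Backlink URL", "Link URL"]

def load_backlinks(path):
    """Read one tool's CSV export and return the set of linking URLs."""
    links = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for col in URL_COLUMNS:
                if row.get(col):
                    # Normalise so scheme, case and trailing-slash
                    # variants don't count as separate links.
                    url = row[col].strip().lower().rstrip("/")
                    url = url.replace("https://", "http://")
                    links.add(url)
                    break
    return links

def combine(paths):
    """Union the exports from every tool into one de-duplicated list."""
    combined = set()
    for path in paths:
        combined |= load_backlinks(path)
    return sorted(combined)
```

The output of `combine` is the single de-duplicated backlink list you can paste back into a master spreadsheet for review.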

Now try to identify toxic links by looking out for the following signs:

  • A large number of links with exact-match keyword anchor text
  • A large number of backlinks from the same referring domain
  • A large number of referring domains hosted on the same IP address
  • A large number of backlinks from the same referring IP
  • Links from low-quality or spammy websites, blogs or directories

Any links which appear to be unnatural need to be removed. Even if you cannot get them removed, you must prove to Google that you have tried to do so before you can get them disavowed (more on that later). Highlight any links which appear unnatural and move them to a separate spreadsheet for action. Be warned though: depending on the strength of your site and number of backlinks, you may have dozens, hundreds or even thousands of toxic links to review!

For each unnatural link, you should try to find contact details for the site owner and send a polite message requesting that they remove the link as you believe it may be harming your search engine rankings.

In your unnatural links spreadsheet, add some extra columns to record why the link is unnatural, the site owner’s email address or other contact method, the action taken and the date of action.

For each domain referring an unnatural link, you need to look for an email address, website contact form or phone number. If you cannot find any of the above on the site itself, you could try using a WhoIs tool to see if the domain owner has an email address registered or even look inside the website source code which occasionally contains an email address in comment tags or hidden content. Run a find for the “@” symbol within the source code to see if you can find one.

Once you have carried out these checks for each unnatural link and reported your findings in the spreadsheet, begin contacting the sites for which you have found contact details.

The success rate of the link removal process depends entirely on the quality of the contact information and the way you approach the site owners, but generally you will only get around 5-10% of the links removed. Unfortunately though, you must go through this process to prove to Google that you have made every effort to get the links removed. For those links which could not be removed, or for any you could not find contact details for, it is time to consider the Google Disavow Tool.

Google Disavow Tool

The Disavow Tool is used to tell Google which of your links you do not want them to consider when ranking your site, having been unable to get them removed manually. It involves a simple process of uploading a plain text file of the domains and URLs you want disavowed and waiting for Google to review and approve it.

Google Disavow Tool Upload
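The plain text file follows a simple format: one `domain:` line per whole domain you want disavowed, one bare URL per individual link, and `#` lines for comments. A small Python sketch that builds the file body from your spreadsheet data (the domain names and note below are examples only):

```python
def build_disavow_file(domains, urls,
                       note="Removal requested by email, no response"):
    """Build the plain-text body the Disavow Tool expects:
    '#' comment lines, 'domain:' lines for whole domains,
    and bare URLs for individual links."""
    lines = ["# Disavow file generated after manual link-removal attempts",
             f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"
```

Write the returned string to a `.txt` file and upload it through the Disavow Tool in Webmaster Tools.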

It is worth noting, however, that the tool is still in its infancy and many SEOs advise exercising caution before using it. Be certain that the Penguin update has caused a penalty, as some believe that by uploading to the Disavow Tool you may highlight your unnatural link profile to Google. However, if you are sure that your unnatural backlinks have caused a serious ranking issue, I strongly recommend using this tool, which has seen hundreds of sites resolve their ranking issues.

In order to restore search engine rankings after receiving an unnatural link warning and getting these links removed, you must complete a reconsideration request. This tells Google that you have fixed the issue and believe your backlink profile is now made up of only natural links. It can sometimes take several reconsideration requests before Google agrees to re-index your site, if they believe they can still see unnatural backlinks, so it is especially important to be brutal with your link removal activities if you received this warning.

When Pandas Go Bad – If Poor or Duplicate Content Is The Problem

If you believe Panda was the cause of your ranking penalty, you must assess the content across your site and other sites, looking in particular for the following things:

  • Duplicate content
  • Pages thin on content
  • Keyword stuffing
  • Keyword bolding
  • Poor quality content

Start by reviewing the content on your site, looking for pages which may be perceived as duplicate. For example, if you have a page called “Ice Cream Flavours” and one called “Flavours of Ice Cream” (each initially created to target the different keyword terms) these are likely to be considered duplicate, especially if the content is similar. Remove one of these pages, set up a redirect pointing it towards the other and make the one remaining page strong with quality content.
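Using the ice cream example above, the redirect might look like this on an Apache server (an assumption; the equivalent on nginx or IIS differs, and the page paths are illustrative):

```apache
# .htaccess: permanently redirect the retired duplicate page
# to the single consolidated page you kept.
Redirect 301 /flavours-of-ice-cream /ice-cream-flavours
```

A 301 (permanent) redirect, rather than a 302, tells Google to pass the old page’s value to the remaining one.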

Next, ensure each page has a good amount of quality content. It is recommended that you have at least 250-300 words per page. Also check the structure of your content. Techniques such as keyword bolding and keyword stuffing (excessive use of your keywords) are no longer viable optimisation techniques. Review your content site-wide and change accordingly.
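To find thin pages at scale, you can count the visible words in each page’s HTML. A rough sketch using only the Python standard library (run it over saved copies of your pages and flag anything under your chosen minimum, e.g. 250 words):

```python
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collect visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def word_count(html_source):
    """Rough visible word count for one page's HTML."""
    parser = _TextExtractor()
    parser.feed(html_source)
    return len(" ".join(parser.chunks).split())
```

This is only an approximation (it counts navigation and footer text too), but it is enough to surface the pages that need attention first.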

Lastly, it is important that no other sites on the web have the same content as yours, as this can be considered plagiarism and is often punished by Google. Copyscape is a cheap tool which allows you to search the web for content that duplicates your own. There is a free version, but it does not crawl as much of the web as the paid version and does miss some sources of duplicate content. For just a few cents per search, it is worth using the paid version to ensure there is no content anywhere else on the web that could be considered duplicate.

If your search yields results, look into them carefully to understand the source. If it is someone else’s site, you need to make your content unique from theirs, and usually the only way to do this is to change your own. This may not seem fair, as yours might have been there first, but it is the only way to be sure Google knows you aren’t guilty of plagiarism.

If the content comes from sites you control (such as sister sites or alternative domains), make sure you change any duplicate instances of copy to make it unique.

When you have carried out all of these checks and are satisfied that your content is unique, purposeful and well-constructed for the user and not the search engine, keep an eye on your rankings and organic traffic. It can take a few weeks or more but if you have resolved the issue you should see things start to improve. If not, unfortunately it is back to the drawing board; only this time the site is already more refined and it should be easier to identify the problematic content.

When Exact Match Domains Are Punished… And You’re Sure About It

If you are certain that the EMD update caused the ranking penalty, not the Panda 30 update which happened around the same time, you need to concentrate on improving the strength of your brand identity.

Create strong, high-quality backlinks with branded anchor text and actively engage with others on social networks. You should secure your brand name on as many social profile and networking sites as possible, as well as niche directories and forums relevant to your industry. Look for high-quality link building opportunities through guest blogging. Lastly, consider investing in the production of high-quality content or useful infographics which can be used to promote your brand across the web.

The last resort solution to recovering from an EMD update-related penalty is of course to rebrand with a new domain that could not be considered exact-match keywords. You would then need to set up redirects from your original site and begin promoting your new brand and website.

For example, if your website www.sunglasses.com was hit by the update and no amount of brand promotion restored your rankings, you may wish to rebrand and purchase a new domain such as www.sunshieldsunglasses.com or www.smithsunglasses.com. Rebranding is not ideal and can be expensive, but it may be your last chance at restoring your website’s standing with Google.

Remember: if you think you have been hit by the EMD update, it is worth double checking that it wasn’t actually the Panda update implemented around the same time. Many companies have taken drastic measures to fix their ranking issues, like buying a new domain, and have found that it was actually content that caused the problems. Refer back to step 1 and check rigorously before blaming EMD.

Step 3 – Review

Once you have taken all the necessary action to fix your site’s problems, you need to allow time for Google to re-crawl and re-evaluate your site.

Monitor your traffic carefully over the next few months, again comparing to the same period in a previous year to identify normal seasonal trends. If you have effectively rectified the problems, you should start to see traffic increase once more as search engine rankings recover. It won’t happen overnight, and it may not reach the heights of the past, but your site should start to get back on its feet.

If you have tried everything you can to get re-indexed and are still feeling the wrath of Google’s updates, contact Koozai today for more information on our SEO packages which include site audits and backlink analysis and removal services.

Share your thoughts, advice and questions below

If you have managed to get back into the rankings after a Google penalty, let us know how you did it! Did you get harmful backlinks removed? Get rid of low quality or duplicate content? Share your experience and advice by commenting below.

Image Credits

Graph Image from Bigstock
Diagnosis Image from Bigstock

Responses

  1. […] Google “Panda” update began penalizing sites with duplicate content. In order to address certain keywords, you might have two existing pages with similar content but […]

  2. […] information on how to carry out comprehensive backlink analysis, please see my earlier post on how to recover from a Google ranking penalty or download our Complete Guide to Backlink Analysis and Removal whitepaper. You may also want to […]

  3. mrz

    Thanks alot for yor reply.

  4. mrz

    Dear Emma

    Your comments are very useful. If we see that archiving in Archive.org is reducing alot , what can be the problem?
    and also I checked redirect on a web site which was facing many negative scores from Alexa.com, it had 302 redirects. and also many backlinks were removed.

    1. Emma North

      Hi again,

      If you are concerned about your site performance, I wouldn’t be using Alexa. As I mentioned, Alexa scores are just a guide and should not be used to make any decisions about your site.

      302 redirects only mean that your pages are being temporarily redirected to another resource. If those redirects are in fact permanent and will never change, you should change them to 301s.

      You would be better off carrying out a site audit and trying to identify the possible cause/s of any issues. You might find the following guides useful:

      https://www.koozai.com//search-marketing/essential-2013-seo-health-check-guide/

      https://www.koozai.com//search-marketing/why-have-rankings-dropped/

      Alternatively, please feel free to give us a call if you’d like us to look into your website and possible issues in more detail.

  5. mrz

    Thanks Emma,
    your paper was full of notes. I’m starting to study seo. If we recieve negative scores in alexa is it accurate? I need to know how we can know our rank, and how we can know negative scores on our site? Is Alexa rank accurate or google web master tools ?

    Regards.

    1. Emma North

      Hi there. Metrics such as Alexa and PageRank are indications and should be used as such – they are not intended to provide true indications of site value or performance. The actual Alexa value has no impact on the performance of your site in search so I wouldn’t be focusing on how to improve it or anything like that.

      What issues are you having to make you check Alexa ranks? What is it that you are looking to understand?

      1. mrz

        Thanks for your good comments.
        you know when we started activities in Facebook and twitter etc, huge negative scores came to our site(in Alexa). I checked and saw that we have many redirects on the web pages that return 302 code.
        and also search traffic decreases when we increased our social network activities. now because of getting negative scores in Alexa I don’t know is it related to those redirect 302 or not? Is it accurate?

        Thanks alot

  6. oliver smith

    Hi,

    I have a website. I did the audit but i found so many natural links. All the things are OK. But still not ranked. I spoke to one webmaster he said i have a problem in web hosting server. Can you tell me what is the relation between website hosting and keywords rank.

    i will glad if you replied with proper solution.
    thanks

    1. Emma North

      Hi Oliver, thanks for getting in touch. Have you received a manual penalty in Google Webmaster Tools? If so, what does the message read?

      I’m not sure exactly what you mean when you say you suspect a problem with your web hosting server but if you have received a manual penalty the message in Webmaster Tools will give you a good indication of the problem.

  7. Kyle

    Hi Emma,

    Thanks for the great post! Our SERP rankings took an overnight hit recently that resulted in a 70% decline in organic traffic to our blog. We believe that Jobamatic was the culprit. If you’re not familiar, Jobamatic offers a free job board service and is part of the job aggregator SimplyHired. When you sign up for the service, you are given the option of using their host name (yourcompany.jobamatic.com) or setting up your own domain name. We chose to set up our own subdomain name (jobs.ourdomain.com). Google indexed 111 pages from this subdomain. Of course, all of those pages are duplicate content. As soon as those pages were indexed, our traffic crashed.

    To correct, we verified the subdomain in Webmaster Tools and requested removal. The removal request was completed within 24 hours. We abandoned Jobamatic altogether and there is no trace of it on our site.

    My question: I believe that your blog post indicates that recovery from such an instance is possible. Is this correct? Or is our domain forever doomed?

    Thank you so much for any insight you can offer!

    1. Emma North

      Hi Kyle

      There are certainly a lot of things you can do to resolve issues caused as a result of duplicate content and getting rid of it all is the first essential step.

      However you should also check that those pages, or in fact duplicate content at all, were definitely the only cause of any rankings drops you suffered. It is possible that the sudden hike of duplicate content set off alarm bells and triggered a drop, but equally possible that there are also other factors in play that may have only just come to light.

      In order to diagnose the causes of the damage I would check in Analytics exactly when you started to see drops in organic traffic. You can then cross-reference this with dates of key Google updates to see if you can gain further insight. I would also have a look into your backlink profile to look for any unnatural, spammy links which may be holding you back.

      If you would like us to take a more detailed look at the situation, please feel free to get in touch for information on our Site Audit service or SEO packages.

      I hope this helps! :-)

  8. mir helal

    i can’t get my web site by using google search .is it penalty of google ? if so then please help me to recovery my web site.could you tell me please what i have to do.it’s a new web site.
    i am waiting for you response.
    thanks

    1. Emma North

      Hi there. Try searching in Google for your site like “site:yoursite.com” to see how many search results you get. This indicates how many pages are actually indexed.

      If you have pages indexed but you cannot see them in normal searches, then it is likely you have dropped down the rankings and need to do some work on optimising your site to improve its position.

      If, however, you do not see any pages in the search results but once did, you have been deindexed from Google and have a lot of work to do to get back on track.

      I hope this helps, but please feel free to give us a call to discuss your site in more detail.

  9. Bruno

    Hello Emma,

    all your reports are nice, and i think only you can help me!! :D

    My URL is: http://www.falandodeviagem.com.br

    I have a big problem with the google indexing!! We lost more than 70% on Analytics and Adsense.
    How get it back?

    I contacted “Webmasters Tool – reconsideration request” and the answer optive described below. (end of the msg)

    I think my problem was at the time the site has changed the server and when we did the redirects of our other registered domain name, not the aim but were indexed with the content below.

    on the old server who entered http://www.smack.com.br, fall into the url http://www.falandodeviagem.com.br
    new server who entered http://www.smack.com.br fall in content of http://www.falandodeviagem.com.br, but still with the “smack.com.br” address on the top bar URL.

    With this in our “external links” we get over 1 million links to the site.
    smack.com.br 964,558
    ingressobrasil.com.br 122,330

    Outside that our content was indexed in google with these domains, something that made ​​no sense and noticed that he was totally wrong.

    We’ve removed altogether as redirects also deactivated the DNS on both domains that occurred this.

    Can you help us how to get back to our old indexing?

    Whenever we collaborate with the rules of Google, this episode was a failure for not understanding our server and very much less of direction or redirection.

    Images of our Analytics and Webmasters Tools
    http://www.falandodeviagem.com/imagens/b1/problem

    Thank you !!
    Bruno

    Dear site owner or webmaster of https://www.falandodeviagem.com.br/,
    We received a request from a site owner to reconsider https://www.falandodeviagem.com.br/ for compliance with Google’s Webmaster Guidelines.
    We reviewed your site and found no manual actions by the webspam team that might affect your site’s ranking in Google. There’s no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team.
    Of course, there may be other issues with your site that affect your site’s ranking. Google’s computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users.
    If you’ve experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site’s content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you’ve changed the URLs for a large portion of your site’s pages. This article has a list of other potential reasons your site may not be doing well in search.
    If you’re still unable to resolve your issue, please see our Webmaster Help Forum for support.
    Sincerely,
    Google Search Quality Team

    1. Emma North

      Hi Bruno

      Google’s reconsideration request tool is only for instances where a manual penalty has been applied to your site which would be indicated to you by a message in Webmaster Tools. As you have not suffered a manual penalty, your loss of rankings is related to the algorithm.

      We can certainly look at this in more detail for you. Please give us a call to discuss your requirements in more detail and we can provide you with more information on our services.

  10. ayuda a domicilio

    Emma , amazing tips.
    I have suffered a penalty from google in my web:
    http://www.ayudaadomicilio.com
    and this struggling to get out of it.

    I ´m trying to resist not to lose my project , is in Spain visiting nurses at home.

    i think that is a good proyect and could be bigger in the future, you know something like you in spain to be my adviser.
    thanks

    1. Emma North

      Hi there, thanks for your comment.

      We can provide multi-lingual SEO services which would allow us to review your site in detail and report potential causes of your ranking penalty or SEO restrictions. Please give us a call to discuss this.

  11. Emma North

    Hi Jan, thanks for your comment.

    Firstly, 500 backlinks from 2 forums definitely could cause problems and getting them removed completely would be the first place to start. However, building new quality backlinks is just as important as removing harmful ones. It sounds like you should carry out a full review of your backlinks to evaluate which ones are adding value and which could be jeopardising your rankings.

    When it comes to getting indexed again, you shouldn’t need to submit a reconsideration request unless you have received a specific notification from Google about unnatural backlinks. If you haven’t then it is just a case of developing your link profile and improving your website in compliance with Google’s best practices until it is seen to provide more value than its keyword competitors.

    I hope this helps, but please let me know if we can help any further. And good luck! :-)

  12. Jan

    Many thanks for a very helpful article.

    Do you know whether links (unfortunately about 500) from 2 forums could harm a new site? If so, will the site recover by itself once these links disappear or do I need to ask for a Google reconsideration?

    I created a site for a customer (not overly optimised but unfortunately an exact match domain that they had bought months before) in Dec 2012 and used signature links from 2 forums to get it indexed (1 forum used the url, the other the company name as the anchor text – I thought it was non-spammy but hadn’t heard of the EMD algo change then). Unfortunately, I went away over Christmas, got involved with other things and didn’t go back to the forum to change the sig links – when the site fell from page 3-4 (first week of so after indexing) to beyond page 16 (a few weeks later) I realised with horror that all 500 previous forum entries had been reindexed and were pointing to this new site.

    Is there anything you could recommend that I do to correct this? I’ve obviously removed all forum signatures and the links have fallen from 500 to 100 (still no improvement to the site).

    Just to complicate matters, when the website first fell beyond page 16, I did originally think that the cause was due to UNDER optimisation (that the original position was just die to a honeymoon period) and so added in a few keywords to the home page. The site did come back to page 2-3 for a few days then fell like a stone again. (I’ve now taken the keywords out again.)

    If it were a backlinks penalty/filter, would the website have come back to page 2/3 for a few days after my changes were indexed????

    Would appreciate any advice from someone. Many thanks again.

  13. Patrick Tiley

    Thanks for this very useful post Emma. Our site was hit dramatically about 14th Sept 2012 and stopped dropping traffic about 30th September. So was affected by either the improved SERP diversity update or the Panda 3.92 update. As we had been number 1 for virtually all baby shower searches up to then my guess is the diversity update. We are still not ranked for baby shower, baby shower decorations and baby shower games. Seems unfair as back in 2003 we were probably the first dedicated UK baby shower company. Spending time after being invited to promote the idea on radio, TV and newspapers. Any ideas on removing this diversity penalty?

    1. Emma North

      Hi Patrick. The SERP diversity update was only a small update designed to reduce the number of pages shown from a single domain. It is highly unlikely that it would eliminate the presence of your site in the SERPs altogether: the diversity update does not cause “penalties”. There are likely to be other factors at play here and as there was a Panda update around that time I would take a closer look at content first.

  14. Joshua Jacoby

    This is good advice, great advice. One more I will throw in – go to view page source, and do ctrl + f and search for noindex – make sure it has not crept in there – while you are at it check nofollow and nocache.

    Emma, what do you think should be the litmus test for starting over after penalty? Is there any point when you should say enough and switch domains?

    1. Emma North

      Thanks for the comment Joshua and great point about checking for noindex, nofollow and nocache. Particularly with sites using a CMS, it is not unheard of for directives like these to appear on pages unintentionally.

      Regarding a cut-off point: unless you are absolutely certain that the EMD update caused the issues, I don’t think switching domains would be beneficial for quite some time. All the good stuff you’ve built up with your site (strong backlinks, domain age and other “Google Juice”) would be lost. I think switching domains really is a last resort.

  15. Shaun Thomas

    Hi Mike,

    Yes, that’s what I meant. I had some problems when the system first re-launched with Google because my publisher image was replacing my authorship image in blog posts, so I did away with it for a while, but I’ve put it back in now as everything seems to work fine.

    I also encourage my authors to use their Google+ profile, but for outreach posts on behalf of clients, I use Google+ profiles that I control because I can be sure to benefit from the author rank for my future outreach campaigns this way.

  16. Mike Essex

    @Shaun Hi Shaun. Do you mean rel=”publisher”? We do use that sitewide to help Google associate our main brand Google+ page with our website. It doesn’t work as well as I’d like – for example it would be nice to show the Koozai logo on our service pages – but is a useful tag to have alongside rel=”author”.

    For outreach we encourage employees to use their G+ profiles and we tag Koozai blog posts with them too.

  17. Brad

    Its a daily grind finding and removing scraped content. Why does the original producer of the content pay the price for this? This tells me Google hasn’t got a solution for it so they penalize the domain owner for not cleaning it up. It promotes content scraping and supports content scraping.

    1. Emma North

      Hi Brad. It is frustrating, and this is not to say that Google doesn’t have a solution for this, but it may not always get it right. Of course, as discussed earlier, Authorship is the latest solution and will play an increasingly important role in SEO and content, but currently we have seen clear cases of scraping issues where the wrong site has been penalised. We can only hope this will be resolved by Authorship and future updates.

      1. Brad

        I passed on a full page of info to Danny Sullivan and he passed it onto his contact at Google but nothing has been done.

        Google authorship does NOTHING and has made NO difference whatsoever. Google hasn’t done anything which is effective.

        They do, however, have a list of protected domains that can do anything they like. It’s favoritism and a completely unfair system.

        They need to be held accountable.

    2. Emma North

      I’m sorry to hear that Brad; sounds like you’ve felt the wrath of Google! Unfortunately, this is why we often recommend making the content unique on your own site and having authorship set up. It isn’t ideal and it certainly isn’t fair but when sites have been hit by Panda it can be the only way to restore rankings.

      Authorship may not currently seem to have a great effect but in the long term it is my belief that it will be an essential feature of all sites.

  18. emilio bole

    I am very interested in knowing if you would analyse my website and give me a detailed report. I have been badly penalised by Google, I cannot figure out what is going on, and I am almost going out of business. My question is: how much would you charge to analyse my website and figure out what is causing the drop (almost 70%) in organic visits? I am the sole owner of the site and my knowledge is rather in the dresses field, but if you could do an analysis and give me the details, I will do the rest.

    1. Emma North

      Thanks Emilio. We can certainly provide you with a quote for a site audit. If you give the office a call, one of our sales team will be happy to help.

  19. Shaun Thomas

    Thanks for a thorough post, Emma. I just need to say something about the disavow tool.

    I have used it to get rid of around 18,000 links that were affecting my rankings because I had used an automated tool on one of my satellite sites (i.e. not one that mattered, just one I used to generate temporary, throwaway link juice).

    The thing is, Matt Cutts has said that negative SEO will not affect websites, so I wouldn’t bother with the disavow tool unless you have been actively using automatic link-building or directory submission tools for the site you are performing the SEO on.

    I know there are many people who come in after another SEO pro (in the vaguest sense) has made a mess of things and then the disavow tool is useful. Needless to say, I no longer use tools to automatically build PR, but they did have a place at one time.

    I firmly believe the Authorship feature is going to be big and will make huge dents in PR for some pretty big players this time next year, so I agree that getting your Google+ profile active and busy and linking it to any good content you produce is a key factor in the defence of hard won PR.

    1. Emma North

      Thanks for your feedback Shaun.

      From an SEO’s point of view, the disavow tool is a godsend, particularly when, as you say, a client comes to us with a poor backlink profile. There is now a lot of evidence that the tool has worked for many people, and it is becoming a good way to get dodgy links discounted if you cannot get them removed.

      Regarding authorship, I completely agree that it will be an increasingly important factor over the next couple of years and website owners who embrace and utilise it effectively now will reap the rewards later.
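      For anyone who hasn’t used the tool yet, the file you upload is simply a plain text list of links or whole domains to disavow (the domains below are made-up examples, not real sites):

      ```text
      # Lines beginning with "#" are comments and are ignored by Google
      # Disavow every link from an entire domain:
      domain:spammy-directory.example
      # Disavow links from a single page:
      http://link-farm.example/low-quality-links.html
      ```

      You upload one such file per site through the Disavow Tool in Webmaster Tools, and Google treats the listed links as a strong suggestion to ignore them when assessing your backlink profile.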

      1. Shaun Thomas

        Thanks for a prompt reply Emma,

        I see you have set up publisher-ship(?) – a bad phrase, I know. I tend to relate my publisher profile to posts on my own site, but not to my outreach posts, just in case.

        Is that something you do?

    2. Christian

      Shaun, after using the disavow tool, did you see any measurable recovery of your previous rankings? If so, did it take until the next Penguin update before these rankings started coming back? I’ve heard mixed reviews about the disavow tool, but I have heard that it needs an algo refresh to take effect.

      I’ve used the disavow tool (against many people’s advice) on some Penguinized sites I own, but I’ve yet to see any recovery. On these sites I was also able to get some low quality links removed and clean up my anchor text percentages to get them more in line with a “branded” anchor text distribution. I haven’t received unnatural links notices, and traffic dropped off a year ago when the initial penguin rolled out. Any more info you have on your disavow usage would be a great help. Thanks.

  20. Victor Pan

    I asked that because from the screen grab above alone, it seems like you’re disavowing all links to your website (which isn’t how the tool works). Thought it’d provide more value to have the next step shown.

    You’re absolutely right about stealing content. I’ve seen it done, and that’s why I recommend taking steps to protect yourself.

    Thanks for the speedy reply and have a great day!

    1. Emma North

      Thanks Victor, I see. I used this screen grab to create familiarity as it is the first screen a user sees when arriving at the disavow tool, but I can certainly see the value of using the next screen also. I will look to add another image soon, thanks for the suggestion.

      1. Emma North

        As a follow-up, I have added the second screenshot from the Disavow tool. Thanks for the suggestion Victor!

  21. Victor Pan

    I think instead of editing your content when you find scrapers, a better solution would be to implement authorship – or instantly tweet your new page on Twitter. Usually Google’s pretty good at finding the originating source of duplicate content, regardless of edits. We just have to make sure the spammers don’t get an edge on us.

    Just my two cents.

    Btw, do you have a screen grab of what happens after you click Disavow links Emma?

    Thx, Victor

    1. Emma North

      Hi Victor, thanks for your comment.

      Absolutely, having Google authorship set up is a great way to verify your created content but I’m not sure that it is a solution alone. For example, there would be nothing to stop a site with Google authorship scraping content from a site without it and taking credit for it as new content. It is definitely worth mentioning though as all sites should be set up for this to protect themselves long-term.

      Regarding the disavow image, I will see what I can do and let you know.

    2. Patrick De Wachter

      Hi Victor,

      When using Twitter, is it important to use the direct link or is a link shortener like Bitly no problem for Google?

      thx

      Patrick

      1. Victor Pan

        Last time I checked (three months ago) there’s no issue with bit.ly shorteners. It’s pretty cool because when you update a web page and tweet about it, you’re sometimes rewarded with a new, more enticing snippet for that page.

        You could always run a quick test – make two new pages, each with a unique H1 containing a nonsense keyword (Patridewachterista vs Wachtedepatrick), plus some filler content for both. Tweet one with Bitly, the other with a direct link. Don’t take my word for it ;) Test it :)

