
Do you think internal broken links would cause Google SERP drop?

Discussion in 'Off Topic' started by TheBigK, Dec 8, 2012.

  1. TheBigK

    TheBigK Well-Known Member

    After several weeks of investigation, I've come to the conclusion that if a site has a large number of internal broken links (originating on the domain and pointing to non-existent locations on the same domain), Google will push the domain down. There seems to be mixed opinion / experience on this issue, though. Some webmasters have found that internal broken links push the site's rankings down (and the rankings come back up as the error count goes down). Several others say that such links won't hurt your rankings.

    What's your opinion / experience?
     
  2. rollthebones

    rollthebones Active Member

    I wouldn't be surprised; it shows a lack of care.
     
  3. De Obertei

    De Obertei Active Member

    Yes, it lowers PR. After cleaning mine up, it went up +2 almost immediately. You need to check for broken links constantly with a special program. I use Xenu.
     
  4. TheBigK

    TheBigK Well-Known Member

    Not always. In our case it was a JavaScript error in a WordPress plugin that we use on the site. By the time we figured out it was the plugin, the error count had gone up to 99k in just a few days.

    Thanks. Could you tell me a bit more about that, please? I fixed the error approximately 1.5 months ago, but Google is taking its own sweet time dropping the error count from its log. Did you recover your traffic after GWT reported the decrease in error count, or did you see the jump immediately after you fixed the error? Also, what was your error count? And I didn't quite follow the '+2' part of your message.

    I really appreciate responses :)
     
  5. CyclingTribe

    CyclingTribe Well-Known Member

    The difficulty I've always had is that you can't tell Google when to stop checking broken/expired links; it's still checking and reporting 404 errors for links on my site that became invalid over a year ago. It would be great if there were a simple way of tagging "known" broken/invalid/expired links and having Google immediately (and permanently) remove them from its index.

    Cheers,
    Shaun :D
     
  6. De Obertei

    De Obertei Active Member

  7. De Obertei

    De Obertei Active Member

    CyclingTribe likes this.
  8. TheBigK

    TheBigK Well-Known Member

    You can mark the errors as 'fixed' (if you've already fixed them) and Google won't show them again in GWT. That's what my observation over the past 3 months suggests, at least. Were the errors internal? Did you notice any drop in the rankings of the threads?

    Yeah, we look fine in the Google PR information. The problem we've been suffering from is that this idiotic plugin created so many internal broken links that Google 'might' have decided to rank us down. The threads that used to rank very high have been pushed down a bit in the search results. I hope to recover as Google recognises that all the errors have been fixed.

    Good to know. We have many thousands of URLs that are 'broken'; we can't remove each URL one by one.
     
  9. De Obertei

    De Obertei Active Member

    My domain previously belonged to someone else. Google is still showing search results that point to the old site, and has been since 2008, reporting errors for it all the while. Time hasn't helped Google stop producing errors for the old site, which no longer exists. Google is very sentimental and remembers everything old. ;) Perhaps you can organise your links or redirect them to the correct pages. For example, we had a few thousand broken links like site.com/work/......./ and we simply blocked site.com/work/ in robots.txt with Disallow: */work (a sketch of that rule is below).
    We never had such problems with Yandex; it removes everything. Yandex gives our site a citation index of 30.
    I hope Google quickly forgets your broken links without any extra work. :)
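    In case it's useful, the full robots.txt entry would look roughly like this (the */work pattern is just our example path, so substitute whatever directory your own broken URLs share):

    User-agent: *
    Disallow: */work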
     
  10. De Obertei

    De Obertei Active Member

    Found this (via Google Translate):
    "You need to hide the page from indexing using the robots meta tag, which is placed in the head section of the page itself. Here's how it looks:

    <meta name="robots" content="noindex,nofollow">

    You need to configure the site engine so that when a 404 occurs, your own error page is loaded (on Unix hosting it's set with just one line in .htaccess). Put the code above in the head section of that stub page. Then, when Google detects broken links, it won't add them to the index."
    (http://translate.google.com/)
    https://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
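
    If it helps, the .htaccess part is usually a single ErrorDocument line. This is just a generic sketch (the /404.html filename is an example, not taken from the article above):

    # Apache .htaccess: serve a custom error page on 404s.
    # /404.html is an example path - point it at the stub page that
    # carries the <meta name="robots" content="noindex,nofollow"> tag.
    ErrorDocument 404 /404.html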
     
  11. De Obertei

    De Obertei Active Member

  12. TheBigK

    TheBigK Well-Known Member

    /forum/ was redirected to /community/ last December. I can't figure out whether the above report from Xenu is good or bad, or whether it's telling me to fix something. What exactly is 'timeout'? Is it the number of redirected URLs?
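
    For context, a directory-level 301 like ours is typically a one-line rule in Apache .htaccess; a generic sketch (not necessarily our exact rule) would be:

    # Permanently redirect everything under /forum to /community,
    # preserving the rest of the path.
    Redirect 301 /forum /community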
     
  13. TheBigK

    TheBigK Well-Known Member

    May I request more opinions, please?
     
  14. TheBigK

    TheBigK Well-Known Member

    Update:

    We got the error count down to zero by 26 December. Today it's at 58, which is okay because it's only reporting web pages we actually removed a long time ago. So I'm not bothered about the 404s anymore; the bad URLs are gone.

    Just wondering, how long might we have to wait for Google to restore our SERPs?
     
