Do you think internal broken links would cause a Google SERP drop?

TheBigK

Well-known member
After several weeks of investigation, I've come to the conclusion that if a site has a large number of internal broken links (originating from a domain and terminating on non-existent locations on the same domain), Google will push the domain down. There seems to be mixed opinion and experience on this issue, though. Some webmasters have found that rankings go down while a site has internal broken links (and come back up as the count goes down). Several others say that such links won't hurt your rankings.

What's your opinion / experience?
 
Yes, it lowers PR. After cleaning up, mine immediately went up +2. You need to check for broken links constantly with a dedicated program; I use Xenu.
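For anyone who would rather script it than run a desktop tool, here is a minimal sketch of the kind of check such a program performs - a hypothetical Python script, not Xenu itself - which fetches a start page, follows only same-host links, and reports those that come back 404 (the example.com URL is a placeholder):

# Minimal internal broken-link checker (illustrative sketch, standard library only).
# Fetches one page, extracts same-host links, and reports any that return 404.
import urllib.request
import urllib.error
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_internal_links(start_url):
    host = urlparse(start_url).netloc
    html = urllib.request.urlopen(start_url, timeout=10).read().decode("utf-8", "replace")
    parser = LinkExtractor()
    parser.feed(html)

    broken = []
    for href in set(parser.links):
        url = urljoin(start_url, href)          # resolve relative links
        if urlparse(url).netloc != host:        # only check internal links
            continue
        try:
            urllib.request.urlopen(url, timeout=10)
        except urllib.error.HTTPError as e:
            if e.code == 404:
                broken.append(url)
        except urllib.error.URLError:
            pass                                # unreachable or timed out - ignored in this sketch
    return broken

if __name__ == "__main__":
    for url in find_broken_internal_links("http://www.example.com/"):
        print("404:", url)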
 
I wouldn't be surprised; it shows a lack of care.
Not always. In our case it was a JavaScript error in a WordPress plugin that we use on the site. By the time we figured out it was the plugin, the error count had gone up to 99k in just a few days.

Yes, it lowers PR. After cleaning up, mine immediately went up +2. You need to check for broken links constantly with a dedicated program; I use Xenu.
Thanks. Could you tell me a bit more about it, please? I fixed the error approximately 1.5 months ago, but Google is taking its own sweet time to drop the error count from its log. Did you recover your traffic after GWT reported a decrease in the error count, or did you see the jump immediately after you fixed the error? Also, what was your error count? I didn't get the '+2' part in your message.

I really appreciate the responses :)
 
The difficulty I've always had is that you can't tell Google when to drop the checking of broken/expired links; it's still checking and reporting errors on 404s at my site that became invalid over a year ago. It would be great if there was a simple way of tagging "known" broken/invalid/expired links and have Google immediately (and permanently) remove them from its index.

Cheers,
Shaun :D
 
The difficulty I've always had is that you can't tell Google when to drop the checking of broken/expired links; it's still checking and reporting errors on 404s at my site that became invalid over a year ago. It would be great if there was a simple way of tagging "known" broken/invalid/expired links and have Google immediately (and permanently) remove them from its index.

Cheers,
Shaun :D
You can mark the errors as 'fixed' (if you've already fixed them) and Google won't show them again in GWT - that's what my observation over the past 3 months suggests, anyway. Were the errors internal? Did you notice any drop in rankings of the threads?

Which website are you talking about? For http://www.crazyengineers.com/ the result is good:
PageRank = 5.00, real PR = 6.09, TrustRank(sb) = 6.15
Yeah, we look nice in the Google PR data. The problem we've been suffering from is that this idiotic plugin created so many internal broken links that Google 'might' have decided to rank us down. The threads that used to rank very high have been pushed down a bit in the search results. I hope to recover as Google recognises that all the errors have been fixed.

http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663416
https://www.google.com/webmasters/tools/url-removal?hl=en&siteUrl=http://www.site.com/

We removed the broken links and pages, and we also blocked all outgoing links from indexing. When Google next updated PR, we gained two points, plus trust.

Good to know. We have many thousands of URLs that are 'broken'; we can't remove each URL one by one.
 
My domain previously belonged to other people, and Google has been serving search links to the old site since 2008 - and reporting errors for it all the while. Time hasn't helped Google stop producing errors for the old site, which no longer exists. Google is very sentimental and remembers everything old. ;) Perhaps you can organise your links, or redirect them to the correct pages. For example, we had a few thousand broken links of the form site.com/work/......./, so we simply blocked site.com/work/ in robots.txt with disallow: */work
We've never had such problems with Yandex; it removes everything. Yandex gives your site a citation index of 30.
I hope Google quickly forgets your broken links without any extra work. :)
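If you want to sanity-check a rule like that before deploying it, here is a small illustrative sketch using Python's standard urllib.robotparser. It uses a plain prefix rule (Disallow: /work/) rather than the wildcard form quoted above, since the standard-library parser only does prefix matching, and the site.com URLs are made up for illustration:

# Toy check of a robots.txt prefix rule (illustrative sketch, not the poster's
# actual configuration). Plain prefix Disallow instead of the wildcard form,
# because urllib.robotparser only matches path prefixes.
import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /work/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Hypothetical URLs, just to show which ones the rule blocks.
for url in ("http://site.com/work/old-page", "http://site.com/community/thread-1"):
    print(url, "->", "blocked" if not rp.can_fetch("*", url) else "allowed")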
 
found:
"You need to hide the page from indexing using a robots meta tag, which goes in the head section of the page itself. Here's how it looks:

<meta name="robots" content="noindex,nofollow">

You also need to configure the site's engine so that when a 404 occurs, your own error page is loaded (it can be set in just one line of .htaccess on Unix hosting). Put the code above in the head section of that stub page. Then, when Google detects broken links, it won't add them to the index."
(translated via http://translate.google.com/)
https://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
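The quoted advice sets up the 404 stub with a one-line .htaccess rule on Apache; purely as an illustration of the same idea, here is a minimal Python http.server sketch that answers unknown paths with a real 404 status and a stub page carrying that meta tag (hostname, port and page content are placeholders):

# Minimal illustration of serving a custom 404 page that carries the
# noindex,nofollow meta tag (a sketch of the idea, not an Apache/.htaccess setup).
from http.server import BaseHTTPRequestHandler, HTTPServer

ERROR_PAGE = b"""<!DOCTYPE html>
<html>
<head>
  <meta name="robots" content="noindex,nofollow">
  <title>Page not found</title>
</head>
<body><h1>404 - Page not found</h1></body>
</html>
"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # In this toy example, everything except the homepage gets the 404 stub.
        if self.path == "/":
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>Home</body></html>")
        else:
            self.send_response(404)               # real 404 status code
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(ERROR_PAGE)          # stub page with the noindex meta

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()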
 
/forum/ was redirected to /community/ last December. I can't figure out whether the above report from Xenu is good or bad, or whether it's telling me to fix something. What exactly is a 'timeout'? Is it the number of redirected URLs?
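As an aside, one quick way to see what a redirected path actually returns - and to tell a redirect apart from a timeout - is a small check like the sketch below, using Python's http.client; the host and path are placeholders:

# Quick check of what a redirected path returns (illustrative sketch).
# A 301/302 status with a Location header is a redirect; a timeout means
# the server never answered within the limit.
import http.client
import socket

def check(host, path, timeout=10):
    conn = http.client.HTTPConnection(host, timeout=timeout)
    try:
        conn.request("HEAD", path)
        resp = conn.getresponse()
        print(path, "->", resp.status, resp.getheader("Location"))
    except socket.timeout:
        print(path, "-> timed out after", timeout, "seconds")
    finally:
        conn.close()

check("www.example.com", "/forum/")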
 
Update:

We got the error count down to zero by 26 December. Today it's at 58, which is okay because it's reporting web pages we actually removed a long time ago. So I'm not bothered about the 404s anymore; the bad URLs are gone.

Just wondering: how long might we have to wait for Google to restore our SERPs?
 