
Google WT & SEO Expert

Discussion in 'Off Topic' started by lazer, Jul 6, 2013.

  1. lazer

    lazer Well-Known Member

    Hey there :)

    I would really appreciate a one-to-one with someone who is very experienced in Google WT and SEO practices.

    We have recently been suffering from a load of Google crawl errors:
    [attached screenshot: Capture.PNG - crawl error report]
    We would like to know...
    • Why is Google repeatedly trying to reach areas of the site that are not publicly available (nodes), then reporting these as "Access Denied"? Is there a way to tell it to stay out of specific nodes and de-list the existing indexed content?
    • Why is Google still trying to index content that hasn't been present on the site for over a year, and reporting it as an error? We switched to xF in April 2012, but very old vBulletin and WordPress content appears all the time in the "Soft 404" error list. There are no remnants of vBulletin or WordPress on the server.
    • What to do with all those "Not found" 404s - all of which are threads or content that has been removed from the site permanently.
    Here's our robots.txt if that helps at all.
    Code:
    User-agent: Mediapartners-Google
    Disallow:
    
    User-agent: *
    Disallow: /xf/find-new/
    Disallow: /xf/account/
    Disallow: /xf/attachments/
    Disallow: /xf/goto/
    Disallow: /xf/login/
    Disallow: /xf/admin.php
    Disallow: /xf/mobiquo/
    Disallow: /xf/members/
    Allow: /
    
    Would love to chat with someone about this.

    Thanks.
     
  2. Claudio

    Claudio Well-Known Member

    Just add the paths to those nodes to your robots.txt file. The robots file helps search engines "understand" what they shouldn't be crawling.
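    For example, something like this (the node paths here are just placeholders - swap in your actual private node URLs):
    Code:
    User-agent: *
    Disallow: /xf/forums/private-node/
    Disallow: /xf/forums/staff-area/
    Bear in mind robots.txt only stops crawling; URLs that are already indexed may still need a noindex tag or the URL removal tool in Webmaster Tools before they drop out of the results.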
     
  3. lazer

    lazer Well-Known Member

    I appreciate your reply (to the first point) but that won't really work for the older URLs as they were mostly in the root. I suppose I could add an entry for the current nodes that are not publicly accessible, but I shouldn't have to do that...surely? How do other forum owners manage this when, say, they replace all of the content of one forum with the Resource Manager and then shut down the original node?
     
  4. smartpixels

    smartpixels Active Member

    I usually ignore those "not found" 404 errors. I have a site with more than 70,000 errors that is still doing well in searches. Google's bots are very aggressive and will crawl everything, even meaningless URLs. Server errors and soft 404s can be reduced, though, and you can use the "mark as fixed" option on them.
     
  5. lazer

    lazer Well-Known Member

    Hmm... it's difficult to ignore, really. We're trying to fathom why we have seen a 20% drop in traffic during June and July without changing anything. The only major change we made during that time was to add APC, but I can't see that making this kind of difference, if any at all.
     
  6. smartpixels

    smartpixels Active Member

    http://moz.com/google-algorithm-change. Check your stats against this change history and you will know the reason. I actually never look at these updates; I just produce good quality stuff for readers and the rest takes care of itself. If you worry too much about SEO, I feel you are more likely to do something that Google doesn't want you to do, and that in turn will cause you to lose what you intended to gain.
     
  7. lazer

    lazer Well-Known Member

    Thanks @smartpixels but I am already aware of Moz.
    As I said above, we haven't done anything - heck, I don't understand enough about SEO to tinker anyway :p
     
  8. Anthony Parsons

    Anthony Parsons Well-Known Member

    Did you do a speed test of pages prior to APC being installed, then after? I have found some caching can actually slow page responsiveness, not speed it up. Maybe check that first to see if it is the issue, since, as you said, it is the only change.
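    A rough before/after check, assuming you have shell access and curl handy (substitute one of your own page URLs), would be something like:
    Code:
    curl -o /dev/null -s -w "DNS: %{time_namelookup}s  TTFB: %{time_starttransfer}s  Total: %{time_total}s\n" http://www.example.com/xf/
    Run it a few times with APC enabled and then disabled, and compare the time to first byte and the total times.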

    If not... Google decreases site traffic for thousands of automated reasons, some of which may stem from changes you made two years ago that they're only catching up with now through human review rather than the algorithm.
     
  9. smartpixels

    smartpixels Active Member

    I am not introducing you to Moz, I am showing you the Google algorithm change history.

    Correlate that data with your own traffic details and you will have a good idea of what might have affected your ranking.

    In Penguin, Google went after bad links.
    In Panda, Google is going after bad content.

    You get the picture. Looking at it that way helps you investigate things rather than being in the dark.

    Good advice, that can be a reason. Matt Cutts has gone on record saying that site speed is a ranking factor. Google even has a free service called PageSpeed; it's not a CDN but a front-end optimiser. It is in beta now and accepts forums. I have one of my forums enrolled in it.

    You can use http://www.webpagetest.org/
     
  10. Anthony Parsons

    Anthony Parsons Well-Known Member

    I say that because I did testing with different caching methods utilising Google's PageSpeed service, and found that if you hadn't set them up correctly, i.e. implemented them not only at the server but also in the software that uses them, you could actually slow the page call speed down and it gave worse scores.

    It's also worth adding caching directives to your .htaccess directly to cover expiry times for all the assets concerned, which PageSpeed will point you to if there is an issue.
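    A minimal sketch of what that can look like, assuming Apache with mod_expires enabled (adjust the types and lifetimes to suit your site):
    Code:
    <IfModule mod_expires.c>
        # serve far-future expiry headers so browsers cache static assets
        ExpiresActive On
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType image/jpeg "access plus 1 month"
        ExpiresByType text/css "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>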
     
  11. lazer

    lazer Well-Known Member

    Thanks guys, my server guy is already working on the relevant parts of optimization.
     
