Reducing the number of not-found links and crawl errors (SEO)

Discussion in 'Off Topic' started by EnergyFreak, Jun 27, 2011.

  1. EnergyFreak Customer

    Hi,

    I was curious to know if you guys have any tricks to reduce the number of crawl errors (soft 404s and Not Found) in Google Webmaster Tools. This is what I have blocked in my robots.txt to hopefully eliminate duplicate content, but I think it has harmed me more than it has helped.

    Disallow: /printer.php
    Disallow: /userforgot.php
    Disallow: /searchresults.php
    Disallow: /login.php
    Disallow: /toplistings.php
    Disallow: /userjoin.html
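
    One way to sanity-check which URLs a rule set like that actually blocks is Python's built-in `urllib.robotparser`. A quick sketch (the `example.com` domain and the trimmed-down rules are just placeholders, not the real site):

    ```python
    from urllib.robotparser import RobotFileParser

    # A cut-down version of the rules quoted above, as a string so the
    # check runs locally without fetching anything.
    rules = """\
    User-agent: *
    Disallow: /printer.php
    Disallow: /login.php
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # Blocked by the Disallow lines:
    print(rp.can_fetch("*", "https://example.com/printer.php"))  # False
    # Not mentioned, so crawlable:
    print(rp.can_fetch("*", "https://example.com/index.php"))    # True
    ```

    Worth remembering that Disallow only stops crawling; a page can still appear in the index if other sites link to it.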

    Also, off topic: my PageRank has dropped by 1, even though my site's traffic is increasing, more ads are being added, and the bounce rate is lower. Any suggestions to improve it? I know Seymour suggested better hosting, and I will look into that.

    Thanks.
  2. seymourjames All Hands On Deck

    I doubt your robots.txt file has anything to do with it. You use the robots file to tell the search engines what to index and what to ignore. Do you really want them to index your login page, unless it is full of valuable content? It's a big subject, and the only thing you can do is read about it. The robots file lets you sculpt the structure of your site to a certain degree, so useless pages are left alone.
  3. EnergyFreak Customer

    Ya, I was just saying that because I have over 4,000 pages not found; some of them are caused by the restrictions set in my robots.txt file, but others are probably deleted or expired ads. I read somewhere that having a lot of 404 pages can decrease ranking in search engines.
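
    For the deleted and expired ads, one common approach is to return a deliberate status code rather than a soft 404. A minimal sketch (the `expired` field and listing lookup are made up for illustration, not your actual schema):

    ```python
    # Pick an HTTP status for a listing URL so crawlers see an honest
    # response instead of a "page not found" rendered with a 200 status
    # (a soft 404).
    def status_for_listing(listing):
        if listing is None:
            return 404  # Not Found: the listing never existed
        if listing.get("expired"):
            return 410  # Gone: removed on purpose; crawlers drop it sooner
        return 200      # OK: live listing

    print(status_for_listing(None))                  # 404
    print(status_for_listing({"expired": True}))     # 410
    print(status_for_listing({"expired": False}))    # 200
    ```

    Returning 410 for intentionally removed pages, and a real 404 (not a 200 with an error message) otherwise, is generally enough to clear soft-404 reports over time.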
  4. seymourjames All Hands On Deck

    It is not the number of pages but their quality - focus on quality. Yes, too many 404 pages can be taken negatively, but they will drop out of the SERPs. For example, if your listings are duplicates of listings on other sites, you will suffer. This is why many big sites don't even let their listings be indexed - they know they are duplicates.
