Help! Google Base is rejecting all my items.

One possible cause is your robots.txt file. The instructions below are taken from a Google Webmaster Tools help article: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449

Generate a robots.txt file using the Generate robots.txt tool

  1. On the Webmaster Tools Home page, click the site you want.
  2. Under Site configuration, click Crawler access.
  3. Click the Generate robots.txt tab.
  4. Choose your default robot access. We recommend that you allow all robots and use the next step to exclude any specific bots you don't want accessing your site; this helps prevent accidentally blocking crucial crawlers from your site.
  5. Specify any additional rules. For example, to block Googlebot from all files and directories on your site (the resulting file is shown after these steps):
    1. In the Action list, select Disallow.
    2. In the Robot list, click Googlebot.
    3. In the Files or directories box, type /.
    4. Click Add. The code for your robots.txt file will be automatically generated.
  6. Save your robots.txt file by downloading it, or by copying the contents into a text file and saving it as robots.txt. Place the file in the highest-level (root) directory of your domain; the file must be named "robots.txt". A robots.txt file located in a subdirectory isn't valid, because bots only check for this file in the root of the domain. For instance, http://www.example.com/robots.txt is a valid location, but http://www.example.com/mysite/robots.txt is not.
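
For reference, a robots.txt file generated with the settings above (allow all robots by default, then block Googlebot from the entire site) would look something like the sketch below; the exact output of the tool may differ slightly:

    User-agent: *
    Disallow:

    User-agent: Googlebot
    Disallow: /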

· The rules specified in a robots.txt file are requests, not enforceable orders. Googlebot and all reputable robots will respect the instructions in a robots.txt file. However, some rogue robots—such as those of spammers, scrapers, and other nogoodniks—may not respect the file. Therefore, we recommend that you keep confidential information in a password-protected directory on your server. Also, different robots can interpret robots.txt files differently, and not all robots support every directive included in the file. While we do our best to create robots.txt files that will work for all robots, we can't guarantee how those files will be interpreted.

· Google currently follows directions in the first 500KB of text in your robots.txt file. Content after the maximum file size may be ignored.
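
If your robots.txt file is large, a quick local check against that limit is straightforward; the Python sketch below assumes the file has been saved as robots.txt in the current directory:

    import os

    # Google reads roughly the first 500 KB of robots.txt; flag anything larger.
    size = os.path.getsize("robots.txt")
    limit = 500 * 1024
    print("robots.txt is", size, "bytes;",
          "within the limit" if size <= limit else "over the 500 KB limit")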

To check that your robots.txt file is behaving as expected, use the Test robots.txt tool in Webmaster Tools.
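
As a supplementary check outside Webmaster Tools, Python's standard urllib.robotparser module can parse a robots.txt file and report whether a given user agent may fetch a given URL; the example.com URLs below are placeholders for your own site:

    from urllib import robotparser

    # Load the live robots.txt from the root of the domain.
    parser = robotparser.RobotFileParser()
    parser.set_url("http://www.example.com/robots.txt")
    parser.read()

    # Ask whether specific crawlers may fetch a specific page.
    print(parser.can_fetch("Googlebot", "http://www.example.com/somepage.html"))
    print(parser.can_fetch("*", "http://www.example.com/somepage.html"))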
