
  • Michael VanDeMar says:

    Lyndsay,

    Unfortunately, no. Straight from Robotstxt.org:

    To exclude all files except one
    This is currently a bit awkward, as there is no “Allow” field. The easy way is to put all files to be disallowed into a separate directory, say “stuff”, and leave the one file in the level above this directory:

    User-agent: *
    Disallow: /~joe/stuff/

    Alternatively you can explicitly disallow all disallowed pages:

    User-agent: *
    Disallow: /~joe/junk.html
    Disallow: /~joe/foo.html
    Disallow: /~joe/bar.html

    Sorry. :)

    May 23, 2008 at 9:12 am
  • nate says:

    Robotstxt.org is way out of date compared with the features supported by the major engines. This is the best reference on the web: http://janeandrobot.com/post/Managing-Robots-Access-To-Your-Website.aspx (caveat: I work at Microsoft Live Search, and I helped write this article).

    You could use the ‘Allow’ directive, or build a creative rule using the engines’ wildcard pattern matching.
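
    For example, here is a rough sketch (the path /~joe/keep-this.html is just a placeholder) that blocks a whole directory while keeping one page crawlable, since the engines apply the most specific matching rule:

    User-agent: *
    Disallow: /~joe/
    Allow: /~joe/keep-this.html

    The big engines also support wildcard patterns, where * matches any sequence of characters and $ anchors the end of the URL, so you can get creative with those as well.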

    Make sure to test this using Google’s Webmaster Tools!

    nate

    January 15, 2009 at 7:03 pm
