# robots.txt 2011/11/18 adamw $
#
# Site: J.Crew.com WWW
#
# This file is retrieved automatically by crawlers conforming to
# the Robots.txt standard. It defines what URLs should/shouldn't
# be indexed.
# See
#
# Format:
#    User-agent: <agent>
#    Disallow: <path> | <blank>
# -----------------------------------------------------------------------------

User-agent: AdsBot-Google
Allow: /

User-agent: AdsBot-Google-Mobile
Allow: /

# All User Agent Exclusions
User-agent: *
Disallow: /account/
Disallow: /checkout/
Disallow: /AST/filterAsst/
# Removing fragmented store locator. The container page still exists here:
# http://www.jcrew.com/footer/StoreLocator.jsp
Disallow: /help/include/inc_storelocator_right.jsp

Sitemap: http://www.jcrew.com/sitemap.xml