# See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site, uncomment the next two lines:
# User-Agent: *
# Disallow: /
User-Agent: *
# Add a 1-second delay between successive requests to the same server; this limits the resources used by the crawler
# Only some crawlers respect this setting; Googlebot, for example, does not
# Crawl-delay: 1
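# Illustrative sketch only (not part of the original rules): a crawler that does honor
# Crawl-delay, such as Bingbot, can be given its own group if per-bot throttling is wanted:
# User-Agent: Bingbot
# Crawl-delay: 1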
# Based on details in https://gitlab.com/gitlab-org/gitlab-ce/blob/master/config/routes.rb, https://gitlab.com/gitlab-org/gitlab-ce/blob/master/spec/routing, and using the application