Ben Bodenmiller authored
Update the default robots.txt rules to disallow irrelevant pages that search engines should not crawl. Important pages such as repository files, commit details, merge requests, issues, and comments can still be crawled.
595a93ee
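For illustration, a default robots.txt along these lines might look like the sketch below. The specific paths are assumptions chosen to match the intent described in the commit message, not the exact rules introduced by this commit:

    # Block dynamic and administrative pages that search engines
    # gain nothing from indexing.
    User-Agent: *
    Disallow: /search
    Disallow: /admin
    Disallow: /dashboard
    Disallow: /api
    # Repository files, commits, merge requests, issues, and
    # comments live under project paths and stay crawlable.

Because robots.txt is allow-by-default, only the Disallow entries need to be listed; any URL not matched by a rule, including project pages, remains crawlable.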
To find the state of this project's repository at the time of any of these versions, check out
the tags.