Add per-tenant sitemap to robots.txt file
While we ping some search engines (currently only Google) when generating the sitemap files, we weren't telling crawlers that read the `robots.txt` file where to find the sitemap. Now we are, pointing each tenant at its own sitemap file.
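As a sketch of the result (the hostname is illustrative, not taken from this commit): the `Sitemap` directive in robots.txt takes an absolute URL, so each tenant's host resolves to that tenant's own sitemap. A tenant served at example.org would see a line like:

    Sitemap: https://example.org/sitemap.xml

Crawlers that fetch robots.txt can then discover the sitemap on their own, including engines we don't ping.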
@@ -1,15 +0,0 @@
-# See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
-#
-# To ban all spiders from the entire site uncomment the next two lines:
-# Disallow: /
-
-User-agent: *
-Disallow: /users/
-Disallow: /comments/
-
-Disallow: /*?*locale
-Disallow: /*?*order
-Disallow: /*?*search
-Disallow: /*?*locale-switcher
-Disallow: /*?*filter
-Disallow: user_id