Add per-tenant sitemap to robots.txt file

While we ping some search engines (currently, only Google) when
generating the sitemap files, we weren't telling the search engines
that read the `robots.txt` file where to find the sitemap. Now we
are, using the right sitemap file for the right tenant.
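With this change, each tenant's `robots.txt` would end with a `Sitemap` directive pointing at that tenant's own sitemap file. A minimal sketch of the expected output (the host and sitemap path are illustrative, not taken from the actual application):

```
User-agent: *
Disallow: /users/
Disallow: /comments/
Sitemap: https://tenant.example.org/sitemap.xml
```

Crawlers that honor the `Sitemap` directive discover the sitemap directly from `robots.txt`, without waiting for a ping.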
Javi Martín
2022-09-30 16:04:56 +02:00
parent 5100884110
commit 468761253b
4 changed files with 33 additions and 0 deletions


@@ -1,15 +0,0 @@
-# See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
-#
-# To ban all spiders from the entire site uncomment the next two lines:
-# Disallow: /
-User-agent: *
-Disallow: /users/
-Disallow: /comments/
-Disallow: /*?*locale
-Disallow: /*?*order
-Disallow: /*?*search
-Disallow: /*?*locale-switcher
-Disallow: /*?*filter
-Disallow: user_id
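Since the static file is deleted, the `robots.txt` body is presumably generated per request so the `Sitemap` URL can vary by tenant. A hypothetical sketch of that idea (the method name, disallow list, and sitemap path are assumptions for illustration, not the application's actual code):

```ruby
# Build a robots.txt body for a given tenant host, ending with a
# Sitemap directive that points at that tenant's own sitemap file.
def robots_txt_for(host)
  lines = ["User-agent: *"]

  # Illustrative subset of the disallowed paths from the original file
  %w[/users/ /comments/].each do |path|
    lines << "Disallow: #{path}"
  end

  # Per-tenant sitemap location (path is an assumption)
  lines << "Sitemap: https://#{host}/sitemap.xml"
  lines.join("\n")
end

puts robots_txt_for("tenant.example.org")
```

Rendering the file through a controller or view template like this keeps the crawler rules shared while the `Sitemap` line follows the request's host.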