While we ping some search engines (currently, only Google) when generating the sitemap files, we weren't telling search engines that read the `robots.txt` file where to find the sitemap. Now we do, pointing each tenant to its own sitemap file.
require "rails_helper"

describe "robots.txt" do
  scenario "uses the default sitemap for the default tenant" do
    visit "/robots.txt"

    expect(page).to have_content "Sitemap: #{app_host}/sitemap.xml"
  end

  scenario "uses a different sitemap for other tenants" do
    create(:tenant, schema: "cyborgs")

    with_subdomain("cyborgs") do
      visit "/robots.txt"

      expect(page).to have_content "Sitemap: http://cyborgs.lvh.me:#{app_port}/tenants/cyborgs/sitemap.xml"
    end
  end
end
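The tenant-aware URL selection these specs exercise could be sketched roughly as follows. This is a minimal illustration, not the actual template or controller code: the `sitemap_line` helper, its signature, and the way the host is passed in are all assumptions for the sketch; only the resulting `Sitemap:` lines match what the specs expect.

```ruby
# Hypothetical helper: build the "Sitemap:" line for robots.txt.
# For the default tenant (no schema) it points at /sitemap.xml;
# for any other tenant it points at that tenant's own sitemap file.
def sitemap_line(host, tenant_schema = nil)
  path =
    if tenant_schema
      "/tenants/#{tenant_schema}/sitemap.xml"
    else
      "/sitemap.xml"
    end
  "Sitemap: #{host}#{path}"
end

sitemap_line("http://lvh.me:3000")
# => "Sitemap: http://lvh.me:3000/sitemap.xml"
sitemap_line("http://cyborgs.lvh.me:3000", "cyborgs")
# => "Sitemap: http://cyborgs.lvh.me:3000/tenants/cyborgs/sitemap.xml"
```

The point of the design is that crawlers hitting a tenant's subdomain are sent to that tenant's sitemap rather than the default one.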