This repository has been archived by the owner on Oct 15, 2024. It is now read-only.
Just leaving this here as Discord doesn't have threads and I don't want to lose it :)
I've noticed that the robots.txt file on a couple of sites is displaying the local version, even though the environment in the settings is set to live.
Sitemap: https://www.domain.com/sitemaps-1-sitemap.xml
# local - disallow all
User-agent: *
Disallow: /
Is there another setting I should be aware of? I cannot see any other settings relating to robots.txt. I have cleared all caches too. Thanks in advance.
Craft 3.1.25
Running seomatic 3.1.50 on both sites.
Devmode false
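For comparison, a site correctly picking up the live environment would be expected to serve a permissive robots.txt along these lines. (This is an assumption based on the "# local - disallow all" comment in the output above; the exact live output depends on SEOmatic's per-environment robots.txt template settings.)

```
# live - allow all
User-agent: *
Disallow:
Sitemap: https://www.domain.com/sitemaps-1-sitemap.xml
```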
Adding this in case someone else lands here via search. I had a similar situation with a newly live site. After changing the .env value pair to ENVIRONMENT=live, I needed to disable the robots.txt output and then re-enable it. That refresh got things working as expected.
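For reference, the .env change described above looks like this (assuming the standard Craft setup where SEOmatic reads its environment from an ENVIRONMENT variable in the project's .env file):

```
# .env — switch SEOmatic's environment from local to live
ENVIRONMENT=live
```

After saving this, clear caches and toggle the robots.txt output off and back on in SEOmatic's settings to force it to regenerate.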