
Robots.txt showing local environment / Disallow all when LIVE #383

Open

mark-chief opened this issue May 2, 2019 · 1 comment


mark-chief commented May 2, 2019

Just leaving this here as Discord doesn't have threads and I don't want to lose it :)

I've noticed that the robots.txt file on a couple of sites is displaying the local version, even though the environment in the settings is set to Live.

Sitemap: https://www.domain.com/sitemaps-1-sitemap.xml
# local - disallow all
User-agent: *
Disallow: /

Is there another setting I should be aware of? I can't see any other settings relating to robots.txt. I have cleared all caches too. Thanks in advance.

  • Craft 3.1.25
  • SEOmatic 3.1.50 on both sites
  • devMode set to false
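
For reference, SEOmatic's environment detection can also be pinned in a config file rather than left to auto-detection. A minimal sketch of a multi-environment config/seomatic.php, assuming SEOmatic's standard environment setting (one of 'live', 'staging', or 'local') and the default Craft convention of mapping CRAFT_ENVIRONMENT onto the config keys:

    <?php
    // config/seomatic.php (sketch); assumes SEOmatic reads an
    // 'environment' override from this file, as Craft plugins
    // conventionally do via per-plugin config files.
    return [
        // Default: follow whatever the .env ENVIRONMENT variable says
        '*' => [
            'environment' => getenv('ENVIRONMENT') ?: 'local',
        ],
        // Pin the live setting in production so robots.txt never
        // serves the local "disallow all" template
        'production' => [
            'environment' => 'live',
        ],
    ];

With a setup like this, a stale or missing ENVIRONMENT value can't silently leave the plugin in local mode on the production server.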
@cap-akimrey

Adding this in case someone else lands here via search. I had a similar situation with a newly live site. After changing the .env value pair to ENVIRONMENT=live, I needed to disable the robots.txt output and then re-enable it; that refresh got things working as expected.
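
For anyone retracing those steps, the pieces involved look roughly like this; the clear-caches console command has shipped with Craft since 3.0.37, but treat the exact sequence as a sketch:

    # .env: tell Craft (and plugins that key off it) this is live
    ENVIRONMENT=live

    # then, from the project root, flush caches after toggling the
    # robots.txt output off and back on in the SEOmatic settings
    php craft clear-caches/all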
