Block web crawlers on v1.2-branch
#3855
Conversation
Signed-off-by: Mathew Wicks <[email protected]>
/lgtm
[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: james-jwu
The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing /approve in a comment.
Google keeps surfacing links from very old versions of the Kubeflow docs website, which is confusing to users.

This PR adds the <meta name="robots" content="noindex"> tag to tell Google to stop indexing the pages built from the v1.2-branch branch. It will probably take a few months (or longer) for Google to re-index these pages. I have left nofollow off so that Google will re-index the whole site more quickly by following links. A minimal sketch of the change is shown below.
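A minimal sketch of how the tag could be injected, assuming the site uses a Hugo head partial (the actual template file and path in v1.2-branch may differ):

```html
<!-- layouts/partials/hooks/head-end.html (hypothetical path) -->
<!-- Ask search engines not to index these archived pages.
     "nofollow" is deliberately omitted so crawlers still follow
     links and re-crawl the rest of the site quickly. -->
<meta name="robots" content="noindex">
```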
It also backports the changes from #3863 so the version selector is consistent across each archived version, and updates the OWNERS file of this very old branch to align with the current OWNERS in master, making it easier to approve PRs in the future without GitHub admin overrides. An illustrative OWNERS layout follows.
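For context, an OWNERS file is a small YAML file that Prow reads to decide who may /approve and /lgtm pull requests touching that directory. The names below are placeholders, not the actual approvers copied from master:

```yaml
# OWNERS (illustrative sketch; the real list mirrors the current OWNERS on master)
approvers:
  - example-approver
reviewers:
  - example-reviewer
```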