doc: update robots.txt to exclude more old docs #14492

Closed · dbkinder opened this issue Mar 13, 2019 · 3 comments
Labels: area: Documentation · bug (The issue is a bug, or the PR is fixing a bug) · priority: low (Low impact/importance bug)

Comments

dbkinder (Contributor) commented Mar 13, 2019
The current robots.txt excludes old docs (from 1.4.0 through 1.8.0) from search engines but is letting really old docs from 1.2.0 and 1.3.0 through. Add these two folders to the Disallow list too (or just delete some of these really old doc sets).
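For reference, a minimal sketch of what the updated robots.txt could look like, assuming the old doc sets are served from version-numbered folders at the site root (the actual paths on the docs server may differ):

```
# Keep crawlers out of all superseded release folders,
# including the two oldest that are currently missing.
User-agent: *
Disallow: /1.2.0/
Disallow: /1.3.0/
Disallow: /1.4.0/
Disallow: /1.5.0/
Disallow: /1.6.0/
Disallow: /1.7.0/
Disallow: /1.8.0/
```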

dbkinder added the bug and area: Documentation labels on Mar 13, 2019
rljordan-zz added the priority: low label on Mar 15, 2019
rljordan-zz assigned dbkinder and unassigned nashif on Mar 15, 2019
rljordan-zz commented
@dbkinder If you do not have access to do this, please let me know.

dbkinder (Contributor, Author) commented Mar 15, 2019

@rljordan I don't have access to do this. That's why I originally assigned it to Anas, since he does have access.

dbkinder removed their assignment on Mar 15, 2019
nashif (Member) commented Mar 17, 2019

done

nashif closed this as completed on Mar 17, 2019