doc: update robots.txt to exclude more old docs #14492
Labels
area: Documentation
bug
The issue is a bug, or the PR is fixing a bug
priority: low
Low impact/importance bug
The current robots.txt excludes old docs (from 1.4.0 through 1.8.0) from search engines, but it still lets really old docs from 1.2.0 and 1.3.0 through. Add these two folders to the Disallow list too (or just delete some of these really old doc sets).
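A minimal sketch of the change, assuming the versioned doc sets live under top-level folders named after each release (the exact paths depend on the site's layout):

```
# Hypothetical robots.txt excerpt — path prefixes are illustrative.
User-agent: *
Disallow: /1.2.0/   # newly added
Disallow: /1.3.0/   # newly added
Disallow: /1.4.0/   # already excluded
Disallow: /1.5.0/
Disallow: /1.6.0/
Disallow: /1.7.0/
Disallow: /1.8.0/
```

Crawlers match `Disallow` rules by URL path prefix, so each excluded version needs its own line.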