Opened 15 months ago
Last modified 15 months ago
#56595 new enhancement
Add a Site Health check for a non-virtual robots.txt file
| Reported by: | | Owned by: | |
|---|---|---|---|
| Milestone: | Awaiting Review | Priority: | normal |
| Severity: | normal | Version: | |
| Component: | Site Health | Keywords: | needs-patch dev-feedback |
| Focuses: | | Cc: | |
Description
At WordCamp Nederland 2022, @joostdevalk gave a talk about unnecessary bot traffic and how to prevent it.
One slide caught my interest:
https://docs.google.com/presentation/d/13Ngq-T2Qdbz1b8apUiioTCBmcsB5s411xBKcklmKyNQ/edit#slide=id.g152f65bfa26_0_87
Blocking those unneeded bots is easy in theory (the `robots_txt` filter allows changing the output of the virtual robots.txt file), but hard to build in practice, because many use cases and edge cases need to be covered.
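For illustration, here is a minimal sketch of how a plugin could use the core `robots_txt` filter mentioned above. The filter itself is real; the function name and the blocked user agents are made-up examples:

```php
<?php
// Sketch, assuming WordPress's `robots_txt` filter. The function name
// and the user agents below are illustrative examples only.
function example_block_bots_in_robots_txt( $output, $is_public ) {
    // Append Disallow rules for crawlers we want to keep out.
    $blocked_bots = array( 'ExampleBot', 'AnotherCrawler' );
    foreach ( $blocked_bots as $bot ) {
        $output .= "\nUser-agent: {$bot}\nDisallow: /\n";
    }
    return $output;
}

// In a plugin, this would be registered like:
// add_filter( 'robots_txt', 'example_block_bots_in_robots_txt', 10, 2 );
```

Note that this only affects the virtual robots.txt: as soon as a physical file exists, the web server serves that file directly and this filter never runs.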
For high-traffic sites, it would be better to have a non-virtual robots.txt file, so PHP/WordPress does not have to handle those requests at all.
But once a physical robots.txt file is created, it is easy to miss that WordPress is no longer serving it.
Therefore I suggest adding a Site Health check that detects whether a non-virtual robots.txt file exists in the root directory.
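Such a check could hook into the existing `site_status_tests` filter (which is real core API); everything else below (function names, test slug, wording) is a hypothetical sketch, not a finished patch:

```php
<?php
// Sketch of the proposed Site Health check. The `site_status_tests`
// filter and the result-array format are core conventions; the
// function names, slug, and strings here are hypothetical.

// Pure check: does a physical robots.txt exist in the given root?
function example_has_physical_robots_txt( $root ) {
    return file_exists( rtrim( $root, '/' ) . '/robots.txt' );
}

// Site Health test callback returning a result array in the
// shape core's other direct tests use.
function example_physical_robots_txt_test() {
    $result = array(
        'label'       => 'No physical robots.txt file found',
        'status'      => 'good',
        'badge'       => array( 'label' => 'Site Health', 'color' => 'blue' ),
        'description' => 'WordPress is serving its virtual robots.txt file.',
        'test'        => 'physical_robots_txt',
    );
    if ( example_has_physical_robots_txt( ABSPATH ) ) {
        $result['status']      = 'recommended';
        $result['label']       = 'A physical robots.txt file was found';
        $result['description'] = 'WordPress is no longer serving its virtual robots.txt; rules added by plugins or filters will not apply.';
    }
    return $result;
}

// Registration (in a plugin or core patch):
// add_filter( 'site_status_tests', function ( $tests ) {
//     $tests['direct']['physical_robots_txt'] = array(
//         'label' => 'Physical robots.txt file',
//         'test'  => 'example_physical_robots_txt_test',
//     );
//     return $tests;
// } );
```

Keeping the file detection in its own small function would also make it reusable for showing the file's content elsewhere in Site Health.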
Maybe we could also show the content of this file in the info area and/or in the tools section of the plugin.
Happy to work on a patch if this idea gets confirmation.