WordPress should not serve robots.txt when it is not installed at the root of a site
|Reported by:|solarissmoke|
|Owned by:|ryan|
|Component:|Rewrite Rules|
|Keywords:|has-patch dev-feedback|
Visiting http://some.domain.tld/wordpress/robots.txt causes WordPress to generate a robots.txt file, even though a robots.txt file located below the root directory has no meaning: crawlers only ever request it from the root of the domain.
Desired behaviour: since the robots.txt file is only meaningful when placed at the root of a domain, WordPress should only handle robots.txt requests when it is installed at the root of a domain, and return 404 for all such requests otherwise.
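The proposed check boils down to inspecting the path component of the site's home URL and only serving a generated robots.txt when that path is empty or "/". A minimal language-agnostic sketch of that logic (illustrative only; the actual fix would live in WordPress's PHP rewrite handling, and the function name here is hypothetical):

```python
from urllib.parse import urlparse

def should_serve_robots(home_url: str) -> bool:
    """Return True only when the site lives at the domain root.

    A robots.txt below the root (e.g. /wordpress/robots.txt) is never
    consulted by crawlers, so such requests should 404 instead.
    """
    path = urlparse(home_url).path
    return path in ("", "/")

# Installed at the root: serve the generated robots.txt.
print(should_serve_robots("http://some.domain.tld"))            # True
# Installed in a subdirectory: return 404 instead.
print(should_serve_robots("http://some.domain.tld/wordpress"))  # False
```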
Is this too late for 3.0? Feel free to punt. I'm working on a patch anyway though.