Reported by: miqrogroove
Owned by:
do_robots(), in wp-includes/functions.php, currently handles robots.txt requests for sites that do not have a physical robots.txt file.
The default rules are quite lame: all it does is allow or disallow the entire site based on the privacy setting.
There should be default rules such as:

Disallow: /?s=
Disallow: /search/
Disallow: /wp-includes/
Disallow: *?replytocom
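For a public site, the generated response with the proposed rules added might look like the sketch below. This is an illustration only: the `User-agent: *` line reflects what do_robots() already emits, and the exact set and order of Disallow lines is the proposal above, not shipped behavior.

```
User-agent: *
Disallow: /?s=
Disallow: /search/
Disallow: /wp-includes/
Disallow: *?replytocom
```

Note that plugins and themes can already append rules like these through the `robots_txt` filter that do_robots() applies to its output, which is presumably part of why this was closed as wontfix rather than changed in core.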
Change History (11)
- Keywords has-patch removed
- Milestone 3.0 deleted
- Resolution set to wontfix
- Status changed from new to closed