Opened 15 years ago
Closed 15 years ago
#11918 closed enhancement (fixed)
do_robots() Enhancement
Reported by: | miqrogroove | Owned by: | |
---|---|---|---|
Milestone: | 3.0 | Priority: | normal |
Severity: | normal | Version: | |
Component: | Optimization | Keywords: | |
Focuses: | | Cc: |
Description
do_robots(), in wp-includes/functions.php, is currently responsible for handling robots.txt requests for sites that do not have a robots.txt file.
The default rules are quite lame. All the function does is allow or disallow the entire site based on the privacy setting.
There should be default rules such as:
Disallow: /?s=
Disallow: /search/
Disallow: /wp-includes/
Disallow: *?replytocom
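For context, a minimal plugin-style sketch (not an attached patch): it assumes the 'robots_txt' filter that do_robots() applies to its output, and the rule strings simply mirror the proposal above.

```php
<?php
/*
 * Sketch only: append the proposed defaults through the 'robots_txt' filter,
 * which do_robots() runs on its output before printing. Rule strings are the
 * ones from this ticket; everything else is illustrative.
 */
add_filter( 'robots_txt', function ( $output, $public ) {
	// Only add extra rules when the privacy setting allows crawling at all.
	if ( '0' != $public ) {
		$output .= "Disallow: /?s=\n";
		$output .= "Disallow: /search/\n";
		$output .= "Disallow: /wp-includes/\n";
		$output .= "Disallow: *?replytocom\n";
	}
	return $output;
}, 10, 2 );
```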
Attachments (1)
Change History (11)
#4, in reply to: ↑ 1, 15 years ago
-1 on secondv's patch. The lack of search rules makes this as useless as it was in the first place.
#5, 15 years ago
> The lack of search rules makes this as useless as it was in the first place.
IMO, there's no need to block search engines from search results. It'll cause more load on some servers, but it's up to the user to disable crawling of their own search pages.
#6, 15 years ago
Fair enough. I vote wontfix. Excluding wp-admin does very little in practice, and the absence of search exclusion is contrary to best practice.
#7, 15 years ago
- Keywords has-patch removed
- Milestone 3.0 deleted
- Resolution set to wontfix
- Status changed from new to closed
I agree with all of the -1 votes. Closing as wontfix.
I do not think we should exclude search URLs here; to me this looks like plugin territory. The last 2 rules can be added. I would also add a rule to exclude /wp-admin/. It would also be good to exclude /wp-content/ (at least the plugin dir), but some plugins may provide files that should be indexed (although I do not know of any at the moment), so this is not a good candidate.
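For illustration only (these exact defaults are an assumption based on this comment, not a committed change), the resulting robots.txt for a public site might read:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: *?replytocom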