Make WordPress Core

Opened 13 years ago

Closed 13 years ago

#19251 closed defect (bug) (fixed)

Respect more restrictive blog_public settings when disabling crawling

Reported by: ryan          Owned by: ryan
Milestone: 3.3             Priority: normal
Severity: normal           Version: 3.2.1
Component: General         Keywords:
Focuses:                   Cc:

Description

[17891] introduced a pre_option_blog_public filter that returns 0 when replytocom is set in $_GET. This works fine for core but could thwart privacy plugins that use blog_public values other than 0 or 1. To accommodate those plugins, blog_public should be overridden only when it is currently set to 1.
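A minimal sketch of the intended behavior (the callback name is hypothetical; pre_option_blog_public and blog_public are the real hook and option):

{{{
<?php
// Hypothetical callback name; pre_option_blog_public is the real hook.
// Goal: flip blog_public to 0 for replytocom requests, but only when the
// stored value is exactly 1, so plugin-defined values survive untouched.
function wp19251_maybe_disable_crawling( $value ) {
	if ( ! isset( $_GET['replytocom'] ) ) {
		return $value; // false: fall through to the stored option.
	}

	// Read the stored value without re-triggering this same filter.
	remove_filter( 'pre_option_blog_public', 'wp19251_maybe_disable_crawling' );
	$stored = get_option( 'blog_public' );
	add_filter( 'pre_option_blog_public', 'wp19251_maybe_disable_crawling' );

	// Only override 1; leave 0, 2, -1, etc. to the stored option.
	return ( '1' == $stored ) ? 0 : $value;
}
add_filter( 'pre_option_blog_public', 'wp19251_maybe_disable_crawling' );
}}}

The remove/re-add dance around get_option() is needed because the filter would otherwise recurse into itself; that awkwardness is part of why the blog_public override was ultimately dropped (see comment 2).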

Attachments (2)

19251.diff (1.7 KB) - added by ryan 13 years ago.
19251.2.diff (3.5 KB) - added by nacin 13 years ago.
Patch I worked up independently.


Change History (5)

@ryan
13 years ago

#1 @ryan
13 years ago

That patch plays with adding a disallow_crawling filter. Not sure I want to go there, but have a look.
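19251.diff itself isn't reproduced in this ticket, so the following is only a guess at the general shape of such a filter (every name except disallow_crawling is hypothetical):

{{{
<?php
// Speculative sketch only; not the actual contents of 19251.diff.
// Instead of faking blog_public, core would ask a dedicated filter.
function wp19251_disallow_crawling() {
	// replytocom permalinks are the one case core itself knows about.
	$disallow = isset( $_GET['replytocom'] );
	return (bool) apply_filters( 'disallow_crawling', $disallow );
}

// Template code would then branch on the filter, not on blog_public:
if ( wp19251_disallow_crawling() ) {
	echo "<meta name='robots' content='noindex,follow' />\n";
}
}}}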

@nacin
13 years ago

Patch I worked up independently.

#2 @ryan
13 years ago

I like .2.diff. It avoids the unintended consequences of manipulating blog_public and is more obvious.

#3 @ryan
13 years ago

  • Owner set to ryan
  • Resolution set to fixed
  • Status changed from new to closed

In [19304]:

Introduce wp_no_robots(). Call it for pages that should never be indexed, regardless of blog privacy settings. Props nacin. fixes #19251
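For reference, the function introduced here is essentially a one-liner printed from wp_head, shown roughly as it landed in 3.3:

{{{
/**
 * Display a noindex meta tag.
 *
 * Outputs a meta tag telling robots not to index the page. Callers hook it
 * to wp_head on pages that should never be indexed (e.g. the login screen,
 * replytocom permalinks), independent of the blog_public setting.
 */
function wp_no_robots() {
	echo "<meta name='robots' content='noindex,follow' />\n";
}
}}}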
