Make WordPress Core

Opened 4 years ago

Closed 4 years ago

#19251 closed defect (bug) (fixed)

Respect more restrictive blog_public settings when disabling crawling

Reported by: ryan
Owned by: ryan
Milestone: 3.3
Priority: normal
Severity: normal
Version: 3.2.1
Component: General
Keywords:
Focuses:
Cc:


[17891] introduced a pre_option_blog_public filter callback that returns 0 when replytocom is present in $_GET. This works fine for core, but it could thwart privacy plugins that use blog_public values other than 0 or 1. To accommodate those plugins, blog_public should be overridden only when it is currently set to 1.
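A minimal sketch of the proposed guard (illustrative only, not the committed patch; `__return_zero` is the core helper that always returns 0):

```php
// Only short-circuit blog_public when the stored value is exactly '1',
// so privacy plugins that store other values (e.g. a "members only"
// setting) are left untouched.
if ( isset( $_GET['replytocom'] ) && '1' == get_option( 'blog_public' ) ) {
	add_filter( 'pre_option_blog_public', '__return_zero' );
}
```

Checking the stored option before adding the filter avoids recursing through the pre_option_blog_public filter itself.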

Attachments (2)

19251.diff (1.7 KB) - added by ryan 4 years ago.
19251.2.diff (3.5 KB) - added by nacin 4 years ago.
Patch I worked up independently.


Change History (5)

@ryan 4 years ago

comment:1 @ryan 4 years ago

That patch experiments with adding a disallow_crawling filter. Not sure I want to go there, but have a look.

@nacin 4 years ago

Patch I worked up independently.

comment:2 @ryan 4 years ago

I like .2.diff. It avoids the unintended consequences of manipulating blog_public and is more obvious.

comment:3 @ryan 4 years ago

  • Owner set to ryan
  • Resolution set to fixed
  • Status changed from new to closed

In [19304]:

Introduce wp_no_robots(). Call it for pages that should never be indexed, regardless of blog privacy settings. Props nacin. fixes #19251
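A rough sketch of the committed approach (the exact meta tag markup here is an assumption, not copied from [19304]): instead of filtering blog_public, a dedicated helper always emits a noindex meta tag, so the privacy option itself is never manipulated.

```php
// Hedged sketch of the wp_no_robots() idea: unconditionally emit a
// robots meta tag, independent of the blog_public setting.
function wp_no_robots() {
	echo "<meta name='robots' content='noindex,nofollow' />\n";
}

// Pages that should never be indexed (e.g. ?replytocom= permalinks)
// hook the helper into wp_head rather than overriding blog_public.
if ( isset( $_GET['replytocom'] ) ) {
	add_action( 'wp_head', 'wp_no_robots' );
}
```

This sidesteps the plugin-compatibility problem entirely: plugins that store nonstandard blog_public values see their option untouched, while the pages in question are still excluded from crawling.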
