WordPress.org

Make WordPress Core

Opened 3 years ago

Closed 3 years ago

Last modified 3 years ago

#15647 closed defect (bug) (worksforme)

robots.txt not correctly changing between allow and disallow

Reported by: ipstenu
Owned by:
Milestone:
Priority: normal
Severity: major
Version:
Component: Accessibility
Keywords:
Focuses:
Cc:

Description

From this thread: http://wordpress.org/support/topic/robotstxt-set-to-disallow-cant-change

In wp-includes/functions.php, the do_robots() function starts at line 1779:

function do_robots() {
	header( 'Content-Type: text/plain; charset=utf-8' );

	do_action( 'do_robotstxt' );

	$output = '';
	$public = get_option( 'blog_public' );
	if ( '0' == $public ) {
		$output .= "User-agent: *\n";
		$output .= "Disallow: /\n";
	} else {
		$output .= "User-agent: *\n";
		$output .= "Disallow:\n";
	}

	echo apply_filters( 'robots_txt', $output, $public );
}

That first if should be:

if ( '0' == $public ) {
	$output .= "User-agent: *\n";
	$output .= "Allow: /\n";


Change History (2)

comment:1 nacin 3 years ago

  • Milestone Awaiting Review deleted
  • Resolution set to worksforme
  • Status changed from new to closed

The code looks fine to me.

If !public, then Disallow: / (blocks everything)

If public, then Disallow: (blocks nothing)
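The branching described above can be sketched as follows (a minimal Python illustration of core's logic, not the actual PHP; the function name is made up for this example):

```python
def robots_txt(blog_public):
    """Mirror do_robots() branching: blog_public == '0' blocks everything."""
    lines = ["User-agent: *"]
    if blog_public == "0":
        lines.append("Disallow: /")  # not public: block all crawlers
    else:
        lines.append("Disallow:")    # public: an empty Disallow permits everything
    return "\n".join(lines) + "\n"

print(robots_txt("0"))  # User-agent: * / Disallow: /
print(robots_txt("1"))  # User-agent: * / Disallow:
```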

Posting in the thread.

comment:2 ipstenu 3 years ago

It may just be because Google's a freakin' dink. I read through their webmaster whoopla, and it LOOKS like they're giving weighted preference to allow vs. disallow. So while both are, technically, correct, they won't always scan a Disallow: (nothing).

I'm playing around with their webmaster tools, and seeing different results with 'fake' robots.txt files when I set it as disallow nothing or allow everything.
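For what it's worth, Python's stdlib robots parser treats the variants the way nacin describes: an empty Disallow and Allow: / both permit crawling, while Disallow: / blocks it. This says nothing about how Google weighs them, but it does confirm both forms are standard-conformant (the helper name below is made up for this sketch):

```python
from urllib import robotparser

def allowed(robots_body, url="http://example.com/some/page"):
    """Parse a robots.txt body and ask whether a generic crawler may fetch url."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_body.splitlines())
    return rp.can_fetch("*", url)

print(allowed("User-agent: *\nDisallow:\n"))    # True  (public: empty Disallow)
print(allowed("User-agent: *\nAllow: /\n"))     # True  (public: explicit Allow)
print(allowed("User-agent: *\nDisallow: /\n"))  # False (not public)
```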
