Improve output for readability
jdevalk committed Feb 22, 2024
1 parent 865f0c0 commit c86bf13
Showing 2 changed files with 5 additions and 3 deletions.
README.md (1 addition, 1 deletion):

@@ -25,7 +25,7 @@ If you're developing on this plugin, you will probably want to run tests and lin
 The default output of this plugin is as follows:
 
 ```txt
-# This site is very specific about who it allows crawling from. Our default is you're not allowed to crawl:
+# This site is very specific about who it allows crawling from. Our default is to not allow crawling:
 User-agent: *
 Disallow: /
src/class-plugin.php (4 additions, 2 deletions):

@@ -60,12 +60,14 @@ public function modify_robots_txt( $output, $site_public ) {
 		return "User-agent: *\nDisallow: /\n";
 	}
 
-	$robots_txt = "# This site is very specific about who it allows crawling from. Our default is you're not allowed to crawl:\n";
+	$robots_txt = "# This site is very specific about who it allows crawling from.\n";
+	$robots_txt .= "# Our default is to not allow crawling:\n";
 	$robots_txt .= "User-agent: *\n";
 	$robots_txt .= "Disallow: /\n";
 
 	$robots_txt .= "\n# Below are the crawlers that are allowed to crawl this site.\n";
-	$robots_txt .= "# Below that list, you'll find paths that are blocked, even for them, and then paths within those blocked paths that are allowed.\n";
+	$robots_txt .= "# Below that list, you'll find paths that are blocked, even for them,\n";
+	$robots_txt .= "# and then paths within those blocked paths that are allowed.\n";
 	foreach ( $this->get_allowed_spiders() as $crawler ) {
 		$robots_txt .= "User-agent: $crawler\n";
 	}
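For illustration, assuming a single allowed crawler named `ExampleBot` (hypothetical, not from this commit), the reworded output that `modify_robots_txt()` builds up would begin roughly like this:

```txt
# This site is very specific about who it allows crawling from.
# Our default is to not allow crawling:
User-agent: *
Disallow: /

# Below are the crawlers that are allowed to crawl this site.
# Below that list, you'll find paths that are blocked, even for them,
# and then paths within those blocked paths that are allowed.
User-agent: ExampleBot
```

Splitting the long comment lines keeps each line of the served robots.txt short enough to read without horizontal scrolling, which is the readability improvement this commit is after.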
