innoweb/silverstripe-robots

Adds a Robots.txt file that is configurable from /admin/settings/.

Overview

Adds a Robots.txt file that is configurable from /admin/settings/ and injects a robots meta tag into all pages.

This module supports single-site setups as well as sites using the Multisites and Configured Multisites modules.

Requirements

  • Silverstripe CMS 5.x

Note: this version is compatible with Silverstripe 5. For Silverstripe 4, please see the 4 release line.
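
For example, to stay on the Silverstripe 4 release line, the Composer requirement can be constrained accordingly (assuming the 4.x releases are tagged on Packagist):

composer require innoweb/silverstripe-robots:^4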

Installation

Install the module using composer:

composer require innoweb/silverstripe-robots

Then run dev/build.
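
dev/build can be run in the browser at /dev/build?flush=1 or from the command line, for example:

vendor/bin/sake dev/build "flush=1"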

Configuration

Robots.txt

On the SiteConfig (or Site if Multisites is installed) there is a setting in the CMS that lets you set the robots mode. The three options are:

  • Allow all
  • Disallow all
  • Custom content

The output of all three states is managed through templates and can be overridden in an app or theme (see the example in the 'Allow all' section below).

You can force the state using the following .env variable (e.g. for dev or test environments):

FORCE_ROBOTS_MODE="allow|disallow|custom"
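
For example, to ensure a staging site is never crawled regardless of the CMS setting, the site's .env file could contain:

FORCE_ROBOTS_MODE="disallow"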

Allow all

When switched to 'allow all' the module uses the template Innoweb/Robots/RobotsController_allow.ss with the following default content:

<% if $GoogleSitemapURL %>Sitemap: {$GoogleSitemapURL}<% end_if %>
User-agent: *
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/

The module checks whether the Google Sitemaps module is installed and injects the sitemap URL automatically.

It allows access to all pages and disallows access to development and security URLs by default.
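
For example, to append an extra rule to this output, a copy of the template can be placed in the app or theme, e.g. at app/templates/Innoweb/Robots/RobotsController_allow.ss (the added /search/ rule is illustrative):

<% if $GoogleSitemapURL %>Sitemap: {$GoogleSitemapURL}<% end_if %>
User-agent: *
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/
Disallow: /search/

Remember to flush the template cache (?flush=1) after adding or changing the file.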

Disallow all

When switched to 'disallow all' the module uses the template Innoweb/Robots/RobotsController_disallow.ss with the following default content:

User-agent: *
Disallow: /

This disallows all robots from accessing any page on the site.

Custom content

This setting reveals a text field in the CMS where custom code can be entered.

The template contains only the following code, so nothing is added to the custom content entered:

$RobotsContent.RAW

A good standard robots.txt configuration for Silverstripe looks as follows; this matches what the module outputs by default when switched to 'allow all':

Sitemap: https://www.example.com/sitemap.xml
User-agent: *
Disallow: /dev/
Disallow: /admin/
Disallow: /Security/

Robots meta tag

The module injects a robots meta tag into every page. This injection can be disabled using the following config, e.g. if the robots meta tag is managed manually in the template:

Page:
  robots_enable_metatag: false
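
If the tag is managed manually instead, a static tag can be placed in the <head> of the page template, for example (the values shown are illustrative):

<meta name="robots" content="noindex, nofollow" />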

By default, all pages are set to 'index, follow', with the following exceptions:

  • The Robots.txt setting on the site is set to 'Disallow all'
  • The environment is set to test or dev
  • The current page is displayed by the Security controller
  • The Priority setting for the page is -1 (see Google Sitemaps module)

Additionally, for each page type a config value can be set to control the meta tag. By default, the following values are set:

Page:
  robots_noindex: false
  robots_nofollow: false

SilverStripe\CMS\Model\VirtualPage:
  robots_noindex: true
  robots_nofollow: true

SilverStripe\ErrorPage\ErrorPage:
  robots_noindex: true
  robots_nofollow: true

This can be customised for any custom page types as needed.
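
For example, to keep a custom page type out of search indexes (App\Pages\CampaignPage is a placeholder for your own class name):

App\Pages\CampaignPage:
  robots_noindex: true
  robots_nofollow: true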

License

BSD 3-Clause License, see License