Generate custom robots.txt for subsites
Generates a custom robots.txt file for each subsite.
This module aims to prevent indexing of subsite-specific asset folders that belong to other subsites. It creates a robots.txt file with Disallow rules for folders belonging to other subsites (i.e. not folders that are common or that belong to the current subsite).
Install the module with Composer:

composer require rotassator/silverstripe-subsites-robotstxt
Set the site to live mode to see the subsite-specific Disallow rules; in dev and test environments, robots are disallowed for all files. See the Environment management documentation for more details.
robots.txt for live site
# robots.txt for Example 1
User-agent: *
Disallow: assets/example2/
Disallow: assets/example2-documents/

# robots.txt for Example 2
User-agent: *
Disallow: assets/example1/
robots.txt for dev/test site

# robots.txt for Example 1
User-agent: *
Disallow: /
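The examples above can be sketched as follows. This is an illustrative Python sketch of the rule-generation logic only; the function name, signature, and folder lists are assumptions for demonstration and are not the module's actual API (the module itself is PHP/SilverStripe).

```python
def robots_txt(site_name, other_subsite_folders, live=True):
    """Build a robots.txt body for one subsite.

    On a live site, disallow only the asset folders that belong to
    other subsites; on dev/test, disallow everything.
    """
    lines = ["# robots.txt for %s" % site_name, "User-agent: *"]
    if live:
        # Block only folders owned by other subsites; common folders
        # and the current subsite's own folders stay crawlable.
        lines += ["Disallow: %s" % folder for folder in other_subsite_folders]
    else:
        # Non-live environments: keep crawlers out entirely.
        lines.append("Disallow: /")
    return "\n".join(lines)


# Reproduces the "Example 1" live-site output shown above.
print(robots_txt("Example 1", ["assets/example2/", "assets/example2-documents/"]))
```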