This module adds a dynamic robots.txt to your SilverStripe site, fully configurable from /admin/settings/.
```
$ composer require michaeljjames/silverstripe-robots
```
You can also install the module manually by downloading the zip and extracting it into your site root. In either case, run a dev/build afterwards (e.g. by visiting /dev/build?flush=1 in your browser).
To configure your robots.txt file, paste your robots.txt configuration into the textarea on the Robots tab in /admin/settings/, or use the example below.
```
User-agent: *
Disallow: /admin
Disallow: /dev
Disallow: /?flush
Disallow: /assets
Allow: /

User-Agent: Googlebot-Image
Disallow: /admin
Disallow: /dev
Disallow: /?flush
Allow: /

Sitemap: https://www.domain.com/sitemap.xml
```
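If you want to sanity-check the rules before pasting them in, you can parse them with Python's standard-library `urllib.robotparser` (this is unrelated to SilverStripe and the module itself; it is just a quick way to see how a well-behaved crawler would interpret the configuration above):

```python
from urllib.robotparser import RobotFileParser

# The example robots.txt configuration from above.
rules = """\
User-agent: *
Disallow: /admin
Disallow: /dev
Disallow: /?flush
Disallow: /assets
Allow: /

User-Agent: Googlebot-Image
Disallow: /admin
Disallow: /dev
Disallow: /?flush
Allow: /

Sitemap: https://www.domain.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Any crawler falls under the "*" group, so /admin is blocked for it.
print(parser.can_fetch("SomeCrawler", "/admin"))            # False
# Ordinary pages are allowed by the "Allow: /" rule.
print(parser.can_fetch("SomeCrawler", "/about-us"))         # True
# Googlebot-Image has its own group with no "Disallow: /assets",
# so it may still fetch images from /assets.
print(parser.can_fetch("Googlebot-Image", "/assets/a.png")) # True
```

Note that the Googlebot-Image group intentionally omits `Disallow: /assets`, so image crawling is still permitted even though generic crawlers are kept out of that directory.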