robots.txt for Kirby 2 websites.
For Kirby 3 websites you can use the Kirby 3 SEO Kit instead.
Install the plugin with the Kirby CLI:
kirby plugin:install thepoddi/kirby-robots
or include this repository as a Git submodule:
git submodule add https://github.com/thepoddi/kirby-robots.git site/plugins/kirby-robots
or copy it manually to /site/plugins/. Attention: the plugin directory must be named like the plugin file (kirby-robots).
This plugin serves a robots file at /robots.txt via a Kirby route; no actual file is generated.
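For illustration, serving robots.txt from a route could look roughly like the sketch below, using Kirby 2's plugin registry and the toolkit's Response class (the robots.sitemap option is described below). This is a hedged sketch of the mechanism, not the plugin's actual source:

// site/plugins/kirby-robots/kirby-robots.php (illustrative sketch only)
kirby()->set('route', array(
  'pattern' => 'robots.txt',
  'action'  => function() {
    // Build the rules on the fly; nothing is written to disk.
    $txt  = "User-agent: *\n";
    $txt .= "Disallow: /error\n"; // ignored pages become Disallow rules
    $txt .= 'Sitemap: ' . url(c::get('robots.sitemap', 'sitemap.xml')) . "\n";
    // Respond as plain text (assumes the toolkit Response class maps 'txt' to text/plain).
    return new Response($txt, 'txt');
  }
));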
The robots file can be configured via Kirby’s config file /site/config/config.php:
Ignore specific pages by URI, e.g. 'blog/my-article'. (array) Default: error
c::set( 'robots.ignore.pages', array('error') );
Ignore pages by their intended templates. (array) Default: error
c::set( 'robots.ignore.templates', array('error') );
Ignore invisible pages. (boolean) Default: true
c::set( 'robots.ignore.invisible', true );
Set sitemap file in robots.txt. (string) Default: sitemap.xml
c::set( 'robots.sitemap', 'sitemap.xml' );
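Putting the options together, a config.php setup could look like this; the extra page URI 'blog/my-draft' is purely illustrative:

c::set( 'robots.ignore.pages',     array('error', 'blog/my-draft') );
c::set( 'robots.ignore.templates', array('error') );
c::set( 'robots.ignore.invisible', true );
c::set( 'robots.sitemap',          'sitemap.xml' );

With a setup like this, the route answers /robots.txt with something along the lines of a User-agent: * block, one Disallow line per ignored page (e.g. Disallow: /error and Disallow: /blog/my-draft) and a Sitemap: line pointing to sitemap.xml; the exact output depends on your site's pages.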
Patrick Schumacher (GitHub, Website)
This project is licensed under the MIT License - see the LICENSE file for details.