Robots.txt is a middleware plugin for Traefik that adds rules to the `/robots.txt` file of your website, based on the ai.robots.txt list or on your own custom rules.
# Static configuration

```yaml
experimental:
  plugins:
    robots-txt:
      moduleName: github.com/solution-libre/traefik-plugin-robots-txt
      version: v0.2.1
```
# Dynamic configuration

```yaml
http:
  routers:
    my-router:
      rule: Host(`localhost`)
      service: service-foo
      entryPoints:
        - web
      middlewares:
        - my-robots-txt

  services:
    service-foo:
      loadBalancer:
        servers:
          - url: http://127.0.0.1

  middlewares:
    my-robots-txt:
      plugin:
        robots-txt:
          aiRobotsTxt: true
```
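If you configure Traefik through Docker labels rather than the file provider, the same middleware can be attached with the standard plugin label scheme. A minimal sketch, assuming a Compose setup; the `traefik/whoami` service is only illustrative:

```yaml
# docker-compose.yml (sketch): the middleware from the example above,
# declared through Docker labels instead of the file provider.
services:
  whoami:
    image: traefik/whoami
    labels:
      - "traefik.http.routers.my-router.rule=Host(`localhost`)"
      - "traefik.http.routers.my-router.entrypoints=web"
      - "traefik.http.routers.my-router.middlewares=my-robots-txt"
      - "traefik.http.middlewares.my-robots-txt.plugin.robots-txt.aiRobotsTxt=true"
```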
| Name | Description | Default value | Example |
|---|---|---|---|
| `aiRobotsTxt` | Enable the retrieval of the ai.robots.txt list | `false` | `true` |
| `customRules` | Add custom rules at the end of the file | | `\nUser-agent: *\nDisallow: /private/\n` |
| `overwrite` | Remove the original robots.txt file content | `false` | `true` |
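For example, to discard the original robots.txt content and serve only a custom rule set, the options can be combined as follows. This is a sketch reusing the `my-robots-txt` middleware name from the dynamic configuration above:

```yaml
http:
  middlewares:
    my-robots-txt:
      plugin:
        robots-txt:
          # Drop the original robots.txt content entirely...
          overwrite: true
          # ...and serve only these custom rules.
          customRules: "\nUser-agent: *\nDisallow: /private/\n"
```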
Solution Libre's repositories are open projects, and community contributions are essential for keeping them great.
The list of contributors can be found at: https://github.com/solution-libre/traefik-plugin-robots-txt/graphs/contributors