
Robots.txt Traefik plugin

Table of Contents

  1. Description
  2. Setup
  3. Usage
  4. Reference
  5. Development
  6. Contributors

Description

Robots.txt is a middleware plugin for Traefik that creates, completes, or overwrites the `/robots.txt` file of your website, using rules from the ai.robots.txt list, custom rules, or both.

Setup

```yaml
# Static configuration
experimental:
  plugins:
    robots-txt:
      moduleName: github.com/solution-libre/traefik-plugin-robots-txt
      version: v0.2.1
```
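
If Traefik runs in a container, the same plugin registration can be passed as command-line flags rather than a configuration file. A minimal sketch, assuming a Docker Compose setup and Traefik's standard `--experimental.plugins.<name>` static-configuration flags:

```yaml
# docker-compose.yml (sketch): register the plugin via CLI flags
services:
  traefik:
    image: traefik:v3.0
    command:
      - --experimental.plugins.robots-txt.modulename=github.com/solution-libre/traefik-plugin-robots-txt
      - --experimental.plugins.robots-txt.version=v0.2.1
```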

Usage

```yaml
# Dynamic configuration
http:
  routers:
    my-router:
      rule: Host(`localhost`)
      service: service-foo
      entryPoints:
        - web
      middlewares:
        - my-robots-txt

  services:
    service-foo:
      loadBalancer:
        servers:
          - url: http://127.0.0.1

  middlewares:
    my-robots-txt:
      plugin:
        robots-txt:
          aiRobotsTxt: true
```
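
The plugin also accepts `customRules` and `overwrite` (see the reference below). A hedged sketch of a middleware that discards the backend's original robots.txt and serves only custom rules; the middleware name `my-custom-robots-txt` is illustrative:

```yaml
# Dynamic configuration (sketch): replace the backend's robots.txt
# entirely with the custom rules below
http:
  middlewares:
    my-custom-robots-txt:
      plugin:
        robots-txt:
          overwrite: true
          customRules: "\nUser-agent: *\nDisallow: /private/\n"
```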

Reference

| Name          | Description                                        | Default value | Example |
|---------------|----------------------------------------------------|---------------|---------|
| `aiRobotsTxt` | Enable the retrieval of the ai.robots.txt list     | `false`       | `true`  |
| `customRules` | Add custom rules at the end of the file            |               | `\nUser-agent: *\nDisallow: /private/\n` |
| `overwrite`   | Remove the original robots.txt file content        | `false`       | `true`  |
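
Since custom rules are added at the end of the file, a backend whose original robots.txt allows everything might, with the `customRules` example above and `overwrite` left at `false`, end up serving something like the following (a hypothetical result; the original file content here is an assumption):

```text
User-agent: *
Allow: /

User-agent: *
Disallow: /private/
```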

Development

Solution Libre's repositories are open projects, and community contributions are essential for keeping them great.

Fork this repo on GitHub

Contributors

The list of contributors can be found at: https://github.com/solution-libre/traefik-plugin-robots-txt/graphs/contributors
