# Managing robots.txt

{! backend/administration/CLI_tasks/general_cli_task_info.include !}

## Generate a new robots.txt file and add it to the static directory

The `robots.txt` that ships by default is permissive. It allows well-behaved search engines to index all of your instance's URIs.

If you want to generate a restrictive `robots.txt`, you can run the following mix task. The generated `robots.txt` will be written to your instance's [static directory](../../../configuration/static_dir/).

=== "OTP"

    ```sh
    ./bin/pleroma_ctl robots_txt disallow_all
    ```

=== "From Source"

    ```sh
    mix pleroma.robots_txt disallow_all
    ```