---
title: 'Search Engines Indexation'
license: 'Apache-2.0'
origin_url: 'd3982bcd81/docs/content/administration/search-engines-indexation.en-us.md'
---

# Search engines indexation of your Forgejo installation

By default, your Forgejo installation will be indexed by search engines. If you don't want your repositories to be visible to search engines, read on.

## Block search engines indexation using robots.txt

To make Forgejo serve a custom `robots.txt` (by default an empty 404 is returned) for top-level installations, create a file at the path `public/robots.txt` within the Custom File Root Path displayed on the `/admin/config` page.

Examples of how to configure a `robots.txt` can be found at https://moz.com/learn/seo/robotstxt. The following, for instance, blocks all crawlers from the whole installation:

```
User-agent: *
Disallow: /
```
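
Once the file is in place, you can check that your instance serves it instead of the default empty 404. The sketch below is only illustrative and assumes a hypothetical host name:

```python
# Minimal check that the custom robots.txt is served; forgejo.example.com
# is a placeholder for your own instance.
import urllib.error
import urllib.request

URL = "https://forgejo.example.com/robots.txt"
try:
    with urllib.request.urlopen(URL) as resp:
        print(resp.status)            # 200 once public/robots.txt is picked up
        print(resp.read().decode())   # should echo the rules you created
except urllib.error.HTTPError as err:
    print(f"Custom robots.txt not served yet: HTTP {err.code}")
```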

If you installed Forgejo in a subdirectory, you will need to create or edit the `robots.txt` in the top-level directory instead, for example:

```
User-agent: *
Disallow: /forgejo/
```
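
To sanity-check rules like these without deploying anything, you can evaluate them offline with Python's standard-library robots.txt parser; the rules and paths below simply mirror the example above:

```python
# Evaluate the subdirectory rules offline with the standard library parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /forgejo/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
print(parser.can_fetch("*", "/forgejo/owner/repo"))  # False: the subdirectory is blocked
print(parser.can_fetch("*", "/anything-else"))       # True: the rest of the site stays crawlable
```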

## Disallow crawling archives to save disk space

If archive files are crawled, they are generated dynamically and kept around afterwards, which can add up to a lot of disk space. To prevent that from happening, add the following to the `robots.txt` file:

```
User-agent: *
Disallow: /*/*/archive/
```
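
The `*` wildcards are understood by the major crawlers but are not part of every robots.txt implementation (Python's standard parser, for example, only does prefix matching). As a rough, purely illustrative sketch of which paths the rule is meant to cover, here is a hypothetical regex approximation rather than a real robots.txt parser:

```python
# Rough approximation of the paths /*/*/archive/ is intended to block
# (owner/repo/archive/...); real crawlers apply their own wildcard rules.
import re

archive_rule = re.compile(r"^/[^/]+/[^/]+/archive/")

for path in ("/owner/repo/archive/v1.0.tar.gz", "/owner/repo/releases"):
    print(path, "->", "disallowed" if archive_rule.match(path) else "allowed")
```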

See also the more complete example served by Codeberg at https://codeberg.org/robots.txt.