---
title: Search Engines Indexation
license: Apache-2.0
origin_url: e865de1e9d/docs/content/administration/search-engines-indexation.en-us.md
---

# Search engines indexation of your Forgejo installation

By default, your Forgejo installation will be indexed by search engines. If you don't want your repositories to be visible to search engines, read on.

## Block search engines indexation using robots.txt

To make Forgejo serve a custom `robots.txt` (by default, an empty 404 is served) for top-level installations, create a file at the path `public/robots.txt` within the Custom File Root Path, as displayed on the `/admin/config` page.

Examples of how to configure the `robots.txt` can be found at https://moz.com/learn/seo/robotstxt. For example, to block all crawlers from the entire site:

```txt
User-agent: *
Disallow: /
```
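As a minimal sketch, assuming the Custom File Root Path is `/var/lib/forgejo/custom` (a placeholder; check `/admin/config` for your instance's actual value), the file above could be created like this:

```shell
# /var/lib/forgejo/custom is an assumed example path; the actual
# "Custom File Root Path" is shown on your /admin/config page.
CUSTOM_ROOT=/var/lib/forgejo/custom

# Create the public/ directory if it does not exist yet.
mkdir -p "$CUSTOM_ROOT/public"

# Write a robots.txt that blocks all crawlers from the whole site.
cat > "$CUSTOM_ROOT/public/robots.txt" <<'EOF'
User-agent: *
Disallow: /
EOF
```

Forgejo serves files under `public/` in the custom directory directly, so no restart is needed for the new `robots.txt` to take effect.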

If you installed Forgejo in a subdirectory, you will need to create or edit the `robots.txt` in the top-level directory of the domain. For example:

```txt
User-agent: *
Disallow: /forgejo/
```

## Disallow crawling archives to save disk space

If archive files are crawled, they will be generated on demand and kept around, which can consume a large amount of disk space. To prevent that from happening, add the following to the `robots.txt` file:

```txt
User-agent: *
Disallow: /*/*/archive/
```
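In the widely supported robots.txt wildcard extension, `*` matches any sequence of characters, so this rule covers the archive endpoint of every repository (two path segments: owner and repository name). A quick sanity check of the equivalent regular expression, with illustrative repository paths:

```shell
# 'Disallow: /*/*/archive/' corresponds to the extended regex
# '^/.*/.*/archive/' (each '*' matches any sequence of characters).
pattern='^/.*/.*/archive/'

# An archive URL matches the rule and would not be crawled.
echo "/forgejo/forgejo/archive/v1.0.zip" | grep -Eq "$pattern" && echo "blocked"

# A regular repository page does not match and stays crawlable.
echo "/forgejo/forgejo/releases" | grep -Eq "$pattern" || echo "allowed"
```

Note that only crawlers implementing the wildcard extension (most major search engines do) will honor a `Disallow` line containing `*`.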

See also a more complete example at Codeberg.