---
title: 'Search Engines Indexation'
license: 'Apache-2.0'
origin_url: 'https://github.com/go-gitea/gitea/blob/e865de1e9d65dc09797d165a51c8e705d2a86030/docs/content/administration/search-engines-indexation.en-us.md'
---

## Search engine indexation of your Forgejo installation

By default, your Forgejo installation will be indexed by search engines.
If you don't want your repositories to be visible to search engines, read on.

### Block search engine indexation using robots.txt

To make Forgejo serve a custom `robots.txt` (by default an empty 404 is served) for top-level installations,
create a file at `public/robots.txt` within the `Custom File Root Path` displayed on the `/admin/config` page.

Examples of how to configure the `robots.txt` can be found at [https://moz.com/learn/seo/robotstxt](https://moz.com/learn/seo/robotstxt).

```txt
User-agent: *
Disallow: /
```
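
As a minimal sketch, the file above can be installed with a few lines of Python. The custom root path used here is hypothetical; on a real instance, substitute the `Custom File Root Path` shown on your `/admin/config` page.

```python
from pathlib import Path
import tempfile

def install_robots_txt(custom_root: Path, rules: str) -> Path:
    # Forgejo serves <custom root>/public/robots.txt at /robots.txt.
    target = custom_root / "public" / "robots.txt"
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(rules)
    return target

# Demo against a throwaway directory; on a real instance, pass the
# "Custom File Root Path" from /admin/config instead.
demo_root = Path(tempfile.mkdtemp())
written = install_robots_txt(demo_root, "User-agent: *\nDisallow: /\n")
print(written.read_text())
```

Remember to restart or reload Forgejo if your deployment caches custom files.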

If you installed Forgejo in a subdirectory, create or edit the `robots.txt` in the top-level directory instead:

```txt
User-agent: *
Disallow: /forgejo/
```
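
To check how a standards-following crawler interprets the rule above, Python's standard-library robots.txt parser can be used as a quick sketch (the paths are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Parse the subdirectory rules shown above.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /forgejo/",
])

print(rp.can_fetch("*", "/forgejo/myrepo"))   # False: under the Forgejo subdirectory
print(rp.can_fetch("*", "/other-site/page"))  # True: the rest of the host stays crawlable
```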
### Disallow crawling archives to save disk space

If archive files are crawled, they are generated on demand and kept
on disk, which can consume a significant amount of space. To prevent that
from happening, add the following to the `robots.txt` file:

```txt
User-agent: *
Disallow: /*/*/archive/
```
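
Note that this rule relies on `*` wildcards, which Python's standard `urllib.robotparser` does not expand (it does plain prefix matching). A minimal sketch of how wildcard-aware crawlers (per RFC 9309) interpret such a rule, assuming hypothetical repository paths:

```python
import re

def rule_to_regex(rule: str) -> re.Pattern:
    # Translate a robots.txt path rule into a regex, RFC 9309 style:
    # '*' matches any run of characters, a trailing '$' anchors the end.
    anchored = rule.endswith("$")
    body = rule[:-1] if anchored else rule
    pattern = ".*".join(re.escape(part) for part in body.split("*"))
    return re.compile(pattern + ("$" if anchored else ""))

def is_disallowed(path: str, rule: str) -> bool:
    # A rule applies when it matches at the start of the URL path.
    return rule_to_regex(rule).match(path) is not None

print(is_disallowed("/forgejo/myrepo/archive/main.zip", "/*/*/archive/"))  # True
print(is_disallowed("/forgejo/myrepo/src/branch/main", "/*/*/archive/"))  # False
```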

See also a more complete example [at Codeberg](https://codeberg.org/robots.txt).