#415 - robots.txt editor (formerly: "Allow exclusion of contents from sitemap for all content types")
Having an editor for robots.txt inside Composr would not hurt.
A simple robots.txt editor is now implemented for v11, which will make editing robots.txt a little easier.
I thought about what was written here about keeping robots.txt and the XML Sitemap in sync, or about adding per-content options to exclude content from both robots.txt and the XML Sitemap. The problem is that this assumes a binary choice: either content is crawlable by everything, or by nothing. In reality, robots.txt lets you specify which crawlers have access to which content, and the XML Sitemap is not specifically for crawlers anyway (it could be consumed by an HTML validation tool, for example). So a single on/off switch per content item doesn't line up well with the flexibility these formats require.
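To illustrate the mismatch: robots.txt rules are scoped per crawler, so a single "exclude this content" flag could not express something like the following (the bot name and paths are hypothetical, just to show the shape of the format):

```
# Block one specific crawler from a section entirely
User-agent: SomeBot
Disallow: /forum/

# All other crawlers may access everything
User-agent: *
Disallow:

# The sitemap location is advertised independently of the rules above
Sitemap: https://example.com/sitemap.xml
```

A per-content checkbox would have to collapse all of that into one bit, losing the per-crawler granularity entirely.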
Because a per-content implementation would be problematic, I think the correct approach is simply to put robots.txt in the hands of the user while helping them edit it. It's easier for us to maintain, too.