#3569 - Blocking nefarious crawlers via .htaccess changes

Identifier #3569
Issue type Feature request or suggestion
Title Blocking nefarious crawlers via .htaccess changes
Status Open
Tags

Type: Spam (custom)

Handling member Deleted
Addon General / Uncategorised
Description I am not sure whether this is suitable for a distributable CMS, but it sounds like a useful level of protection we don't currently have. The .htaccess method seems the simplest, since I'm not sure how many end users will have root server access. Some of the other snippets could be useful too, but blocking bad crawlers in particular would benefit security as well as bandwidth and processing.

https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker
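For context, the general approach in the linked project can be sketched with a minimal .htaccess fragment like the one below. This is an illustrative example only, not the project's actual ruleset: the bot names are a few commonly blocked crawlers chosen for demonstration, whereas the linked blocker maintains a much larger, regularly updated list.

```apache
# Illustrative sketch of User-Agent-based bot blocking (Apache 2.4 syntax).
# Flag requests whose User-Agent matches known bad-bot patterns...
<IfModule mod_setenvif.c>
    SetEnvIfNoCase User-Agent "MJ12bot"    bad_bot
    SetEnvIfNoCase User-Agent "AhrefsBot"  bad_bot
    SetEnvIfNoCase User-Agent "SemrushBot" bad_bot
</IfModule>

# ...then deny any request carrying the bad_bot environment variable.
<IfModule mod_authz_core.c>
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</IfModule>
```

Because this works purely at the .htaccess level, it needs no root server access, which matches the concern raised in the description.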
Steps to reproduce

Related to

#3315 - Bundle default robots.txt

#415 - robots.txt editor (formerly: "Allow exclusion of contents from sitemap for all content types")

Funded? No
