#3569 - Blocking nefarious crawlers via .htaccess changes
| Identifier | #3569 |
|---|---|
| Issue type | Feature request or suggestion |
| Title | Blocking nefarious crawlers via .htaccess changes |
| Status | Open |
| Tags | Type: Spam (custom) |
| Handling member | Deleted |
| Addon | General / Uncategorised |
| Description | I am not sure if this is suitable for a distributable CMS, but it does sound like a useful level of protection I don't think we currently have. The .htaccess method sounds the simplest, given I'm not sure how many end users will have root server access. Maybe some of the other snippets could be useful too, but primarily I think blocking bad crawlers would benefit security, bandwidth, and processing.
https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker |
| Steps to reproduce | |
| Related to | |
| Funded? | No |
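For context, the linked blocker works by matching request User-Agents against a maintained deny list in Apache configuration. A minimal sketch of the same idea in a standalone `.htaccess` file might look like the following; the bot names here are illustrative placeholders, not taken from the linked project's list:

```apache
# Illustrative sketch only: deny requests whose User-Agent matches
# a small deny list. Bot names below are placeholder examples.
<IfModule mod_rewrite.c>
    RewriteEngine On
    # [NC] = case-insensitive match; [F] = return 403 Forbidden; [L] = stop processing
    RewriteCond %{HTTP_USER_AGENT} (BadBotOne|BadBotTwo|EvilScraper) [NC]
    RewriteRule .* - [F,L]
</IfModule>
```

The real project maintains a much larger, regularly updated list and also covers referrer spam and IP ranges, which is part of why shipping it in a default configuration (versus pointing users at it in a tutorial) is a maintenance question.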


Comments
So essentially this is the addition of a block list to the default configuration.
I would probably solve this via a tutorial, to be honest, but there's a lot to read over; there certainly could be some good ideas.