View Issue Details

ID | Project | Category | View Status | Date Submitted | Last Update |
---|---|---|---|---|---|
5570 | Composr | galleries | public | 2024-01-21 15:53 | 2024-07-25 19:50 |

Field | Value |
---|---|
Reporter | Chris Graham |
Assigned To | Guest |
Priority | normal |
Severity | feature |
Status | new |
Resolution | open |
Summary | 5570: Generative AI poisoning |
Description | Consider optional integration of a library to poison uploaded images so that AI bots screw up their models if they try to use them: https://nightshade.cs.uchicago.edu/whatis.html |
Tags | Type: Anti-big-tech |
Time estimation (hours) | 4 |
Sponsorship open | |

Notes

Note 0008220 (PDStig, 2024-01-21 17:13):

A typical Composr user will likely not have a web server capable of running Glaze or Nightshade; these tools need quite a lot of data/RAM and a decent GPU. I would suggest we create some sort of AI protection / awareness documentation page covering Nightshade and Glaze and how to use them in conjunction with Composr, rather than implementing them in Composr directly. Alternatively, if we can find an external SaaS API that does this, we could incorporate that into Composr.
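
The external-API route mentioned in the note could look roughly like the following. This is a minimal sketch only: no specific SaaS provider is named in the issue, so the endpoint URL, the `poison_image` helper, and the bearer-token authentication are all hypothetical placeholders, and a real Composr integration would of course be written in PHP rather than Python.

```python
# Hypothetical sketch: the service, endpoint, and parameters below do not exist
# in the issue; they stand in for whatever external image-poisoning SaaS might
# eventually be chosen.
import requests

POISON_API_URL = "https://poison-service.example.com/v1/cloak"  # placeholder endpoint
API_KEY = "..."  # would come from Composr configuration

def poison_image(original_path: str, output_path: str, timeout: int = 120) -> None:
    """Send an uploaded image to an external poisoning service and store the result.

    Intended to run in the gallery upload pipeline after the file is saved but
    before thumbnails are generated.
    """
    with open(original_path, "rb") as fh:
        response = requests.post(
            POISON_API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": fh},
            timeout=timeout,
        )
    response.raise_for_status()
    with open(output_path, "wb") as out:
        # Assumes the service returns the perturbed image bytes in the response body.
        out.write(response.content)
```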

Issue History

Date Modified | Username | Field | Change |
---|---|---|---|
2024-01-21 15:53 | Chris Graham | New Issue | |
2024-01-21 15:54 | Chris Graham | Tag Attached: Type: Anti-big-tech | |
2024-01-21 17:13 | PDStig | Note Added: 0008220 | |
2024-01-21 17:14 | PDStig | Note Edited: 0008220 | |
2024-01-21 17:14 | PDStig | Note Edited: 0008220 | |
2024-01-21 17:16 | PDStig | Note Edited: 0008220 | |
2024-07-25 19:50 | Chris Graham | Relationship added | related to 5548 |