View Issue Details

ID: 5570
Project: Composr
Category: galleries
View Status: public
Last Update: 2024-07-25 19:50
Reporter: Chris Graham
Assigned To: Guest
Priority: normal
Severity: feature
Status: new
Resolution: open
Summary: 5570: Generative AI poisoning
Description: Consider optional integration of a library to poison uploaded images so that AI bots screw up their models if they try to train on them:
https://nightshade.cs.uchicago.edu/whatis.html
Tags: Type: Anti-big-tech
Time estimation (hours): 4
Sponsorship: open

Relationships

related to 5548 - Allow easy GPT crawl blocking (Not Assigned, Guest)

Activities

PDStig (administrator)   ~8220
2024-01-21 17:13
Last edited: 2024-01-21 17:16

A typical Composr user will likely not have a web server capable of running Glaze or Nightshade. These tools require quite a large amount of data / RAM and a decent GPU.

I'd suggest we make some sort of AI protection / awareness documentation page that covers Nightshade and Glaze and explains how to use them in conjunction with Composr, rather than implementing them in Composr directly.

Or, if we can find an external SaaS API that does this, we could incorporate that into Composr.
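
To illustrate the SaaS route, here is a minimal sketch of what an upload-time hook could look like: the original image is POSTed to an external poisoning service and the returned copy is stored, with a fallback to the untouched original if the call fails. No public Nightshade/Glaze API is known to exist at the time of writing, so the endpoint URL, API key and poison_image helper below are hypothetical placeholders rather than a real service.

# Hypothetical sketch of an upload-time hook that sends an image to an
# external "poisoning" SaaS and saves the protected copy it returns.
# The endpoint URL and API contract are placeholders -- no public
# Nightshade/Glaze API exists at the time of writing.

import requests

POISON_API_URL = "https://example-poisoning-service.test/v1/poison"  # hypothetical
API_KEY = "YOUR_API_KEY"  # hypothetical credential


def poison_image(original_path: str, poisoned_path: str, timeout: int = 120) -> bool:
    """POST the uploaded image to the external service and write the
    poisoned result to poisoned_path. Returns False on any failure so
    the caller can fall back to serving the unmodified original."""
    try:
        with open(original_path, "rb") as fh:
            response = requests.post(
                POISON_API_URL,
                headers={"Authorization": f"Bearer {API_KEY}"},
                files={"image": fh},
                timeout=timeout,
            )
        response.raise_for_status()
    except (OSError, requests.RequestException):
        return False

    with open(poisoned_path, "wb") as out:
        out.write(response.content)
    return True


if __name__ == "__main__":
    # Example: run against a gallery upload before it is published.
    ok = poison_image("upload.jpg", "upload_poisoned.jpg")
    print("poisoned" if ok else "left original untouched")

Since the remote call adds latency and can fail, making this an optional per-gallery setting (and queuing it rather than running it inline with the upload) would probably be the safer design.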


Issue History

Date Modified Username Field Change
2024-01-21 15:53 Chris Graham New Issue
2024-01-21 15:54 Chris Graham Tag Attached: Type: Anti-big-tech
2024-01-21 17:13 PDStig Note Added: 0008220
2024-01-21 17:14 PDStig Note Edited: 0008220
2024-01-21 17:14 PDStig Note Edited: 0008220
2024-01-21 17:16 PDStig Note Edited: 0008220
2024-07-25 19:50 Chris Graham Relationship added: related to 5548