Nightshade: Data Poisoning to Fight Generative AI with Ben Zhao - 668
The TWIML AI Podcast with Sam Charrington

 Published On Jan 22, 2024

Today we’re joined by Ben Zhao, Neubauer Professor of Computer Science at the University of Chicago. In our conversation, we explore his research at the intersection of security and generative AI, focusing on Ben’s recent Fawkes, Glaze, and Nightshade projects, which use “cloaking” and “poisoning” approaches to protect users against AI encroachment. The first tool we discuss, Fawkes, imperceptibly “cloaks” images in such a way that models perceive them as highly distorted, effectively shielding individuals from facial recognition systems. We then dig into Glaze, a tool that computes subtle alterations, indiscernible to human eyes but adept at tricking models into perceiving a significant shift in art style, giving artists a unique defense against style mimicry. Lastly, we cover Nightshade, a strategic defense tool for artists akin to a “poison pill,” which lets artists apply imperceptible changes to their images that effectively “break” generative AI models trained on them.
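The common idea behind these tools is to compute a small, bounded perturbation that leaves an image looking unchanged to a person while shifting how a model’s feature extractor perceives it. The toy sketch below illustrates that idea only: it uses a fixed random linear map as a stand-in for a real model’s feature extractor and plain projected gradient descent, none of which reflects the actual Fawkes, Glaze, or Nightshade implementations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a model's style-feature extractor. The real tools target
# the feature space of an actual face-recognition or text-to-image model;
# a fixed random linear map is used here purely for illustration.
W = rng.normal(size=(8, 64))

def features(x):
    return W @ x

def cloak(image, target_feats, eps=0.05, steps=300, lr=0.001):
    """Find an L-infinity-bounded perturbation (|delta| <= eps per pixel)
    that pulls the image's features toward a decoy 'style' target.
    Illustrative projected gradient descent, not the published algorithms."""
    delta = np.zeros_like(image)
    for _ in range(steps):
        # Gradient of ||features(image + delta) - target||^2 w.r.t. delta
        residual = features(image + delta) - target_feats
        grad = 2 * W.T @ residual
        delta -= lr * grad                  # descend on the feature-space gap
        delta = np.clip(delta, -eps, eps)   # keep the change imperceptible
    return delta

image = rng.uniform(0, 1, size=64)             # flattened toy "image"
target = features(rng.uniform(0, 1, size=64))  # features of a decoy style

delta = cloak(image, target)
before = np.linalg.norm(features(image) - target)
after = np.linalg.norm(features(image + delta) - target)
print(np.abs(delta).max() <= 0.05, after < before)
```

The perturbation stays within a tiny per-pixel budget, yet the image’s feature representation moves measurably toward the decoy style, which is the intuition behind both style cloaking (Glaze) and training-data poisoning (Nightshade).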

🔔 Subscribe to our channel for more great content just like this: https://youtube.com/twimlai?sub_confi...


🗣️ CONNECT WITH US!
===============================
Subscribe to the TWIML AI Podcast: https://twimlai.com/podcast/twimlai/
Join our Slack Community: https://twimlai.com/community/
Subscribe to our newsletter: https://twimlai.com/newsletter/
Want to get in touch? Send us a message: https://twimlai.com/contact/


📖 CHAPTERS
===============================
00:00 - Research background
03:15 - Fawkes
10:59 - The impact on artists’ identity and livelihood
15:46 - Movement on certified human-created art
17:08 - New classes of social problems
19:52 - Glaze
24:10 - Challenges
30:11 - Nightshade
33:36 - Training datasets
36:28 - The company data collection process and its role in the attack mechanism


🔗 LINKS & RESOURCES
===============================
Nightshade - https://nightshade.cs.uchicago.edu/
Glaze: Protecting Artists from Style Mimicry by Text-to-Image Models - https://people.cs.uchicago.edu/~raven...

For a COMPLETE LIST of links and references, head over to https://twimlai.com/go/668.


📸 Camera: https://amzn.to/3TQ3zsg
🎙️Microphone: https://amzn.to/3t5zXeV
🚦Lights: https://amzn.to/3TQlX49
🎛️ Audio Interface: https://amzn.to/3TVFAIq
🎚️ Stream Deck: https://amzn.to/3zzm7F5
