- cross-posted to:
- privacy@programming.dev
- privacy@lemmy.ml
Google confirms its latest update can scan all your photos to “use actual images of you and your loved ones” in AI image generation. That means Gemini sees who you know and what you do. You likely have tens or hundreds of thousands of photos. They’re all exposed if you update.

An AI update that is ‘opt in’ and not ‘opt out’?
Truly Shocking!
For now! Probably a legal requirement.
Nothing stopped GitHub from forcing me to opt out of sharing all my code with their AI, so I don’t think it’s a legal thing?
I think they’re treated differently in law. Your employer can claim all your works while you’re employed, and you agreed to GitHub’s terms, so I’m pretty sure they can lay claim to your work as well.
Your likeness has more legal protection, I believe.