Abovethefold@lemmy.ml to Privacy@lemmy.ml · 6 months ago
Apple pulls AI image apps from the App Store after learning they could generate nude images (ptv-news.com.pk)
Coasting0942@reddthat.com · 6 months ago
Depends on their legal status. Could they get sued by a victim?

potentiallynotfelix@lemmy.ml · 6 months ago
There wouldn't be a victim; it's AI.

Coasting0942@reddthat.com · 6 months ago
A minor who gets her face turned into porn wouldn't be able to sue because it's not Photoshop, it's AI. /s