Sapphire Velvet@lemmynsfw.com to Technology@lemmy.world · English · 1 year ago
Child sex abuse images found in dataset training image generators, report says (arstechnica.com)
The report: https://stacks.stanford.edu/file/druid:kh752sm9123/ml_training_data_csam_report-2023-12-20.pdf
Sapphire Velvet@lemmynsfw.com (OP) · 1 year ago
They’re not looking at the images, though. They’re scraping. And their own legal defenses rely on them not looking too carefully, or else they cede their position to the copyright holders.
snooggums@kbin.social · 1 year ago
Technically they violated the copyright of the CSAM creators!