Images depicting war-torn Ukraine are being generated by AI services, sold on stock photo websites and used in media coverage of the conflict.
Yeah, I tried to get that across with my phrasing… I'm not saying we need to change the technology. It's out there and it's too late anyway. Plus it's a tool, and tools can be used for various purposes; that's not the tool's fault. I'm also not arguing to change how kitchen knives, axes, etc. work, despite their potential to do harm…
But it doesn't need to be 100% watertight, or else we can't do anything at all. I'm also not keeping my knife collection on the living room table when a toddler is around, but I don't need to lock it in a vault either… I think we can go 90% of the way and help 90% of people, and that's better than doing nothing because we strive for total perfection… I keep the bleach and knives somewhere kids can't reach. Likewise, we could say AI services need to filter images of children (I think the big ones already do) and embed invisible watermarks in all AI-generated content. If anyone decides to circumvent that, that's on them. But at least we'd have addressed the majority of the very easy misuse.
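To make "invisible watermark" concrete, here's a minimal toy sketch of the idea: hide a machine-readable "AI-generated" tag in the lowest bits of pixel values, where the eye can't see it. This is an illustration only, not what any real service does; production schemes (like the robust, frequency-domain or model-level watermarks the big providers talk about) are designed to survive compression and editing, which this toy does not. The tag string and function names are made up for the example.

```python
# Toy least-significant-bit watermark: embed and detect a short tag in a
# list of 0-255 pixel values. Purely illustrative; real AI-content
# watermarks are far more robust than this.

WATERMARK = "AI-GEN"  # hypothetical tag for the example

def embed(pixels: list[int], tag: str = WATERMARK) -> list[int]:
    # Unpack the tag into individual bits (least-significant bit first).
    bits = [(byte >> i) & 1 for byte in tag.encode() for i in range(8)]
    out = pixels.copy()
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & ~1) | bit  # overwrite the lowest bit
    return out

def detect(pixels: list[int], tag: str = WATERMARK) -> bool:
    # Read back the lowest bits and reassemble them into bytes.
    bits = [p & 1 for p in pixels[: len(tag) * 8]]
    recovered = bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(len(tag))
    )
    return recovered == tag.encode()

if __name__ == "__main__":
    image = [(i * 7) % 256 for i in range(64)]  # fake 8x8 grayscale image
    marked = embed(image)
    print(detect(marked))  # True: the tag is present
    print(detect(image))   # False: unmarked image, nothing to find
```

The point of the sketch is just that the change is imperceptible to a viewer but trivial for software to check, which is why "services must watermark their output" is a cheap requirement that still catches the casual cases, even if a determined person can strip it.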
And that's already how we do things. Spam filters, for example, aren't 100% accurate, and we use them nonetheless.
(And I'm only arguing about service providers, since that's what the majority of people use, and I think those should be forced to do it. The models themselves should be free. Otherwise, we put a very disruptive technology solely in the hands of a few big companies… And if AI is going to change the world as much as people claim, that's bound to lead us into some sci-fi dystopia where the world revolves around the interests of a few big corporations… We don't want that, so AI tech needs to be shaped by more than just Meta and OpenAI. IMO that means giving the public access to the technology.)