At 12/18/23 02:14 AM, HenrySanchezNG wrote:
You may be right that the scraping will happen anyway, but I feel an extra layer of protection gives more power to the individual. For example, if somebody comes up with a unique way of making art and somebody else trains a model on it, that person won't be able to talk about it so openly. Plus it's an extra layer for winning a legal battle, and a way to demand royalties or compensation.
CivitAI has communities where people can hire others to train models on an artist's style. It is not legal, yet it still happens, and the host has no plans to stop it because it's financially profitable. It is what it is.
There are also already AI users saying you must scrape and train before Nightshade and the EU's new laws come into effect. So if you have ever heard of the 'dead internet theory', it is no longer a conspiracy theory; it is already here today. For instance, one artist said that when she googled her own name with the new Google search (available in the US for select accounts), Google's image search generated a fake image in her style and attributed it to her. And then we have the 'style cannot be copyrighted' / 'AI works cannot be copyrighted' loop: someone steals an artist's style, but their own output isn't copyrighted either, and then you go read X posts where generative AI trolls are upset that someone used their prompts. Clown world.
I personally have made peace with it. I touch more grass and do more 3D prints and props for myself, and let the people in the fake world have their fake images and realities.
Remember, AI doesn't only steal images; it steals people's likenesses, voices, music, and so on. We have already had cases where a voice actor's voice was used without his consent by a fan in a game mod, and the VA was pretty shocked: the case of Elias Toufexis, whose voice was used to add extra lines to Adam Jensen. He was not amused. Not at all. We also have people stealing singers' voices and making them sing whatnot, or deepfaking people saying things they never said.
A new scam is commonly circulating where a person's likeness and voice are stolen from social media, and then calls are made to their family claiming they need money sent to X account. Or revenge ... po...eee...... adult videos are made with deepfakes, because social media holds enough data to mimic someone.