• 0 Posts
  • 242 Comments
Joined 1 year ago
Cake day: June 16th, 2023





  • AI use for defamatory purposes, such as the deepfake porn mentioned in another post here, applies whether one is a massive celebrity or a regular person. As the technology becomes more common, don’t you think there will be people using it on their school and work colleagues and neighbors, for a variety of petty reasons?

    You talk about how horrible it would be for people to sell their likeness, without considering that without such laws and protections they can just have their likeness taken with no consent or compensation.

    I am seeing a lot of grandstanding of how these laws are just the powerful taking rights away from the common man, but it seems to be exclusively from the angle of how that affects the AI user, not the regular people whose likenesses might get used by AI.

    To be fair, there’s good reason to be careful about how this matter is legislated, as media companies love to use any excuse for overreach. But the solution is not leaving the internet a wild west where people smear each other with faked videos.

    Consider that the advent of the camera created a need for many laws, because before then even the most realistic image was known to be fabricated rather than a replica of reality. Now AI and other new media technologies are creating possibilities which we never had before, for which our previous laws are insufficient.




  • Other than vague slippery-slope fearmongering, I don’t see how banning the creation and distribution of deepfake porn is going to make AI monopolized by corporations. If you have your own personally trained and run AI model, you have complete control over what sort of content it’s generating. Why would you have issues with deepfake porn laws if you are not generating and hosting that content?

    It just doesn’t add up; there’s some logical leap here that seems almost on the level of conspiracy theories. As much as governments do tend to favor corporations over regular people, there is nothing so far even vaguely suggesting that AI would be so profoundly restricted that only corporations could use it. In fact, what has been described of the proposals so far does not target the technology at all, only the users who engage in this kind of bad conduct.

    But I profoundly disagree with this “nothing to be done about it” attitude. How would fighting it be worse than letting people suffer from it? It’s not like drugs, where the main person who might have issues is the user themselves; this affects unrelated, vulnerable people.

    If it is identified who is making deepfake porn and where it’s being hosted, it can be taken down. You could argue that not every single responsible person will be identified, but it might still be enough to diminish the prevalence and the number of victims. And to the extent that the remaining ones will have to be sneaky about it, that still might lead to less harassment of the victims.

    You compare it to the war on drugs. Meanwhile I think of the rise of the automobile, with people crying that seat belts and traffic lights were ruining their freedom and “there’s nothing to be done” about people dying in car crashes.



  • Welp, you deleted the one I had replied to and cut off my response. I had replied this:

    Deepfakes don’t happen by accident. It’s also not “perfectly legal” to alter and distribute a photo you have no permission to use.

    Your argument essentially seems to be that because people will try to find ways around it, no law should be created and no action taken to prevent it. Is that right? Because this could be said of pretty much any law, and it isn’t a particularly compelling argument. Part of enforcing the law is getting around the tricky ways people try to disguise their actions.

    Never mind that this proposed law is supposed to protect the victims who are harassed because of it. If it were so invisible, they wouldn’t be suffering.

    If this will eventually lead to AI models getting limitations to prevent people from using them for deepfake porn… Good? Who loses, beyond the people trying to make deepfake porn?




  • You do have a point about the excesses of police work, but if you want to talk about empathy, you should also consider the position of the kid who is harassed and traumatized over something they didn’t even have any say in. There is some discussion to be had over what degree of punishment is appropriate, and the need to limit police brutality goes well beyond this particular matter.

    But as far as demanding that every such work be taken down, and giving vulnerable people the means to demand so without exposing themselves further, that is perfectly reasonable.

    Realistically, it’s about as likely to happen as kids who “get into fights” being prosecuted for assault. Kids tell mean lies about each other, but that is not resolved in civil suits over defamation. Even between adults, that’s not the usual thing.

    Except that in the case of deepfake porn, it’s not a matter of fuzzy two-sided conflicts. One side is creating the whole problem, and the other side is just the victim of it despite not being involved in any way. That’s the whole point of a deepfake. The most that lies might play into it is if the porn turns out to be real, and in that case there is even more reason to take it down.

    Civil suits under this bill would be mainly targeted against internet services, because they have the money. And it would largely be used over celebrity fakes. That’s the overwhelming part of fakes out there and they have the money to splurge on suing people who can’t pay. It would be wealthy, powerful people using it against horny teens.

    Gotta say I have a hard time feeling sorry for the people who can’t be satisfied by the frankly immense amount of porn we have, and decided that they absolutely must have porn of that one specific person who never consented to it. Maybe they are wealthy and powerful, sure. Does that mean it’s a free pass to fabricate deepfake porn with their likenesses? I don’t think so. Nobody is owed that. As much as you insist that it will be used by the powerful against the poor masses, it still seems to me that whatever regular dude decides to do it is crossing serious boundaries. This is not a brave freedom fighter, it’s just an asshole.

    I think most likely what will happen is that these internet services will just take those down. As they should.


  • Nah, making deepfake porn illegal doesn’t require making all of AI illegal. As proposed, this law would apply neither to candid photography nor to entirely imaginary AI porn. As proposed, it’s targeting those generating and distributing such images rather than the technology itself, and giving victims means to defend themselves against being publicly humiliated.

    It could be handled much like any matter of copyright is: anyone hosting and sharing it must take it down or face the punishment.

    Technology allows many things to be done quickly and easily, but whether they are legal and protected is a whole different matter. The models can be as good as they like and as quick as copying a file; that doesn’t mean people won’t be sued over what they do with them.

    It seems a bit questionable to assume that everything that is technologically possible ought to be permitted, no matter who is harmed. And frankly this is much more harmful than any piracy or infringement.


  • Don’t you know? People already do have rights over their likeness, and we already have laws regarding that. To some extent you are allowed to record public locations and events, and you don’t need to seek permission from every passerby. But it doesn’t mean you can record people and use their images in every location and situation.

    Not to mention, we are talking about deepfakes made to look like specific people. I don’t think you are going to accidentally pass by someone’s deepfake porn while taking selfies on the street, so there’s not much point in bringing this up.



  • I’m as suspicious of “think of the children” stuff as anyone here, but I don’t see how we are fighting for the rights of the people by defending non-consensual deepfake porn impersonation, of children or of anyone.

    If someone makes deepfake porn of my little cousin or Emma Watson, there’s no scenario where this isn’t a shitty thing to do to a person, and I don’t see how the masses are being oppressed by this being banned. What, do we need to deepfake Joe Biden getting it on to protest against the government?

    Not only does the harassment of being subjected to something like this seem horrible, it’s also reasonable to say that people ought to have rights over their own likeness, no? It’s not even a matter of journalistic interest, because it’s something completely made up.



  • We can be less bigoted than the past while also still having a long way to go. You could even count it as a sign of this improvement that these issues are taken seriously and discussed, rather than dismissed as “just the way things are”.

    But we can’t take it for granted, because progress is not guaranteed and equality can decline. Take, for example, abortion rights in the US and, consequently, how pregnancies are policed, leading to possible arrests even over natural miscarriages.

    If you acknowledge that we aren’t finished fighting bigotry, I don’t really understand what your concern here is.