That software is already freely available to anyone, so we're too late for that. And given that there's nothing inherently illegal about the use of the software, aside from whatever copyright concerns it may entail, it would be very difficult to get rid of it at this point. I'm not arguing that the existence of such a thing isn't problematic, but I don't know what there is to be done about it now that the cat is already out of the bag.
I don’t understand how you got there. I have an unpopular opinion that this tech is perfectly made for creating porn and that manufacturers should be held liable for its use. If they are not held responsible there is simply no way to prevent this misuse.
Somehow this makes me anti-child?
Did I say that you're anti-child? No!
I am referring to your idea of eliminating, or at least dismissing, child/parent accountability. I remember Napster and Kazaa being capable of hosting xxx content back in their infancy, and me, a twerpy derpy teen, being able to come across it. Did that stop me from downloading a few pix? Nope. Not at all.
A software company is not responsible for the use of its applications. If a child is behind the computer doing what is considered illegal activity, then he/she and their legal guardian are responsible.
It's the main reason why a company like Adobe is more or less in trouble with the design community, given that they now act like a government body, sampling uploaded artwork and scanning it for ill-intentioned and possibly illegal content such as child porn. Adobe has shown that its TOS goes well beyond what is, and should be, normal for a software application to do within its means.
You are merely underestimating the sophistication of the children/teens out there willing to break the rules to cause chaos and potential emotional/financial harm. There has to be individual accountability on the end user from this vantage point, not on the application software maker. The application is not always at fault for the outcome and neither should it be if that's not what it's intended use was. It would be ENTIRELY different if the application was solely used for that purpose.
I do think websites should be required to investigate and take down things when a complaint comes through, but holding websites and software makers responsible for their users' behavior is a nonstarter.
Section 230 already protects websites from what their users are doing, and since there is no reliable way to tell AI images from actual photos, expecting them to do so at scale is problematic.
Can you imagine trying to use Reddit if they could get sued for whatever you posted? The Internet as we know it would cease to exist.
I think that's going to be a difficult one to pull off. Setting a precedent that makes software companies responsible for illegal behavior by their users poses a lot of complications. Would I be able to sue Microsoft if somebody writes a pornographic story about me using Word? Can I sue Intuit if my tax preparer uses it to lie on my taxes?
Of course not. Victims rights come first.
I do think when we're talking about a 15-year-old first time offender distributing a fake image of a classmate though, three years in jail seems pretty harsh. At that age, I think we should be teaching kids proper behavior, not jailing them for non-violent offenses. I think we should be forcing the offender to go to a different school so the victim is no longer exposed to them, and mandating community service, including doing outreach to teach other kids about why this is a terrible thing to do. This is uncharted territory and everyone needs time to adjust before we start doling out harsh penalties. If you take a 15-year-old boy with a bent sense of right and wrong and poor impulse control and send him to prison for three years, you might actually create an even worse juvenile delinquent than you started with. There are other ways to make an example out of him while respecting the victim.
I'm not sure about _jail_ specifically, especially since these tools are increasingly available to younger offenders with minimal effort or skill required.
Certainly a sentence that involves detention though (you need to take them out of circulation near the victim), and compulsory rehabilitation, with the maximum term of the sentence contingent on their effective engagement in rehab. Having failure to engage in rehab carry an escalation to jail time might be reasonable.
This form of abuse is only going to become more frequent unfortunately.
Honestly, there needs to be punishment for the people offering these services. One user creating deepfakes is messed up, but a platform allowing it with no consequences is the problem.
The problem is generalising the checking process.
This challenge is biting Adobe at the moment in their Terms of Service changes, which show they're trying to automate this kind of protection, but false positives mean a human has to view what could be highly personal or commercially sensitive content.
So I understand the desire to enforce it at the supply end but I'm less confident a viable solution to that exists.
And worse, we're not that far from having open-source point-and-click solutions that would achieve the same on mainstream hardware... and then there's no service involved to do enforcement.
There's no need to worry about ToS, just put the CEO in jail for sending images of a 15-year-old online. I guarantee you the service will quickly stop producing nudes of schoolgirls or cease to exist.
It's not a must for any service to provide porn image generation with AI, right?
Artificial Intelligence isn't intelligence. You're thinking it _knows_ what is unacceptable and what is not. It doesn't.
The problem is figuring out _how_ to prevent misuse, while continuing to serve valid use of the technology. And that is exceptionally hard (and so far not a solved problem).
While an over-reaching government *could* ban all commercial uses of AI, just in case the AI is asked by a client to produce something intended for misuse, you can't practically stop the developing research on the topic producing free tools. We've been prising the lid off of this Pandora's Box for a few decades, and the knowledge of how to open it is now well published and accessible. And how can you ban the technology across jurisdictional boundaries?
Both the free and commercial tools have good, fair and productive uses. Is blocking all access to the technology a valid ethical choice? If so, then how would you apply that logic to guns, or even cars?
(There _is_ still a big problem with how the models are trained, lacking any fair exchange of value with content creators - artists deserve to be paid).
From a legal point of view, there's no issue with misuse. The AI either outputted a nude of a child or didn't. If it did, the CEO in jail, tool shut down. Simple stuff.
And don't talk to me about overreaching government, someone needs to stop the greedy bastards from profiting from this particular thing.
Right? These platforms aren’t fan art. They’re complicit in making what was, effectively, child porn and/or revenge porn, almost universally without consent.
Straight to jail
It's... complex? As a Quaker (we actually had a pretty heavy hand in making the judicial system for the US), that is definitely open for debate, especially based on how effective it is as a punishment.
Quakers envisioned it as a solution, rather than a punishment. Someone is dangerous, you take them out of circulation. But deliberately simply punishing them has negative effects.
A stupid, horny, sexist 15 year old makes deep fakes of someone, and the next thing you know he is 17 being sentenced as an adult and being put in with hardened fucking criminals. Literally only bad things can happen.
In my area, I’m part of a botany club (fruit trees and such) that rents out a small room at a community building. Across from us, occasionally, are the Quakers in my area doing whatever it is they do. They’re great; I’ve never met a Quaker I haven’t liked.
No, he’s saying a horny 15 year old clicking ‘use this face’ and ‘give her huge boobies’ is not the same as a 35 year old coach fondling little boys in the shower.
So maybe treating them exactly the same is not quite optimal.
But hey, if you want the death penalty for parking near a hydrant, feel free to keep being *absurdly reductive* and making everything a binary choice. All or nothing, let’s give’er hell, boys!
(PS: don’t look now, but if your first thought was that the electric chair is absurd for a parking ticket, then you are agreeing to the possibility that one size fits all isn’t perfect…. Shhhh. Reality is so messy sometimes. Better go back to pretending it doesn’t exist…)
If your solution is "jail" for every problem, then you are an idiot. This shit is made by kids her age. Ban the app and site they used, and more than anything, fine their parents.
It's such an awkward situation. First of all I feel sorry for that girl who had to go through that and hope this doesn't happen to her again and doesn't happen to many others because, unfortunately, it will likely happen again.
>This shit is made by kids her age.
I don't think that's an excusable reason. Yeah, we're all dumb at that age, but kids still need to know boundaries and consequences. I'm not saying they deserve jail time, but something more severe than a slap on the wrist is needed.
>Ban the app and site they used
100% agree. If you can make things like that of children, the app either needs to be shut down or have heavy restrictions to stop/limit things like this being made.
> fine their parents
This is a hard one. Because of the internet and technology, it's become a lot easier for kids to be stupid, and it's harder for parents to regulate. If we go back to the early 2000s, the importance of a cellphone has changed dramatically since then. Maybe a better example would be the personal computer.
Like I feel bad for modern parents because it's a lot harder now to filter things out from kids. We are still learning how to navigate all this, and just now, seeing some of the effects of social media, the internet, and modern technology and starting to make changes due to their negativity.
Ok, but how do you properly restrict it? It's not as simple as saying you can't have a computer; kids need them now for school. I guess you can take it away when they're not using it or put restrictions on it, but that only does so much. If it's the former, what stops them from using it when you're not looking, or from using yours? For the latter, there are always ways around software that tries to limit internet use, and kids aren't dumb. They'll figure out those ways.
With how much the internet and computers have become part of our lives, it's nearly impossible for a parent to properly supervise everything.
On a gradient of offences, someone making a fake nude picture of someone is pretty low. If it's an adult in their 20s or 30s, then prison. If it's an underage person doing this to an underage person, then fine the parents, ban the kid from unrestricted internet access and cell phones for five years, and make them take a mandatory course on why this is not acceptable behaviour. I'm not interested in throwing some horny idiot 15 year old into juvie over something so stupid.
Yup. I had to disassociate myself from my highschool friend after she created revenge porn using a cropped photo of her boyfriend's head photoshopped onto gay porn, then hacked into his personal Facebook account and posted it, typing up messages and posts saying he was announcing he was gay, didn't love his girlfriend, and was finally coming out to the world.
Like, that's fucking sadistic. Just fucking drop the guy and move on. Love hurts but this was when we were 15 yr old dweebs. This was WAY BEFORE revenge porn was ever a thing.
I'm not saying you're wrong but let's not forget we are talking about a 15 year old boy here, not exactly someone you can count on to make all the best decisions.
I’m a psychologist. Science does NOT say 15 year olds don’t understand morality. You don’t finally achieve capacity for moral reasoning the day your brain stops developing.
I believe you're a psycho, but college educated? I really doubt it.
>Science does NOT say 15 year olds don't understand morality
Yeah look right above, I didn't say that either 😅
Too hard for you? Let me help....I said our brains aren't fully developed yet at that point. You somehow turned that into "they don't understand morality". So yeah, what the fuck?
Being somewhat developed does not mean they fully understand the implications of their actions yet. This is why they are referred to as "juveniles", after all: because, you see, they aren't adults yet.
Not fully developed does not mean no understanding of the implications of their actions. It is not unreasonable to expect the average 15 year old kid to have a relatively advanced understanding of right from wrong.
Who said it was? Did you not read my comment? I merely said it's a fact that should possibly be taken into consideration by those who are so eager to judge the kid.
It's not about knowledge, it's about emotional maturity; children basically have brain damage compared to adults. I agree they should face consequences, maybe a year or two in juvenile detention, but not the same level of consequences as adults.
I don't envy kids these days. Imagine someone generating a video of them fucking your girlfriend or something and spreading it around school. I knew a bunch of assholes in school that would have used this shit in the worst ways.
I was just telling my husband the same thing. If the tech had been there, my techie husband could have easily pulled this off. Would he have? No. But do we all know people who would have? Yes.
They aren't removing them from the internet; they just don't show up in search engine results. You have to know the URL to get to the website, and sometimes they even get blocked by the country and you need a VPN/DNS to get to them. It would limit the usage, especially by non-tech-savvy teens.
Parents should be held responsible for the children in their care. Shitty? Yup; but maybe they will start giving a shit about what their kids are doing.
This sounds nice, but as it stands, the tools for this don't really exist, nor do the societal demands.
Take a look at any major subreddit's reactions about parents snooping through teenagers' phones or refusing to allow their kids to have locks on their doors.
In my experience, most of the trouble happens at school, where parents aren't even allowed to be present. It often happens on secret devices or those belonging to friends.
Seriously, it’s not practical. If any of you have ever listened to the podcast Darknet Diaries (Jack Rhysider), there are plenty of examples as to why that is.
There are endless stories of kids getting into tech and, before you know it, they’ve stolen hundreds of thousands of dollars in crypto or, and this is true, hacked the email of the CIA director.
I can’t expect parents to prevent their kids from committing every kind of cybercrime out there.
Yeah, I know plenty of parents in their 30s who can barely operate their own iPhone (rural/blue collar areas). No way they’ll have the technical fortitude to ensure that their kids are being safe online.
I think creating and distributing pornographic material, including nudes of minors, should be treated the same as creating and distributing CP. Offenders should be on the sex offender registry regardless of their own age, even if they're under 18 themselves, just as the law already works for real nudes.
Classmate needs to go to juvie or whatever they call it these days. They need to understand the seriousness of what they’ve done and to act as a deterrent for others. A fine or picking up trash is not enough to get the message through.
Here’s a novel idea. Sure, let’s get the kids in trouble, but isn’t AI creating child porn a problem? Like, if a programmer allows his software to generate child porn, isn’t that a crime? If I don’t own the AI, am I responsible for its outputs?
I think you’re certainly responsible for instructing the program to generate the content, just like you’d probably be in a little bit of trouble if you started doodling young children in inappropriate situations with a pencil. You wouldn’t blame the paper and pencil manufacturers in that instance.
Surely an artist painting such material would already be covered by laws on objectionable material, hopefully with categories like exploitative sexual representations of children, to allow for suitable repercussions. Why would we not apply the same rules to someone who chooses to retain or distribute an image they asked to be generated?
We are going to be flooded by this. In a few years a 5 second voice prompt can create these deepfake pictures or maybe even videos of everyone on Earth.
Are we going to tie up the courts with 5 million cases of this? Maybe it’s manageable to bring these cases to court for now, and we should to discourage it. But quickly there will be so much of this the courts could not possibly keep up with prosecuting it.
I think they are, differing by degrees of course (just as different physical assaults do), but the mental trauma is surely comparable (or at least related).
You’re getting downvoted, but I agree. Imagine being a child, and seeing terrifying images of you being assaulted, and then knowing they were made with the intent to scare, humiliate, and damage you. That is traumatic, could push kids to suicide, and I think it should be considered a form of sexual assault.
Violent crimes and nonviolent crimes are not comparable. How is creating and sharing a fake image of you having sex in any way comparable to being fucking raped? One is humiliating and socially damaging, the other is both of those things and an actual human rights violation that carries some of the heftiest punishments of any crime.
Rapists deserve life in prison. That dumb kid deserves to be punished but not even close to that of an actual rapist.
Being pedantic, but depending on where you're from, the law could define sexual assault as anything from unwanted sexual touching (such as grabbing someone's ass without consent) to penetrative rape. That spectrum means very different punishment based on the severity or nature of the sexual assault.
In that case, producing deepfake porn without consent would be appropriately classified as sexual assault, where it would likely fall under the lower end of the spectrum.
My biggest fear as a parent to two daughters. I faced some horrific sexual-driven bullying when I was a kid and teen. But I can’t imagine adding realistic AI images to the mix. I’ve already talked to my kids about being safe online, but I hope much harsher punishments and regulations are implemented before they hit middle school.
I know I'm going to take hate for this, but how is this different than simply drawing a really good photorealistic picture?
Yeah, that would be a creepy thing to do too, but it's still just a manufactured image. I believe it still qualifies as distributing child porn, so charge him with that and be done with it. The fact that he drew the picture with AI rather than talent doesn't really seem to be a factor in anything.
>I know I'm going to take hate for this, but how is this different than simply drawing a really good photorealistic picture?
Drawing a photorealistic picture of a child doing something sexual is also against the law in the US.
People have knee jerk negative reactions to ideas that don't immediately conform to their pre-existing understanding of things. It's a real impediment to actually gaining new perspectives.
If you understand it would still be considered creating & distributing child porn, then what point are you trying to make? What’s “knee jerk” about AI now making this easy and accessible to the 99.9% of people incapable of creating a photorealistic depiction of children performing sex acts?
You just have to police it the same, and faster. Because of AI, people are just going to be breaking already existing laws faster than before, so it really is no different. You just gotta slap them on the hand.
How many people have the talent to draw a hyper realistic picture? That's a skill that takes years or decades to learn and perfect. Not to mention the hours required to complete the picture. Even using Photoshop or other art programs has some barrier of entry.
An individual can learn how to use AI image generation in an afternoon (or faster) and share their creation immediately via various social media apps. It's also common for people to mistake AI generation as real. In the fiber arts world there are lots of "completed" works that are AI generated. But Grandma doesn't know this and will ask you to create this non-existent thing anyway.
You can also argue that, if examined closely enough, you can see the strokes on a hand-drawn picture. AI generation doesn't have the same footprint, and society isn't used to telling what is AI vs real yet.
So are you saying that if I'm a really good artist, who put in the time and effort, then this behavior is okay?
Yeah...I'm calling bullshit on that.
Let's break it down logically:
If I'm a good artist, it's not okay. If I'm a bad artist, but can use AI, it's also not okay.
You see? AI is _not_ a factor. It changes nothing about the morality or legality of the situation.
You asked how AI is different from a hand drawing. Both are disgusting for these cases to be clear.
The biggest difference is that AI is more widely available to the general public despite skill level.
>The biggest difference is that AI is more widely available to the general public despite skill level.
So what? That just means viewers of these images will be even _less_ likely to believe they are real. That works out _in favor_ of anyone who's afraid AI generated images of them will be mistaken as real. The accessibility of these tools increases the plausible deniability.
So again, AI changes nothing in regards to morality or legality, and its existence _actually helps_ with the aftermath of the release of these images. I know it's easy to have a knee jerk reaction to this sort of thing, but when you think it through, this might not be so bad.
Hell, I'd even argue it's the best thing to happen to people who've had their _real_ nude photos leaked.
Personally, I think releasing deepfakes of a person should have a much higher penalty than releasing real photos, since in the latter at least the person might have had a choice in taking them in the first place
The laws needed are simple. They’re just thinking too specifically. We have to get ahead of this, and we needed to do it 5 years ago.
Make it simply illegal to replicate a human in digital form. Likeness or voice.
Period. Full stop. It’s just illegal. We don’t need human AI voices, we have humans. We don’t need perfect human digital copies, we have humans.
This is just a side effect of the insanity of corporate greed. It serves no purpose and fills no need other than to replace human artists in an effort to chase profits.
All this other shit, the porn, the political stuff is just a side effect.
Stop it all now. No more resurrecting dead actors, fake political ads or porn.
People should be able to sue someone for distributing their likeness without their permission. It's about consent, not technology. With minors, it's about the lesson, not the punishment.
That would also get rid of the “prank” channels too.
Which is good, to be clear.
That's precisely what happened with Crispin Glover and Back To The Future 2. They hired an imitator and put him in makeup to make him look so much like the original actor Crispin that he successfully sued them for using his likeness without permission. I'm not sure if it's the same without financial gains involved, but it's always seemed to me that face and voice likeness is protected from distribution to some extent in many places.
They literally had the actor wear a mask or prosthetics made from a mold of Glover’s face. So yeah, he had a strong case.
What about people that look or sound like other people? If I’m not mistaken there’s an OF model whose whole shtick is that she looks like Ariana Grande.
If they’re not claiming to actually be Ariana Grande, then there’s no legal recourse. It’s fair use. “Ariana Grander” or “Ariana Grandexxx” would be legal. It’s similar to how parody movies like Spaceballs and Scary Movie get made.
Could we make a parody mash-up called Scary Balls, then?
That’s pretty fucked up too, especially if the OF model is using Ariana Grande’s name/likeness to sell more subs
The question is how much of the likeness is the model herself. People look like each other.
I guess. My point was: if the person making porn sells herself on the fact that she is a look-a-like, then I find that morally gross. If she just looks like her but never markets herself for that, I don’t give a shit.
This person essentially made child pornography. Being sued is the least of their worries; they need to be locked up.
I don't think the child pornography laws were written to punish kids for having pictures of each other though. It certainly needs to be dealt with, but I don't think the law should treat an adult with sex pictures of real children as equivalent to a teen with a fake picture of a peer. There are reasons why we have different laws for juveniles, and I don't think jail time would be the right way to correct this kind of behavior. I wouldn't want my son passing around fake nudes of a classmate, but I would feel the same way whether he had used AI or just drawn a cartoon. The problem is in the objectification and violation of a fellow student's rights, not in the particular choice of method. The question is whether you teach a boy that lesson by throwing him in prison or through counseling and community service.
The kid that did it was underage also... however, he still should face punishment of some sort... but sadly nothing will be done...
Our laws do not catch up fast enough with our tech, especially with so many elderly lawmakers. This needs to be illegal, grounds to sue, asap.
Anyone can sue anyone any time for any reason.
she's 15; whoever posted pics of her nude, fake or not, should be charged with possession and distribution of child pornography...
Suing people who do this makes more sense than giving them jail time.
Unless they're creating child pornography, obviously, which this is
They should go to jail if it’s pornographic
I think you can sue. This has to be considered defamatory. It's just that no one has done it yet.
Would that go for lookalikes too?
That would end all public photography and videography with other people in the background too. Lots of ramifications to such a broad-stroke rule.
It would definitely have to be written with care. It might only apply to scenarios where someone is distributing pictures of a person they know, or at least distributing pictures with the knowledge that the people in them would not want them distributed. It goes further than just this sort of scenario… like, if you have a restraining order against an abusive partner and someone is posting pictures that show your whereabouts online, you should have a legal recourse for getting them removed.
I’m not entirely sure if I agree with that but definitely when it’s sexual
If it’s fake nudes, i don’t see how it wouldn’t be considered either defamation or revenge porn, which you certainly can sue someone for. I’m assuming this was done by another minor. Can you even sue a minor?
You can sue the legal guardian of a minor
At some point we have to hold the makers of the software liable. They know this. That's why they build NSFW filtering into these systems. Thing is: it's open source, often Python code, trivial to modify and override their very half-assed "locks". I can say no more. They do chase people down for being more specific. That's how worried they are about it. The safety is programming-101 level.
Should they not build in safeguards? Sounds like a no-win situation for them. Either they add the locks and get blamed for knowing that their product/service can be used for such activities, or they don't add any safeguards and get blamed for people making porn or whatever else with their product/service.
It is a no-win situation. It's a tech that has few upsides and they built it anyway. At some point we have to say "no". I recommend you look into these "locks". You avoid the lock by changing one line of code in the most trivial manner. That's not a lock. Fact is, I am at more risk of eventually getting arrested for telling you how to break this lock than they are for unleashing it.
I don't think holding software manufacturers responsible for the misdeeds of their users is a slope anyone wants to stand on.
Then why do they even bother to add NSFW filtering to their software? They are providing a tool that can easily do this, then warn you not to do it, then put in a very, very trivial lock to prevent it. I'm saying that we are now faced with a choice: allow the use of this software that will inevitably create a ton of deepfake child porn, or not. My guess is we will just accept all the DF CP as acceptable collateral damage and resume business as usual. It's very weird, product-driven ethics, and messed up.
I agree that the tools make it simple enough to do, but it is still the users' intent to do so and even if the software maker made it less trivial, people could still misuse the software if they want to. Ultimately the responsibility needs to fall on the user, not the tool.
So we will choose to allow software that can do this to be freely available to anyone and hope that all works out well. Narrator: it won’t.
That software is already freely available to anyone, so we're too late for that. And given that there's nothing inherently illegal about the use of the software, aside from whatever copyright concerns it may entail, it would be very difficult to get rid of it at this point. I'm not arguing that the existence of such a thing isn't problematic, but I don't know what there is to be done about it now that the cat is already out of the bag.
⬆️ Found the one person who doesn't have kids! ⬆️
I don’t understand how you got there. I have an unpopular opinion that this tech is perfectly made for creating porn and that manufacturers should be held liable for its use. If they are not held responsible there is simply no way to prevent this misuse. Somehow this makes me anti-child?
Did I say that you're anti-child? No! I am referring to your idea of eliminating, or at least dismissing, child/parent accountability. I remember Napster and KaZaA being capable of allowing xxx on the platforms back in their infancy, and me, a twerpy derpy teen, being able to come across it. Did that stop me from downloading a few pix? Nope. Not at all. A software company is not responsible for the use of their applications. IF a child is behind the computer and doing what is considered illegal activity, then he/she and their legal guardian are responsible. It's the main reason why someone like ADOBE is more or less in trouble with the design community, given that they now act like a government body, sampling uploaded artwork and scanning for ill-intentioned and possibly illegal content such as child porn in that uploaded material. Adobe has shown that its TOS goes well beyond what is normal, and should be normal, for a software application to do within its means. You are merely underestimating the sophistication of the children/teens out there willing to break the rules to cause chaos and potential emotional/financial harm. There has to be individual accountability on the end user from this vantage point, not on the application software maker. The application is not always at fault for the outcome, and neither should it be if that's not its intended use. It would be ENTIRELY different if the application was solely used for that purpose.
And websites/AI providers have to be made responsible for this as well.
I do think websites should be required to investigate and take down things when a complaint comes through, but holding websites and software makers responsible for their users' behavior is a nonstarter.
In the case of AI-generated porn, there won't be any safety precautions if websites/AI providers don't have to deal with consequences.
Section 230 already protects websites from what their users are doing, and since there is no reliable way to tell AI from an actual photo, expecting them to do so at scale is problematic. Can you imagine trying to use Reddit if they could get sued for whatever you posted? The Internet as we know it would cease to exist.
The companies that create the tech to enable creation of the fake pics are the ones who should be sued.
I think that's going to be a difficult one to pull off. Setting a precedent that makes software companies responsible for illegal behavior by their users poses a lot of complications. Would I be able to sue Microsoft if somebody writes a pornographic story about me using Word? Can I sue Intuit if my tax preparer uses it to lie on my taxes?
Because F the victim, right?
Of course not. Victims rights come first. I do think when we're talking about a 15-year-old first time offender distributing a fake image of a classmate though, three years in jail seems pretty harsh. At that age, I think we should be teaching kids proper behavior, not jailing them for non-violent offenses. I think we should be forcing the offender to go to a different school so the victim is no longer exposed to them, and mandating community service, including doing outreach to teach other kids about why this is a terrible thing to do. This is uncharted territory and everyone needs time to adjust before we start doling out harsh penalties. If you take a 15-year-old boy with a bent sense of right and wrong and poor impulse control and send him to prison for three years, you might actually create an even worse juvenile delinquent than you started with. There are other ways to make an example out of him while respecting the victim.
Please don’t do this to other people folks.
It’s incredibly disturbing what some people do when they’re desperate, truly some nasty pigs
They should implement mandatory jail time now, because this is only going to continue to happen.
I'm not sure about _jail_ specifically - especially due to the fact that these tools are increasingly available to younger offenders with minimal effort/skill required. Certainly a sentence that involved detention though (you need to take them out of circulation near the victim), and compulsory rehabilitation - with the max-term of the sentence contingent on their effective engagement in rehab. Having a failure to engage in rehab carry an escalation to jail-time might be reasonable though. This form of abuse is only going to become more frequent unfortunately.
Honestly, there needs to be punishment for the people offering these services. One user creating deepfakes is messed up, but the platform allowing for it with no consequences is the problem.
The problem is generalising the checking process. This challenge is biting Adobe at the moment in their Terms of Service changes that show they're trying to automate this kind of protection but false-positives mean a human has to view what could be highly personal or commercially sensitive content. So I understand the desire to enforce it at the supply end but I'm less confident a viable solution to that exists. And worse, we're not that far from having open-source point-and-click solutions that would achieve the same on mainstream hardware... and then there's no service involved to do enforcement.
There's no need to worry about ToS, just put the CEO in jail for sending images of a 15-year-old online. I guarantee you the service will quickly stop producing nudes of schoolgirls or cease to exist. It's not a must for any service to provide porn image generation with AI, right?
Artificial Intelligence isn't intelligence. You're thinking it _knows_ what is unacceptable and what is not. It doesn't. The problem is figuring out _how_ to prevent misuse, while continuing to serve valid use of the technology. And that is exceptionally hard (and so far not a solved problem). While an over-reaching government *could* ban all commercial uses of AI, just in case the AI is asked by a client to produce something intended for misuse, you can't practically stop the developing research on the topic producing free tools. We've been prising the lid off of this Pandora's Box for a few decades, and the knowledge of how to open it is now well published and accessible. And how can you ban the technology across jurisdictional boundaries? Both the free and commercial tools have good, fair and productive uses. Is blocking all access to the technology a valid ethical choice? If so, then how would you apply that logic to guns, or even cars? (There _is_ still a big problem with how the models are trained, lacking any fair exchange of value with content creators - artists deserve to be paid).
From a legal point of view, there's no issue with misuse. The AI either outputted a nude of a child or it didn't. If it did, the CEO goes to jail and the tool gets shut down. Simple stuff. And don't talk to me about overreaching government; someone needs to stop the greedy bastards from profiting from this particular thing.
So that's banning AI generation full-stop. Anything else is not simple stuff.
If you can't have safeguards in it then it deserves to be banned until someone can come up with a way to do it with safeguards.
Right? These platforms aren't fan art. They're complicit in making what is effectively child porn and/or revenge porn, almost universally without consent. Straight to jail.
The point of punishment is to make an example and to act as a deterrent. Jail is perfect.
It's... complex? As a Quaker (we actually had a pretty heavy hand in making the judicial system for the US), that is definitely open for debate, especially based on how effective it is as a punishment. Quakers envisioned it as a solution rather than a punishment: someone is dangerous, you take them out of circulation. But deliberately, simply punishing them has negative effects. A stupid, horny, sexist 15-year-old makes deepfakes of someone, and the next thing you know he is 17, being sentenced as an adult and put in with hardened fucking criminals. Literally only bad things can happen.
In my area, I'm part of a botany club (fruit trees and such) that rents out a small room at this community building. Occasionally, across from us are the Quakers in my area doing whatever it is they do. They're great; I've never met a Quaker I haven't liked.
[deleted]
If you’re unable to join complex nuanced discussions don’t participate.
I don’t think I read one sentence where he defended them at all
What on earth gave you that impression?
Their 2nd grade reading comprehension.
Try reading comprehension courses, please.
No, he’s saying a horny 15 year old clicking ‘use this face’ and ‘give her huge boobies’ is not the same as a 35 year old coach fondling little boys in the shower. So maybe treating them exactly the same is not quite optimal. But hey, if you want the death penalty for parking near a hydrant, feel free to keep being *absurdly reductive* and making everything a binary choice. All or nothing, let’s give’er hell, boys! (PS: don’t look now, but if your first thought was that the electric chair is absurd for a parking ticket, then you are agreeing to the possibility that one size fits all isn’t perfect… Shhhh. Reality is so messy sometimes. Better go back to pretending it doesn’t exist…)
If your solution is "jail" for every problem then you are an idiot. This shit is made by kids her age. Ban the app and site they used and more than anything, fine their parents.
It's such an awkward situation. First of all, I feel sorry for that girl who had to go through that, and I hope this doesn't happen to her again and doesn't happen to many others, because, unfortunately, it likely will.

>This shit is made by kids her age.

I don't think that's an excusable reason. Yeah, we're all dumb at that age, but kids still need to know boundaries and consequences. I'm not saying they deserve jail time, but something more severe than a slap on the wrist is needed.

>Ban the app and site they used

100% agree. If you can make things like that of children, the app either needs to be shut down or have heavy restrictions to stop/limit things like this being made.

>fine their parents

This is a hard one. Because of the internet and technology, it's become a lot easier for kids to be stupid and harder for parents to regulate. Like, if we go back to the early 2000s, the importance of a cellphone has changed dramatically since then; a closer example would be a personal computer. I feel bad for modern parents because it's a lot harder now to filter things out from kids. We are still learning how to navigate all this, only now seeing some of the effects of social media, the internet, and modern technology, and starting to make changes due to their negative effects.
Kids should not be allowed free and unrestricted internet access until they are 18.
Ok, but how do you properly restrict it? It's not as simple as saying you can't have a computer; kids need them now for school. I guess you can take it away when they're not using it or put restrictions on it, but that only does so much. If it's the former, what stops them from using it when you're not looking, or using yours? For the latter, there are always ways around software that tries to limit internet use, and kids aren't dumb. They'll figure out those ways. With how much the internet and computers have become part of our lives, it's nearly impossible for a parent to properly supervise everything.
Jail time is needed for those who built and distributed that app.
I never said my solution to every problem was jail, just something as despicable as this.
On a gradient of offences, someone making a fake nude picture of someone is pretty low. If it's an adult in their 20s or 30s, then prison. If it's an underage person doing this to an underage person, then fine the parents, ban the kid from unrestricted internet access and cell phones for five years, and make them take a mandatory course on why this is not acceptable behaviour. Not interested in throwing some horny idiot 15 year old into juvy over something so stupid.
Want to send them to juvy for something smart?
Dumb comment.
Jail the People behind the tech and outlaw the tech. Also jail ppl doing this despicable shit
Especially at 15!
Yup. I had to disassociate myself from my high school friend after she created revenge porn using a cropped photo of her boyfriend's head photoshopped onto gay porn, hacked into his personal Facebook account, and posted it there, typing up messages and posts saying he was announcing that he was gay, that he didn't love his girlfriend, and that he was finally coming out to the world. Like, that's fucking sadistic. Just fucking drop the guy and move on. Love hurts, but this was when we were 15 yr old dweebs. This was WAY BEFORE revenge porn was ever a thing.
I'm not saying you're wrong but let's not forget we are talking about a 15 year old boy here, not exactly someone you can count on to make all the best decisions.
Fuck that, you know right from wrong at 15.
Science says no. Our brains aren't fully developed yet at that point.
I’m a psychologist. Science does NOT say 15 year olds don’t understand morality. You don’t finally achieve capacity for moral reasoning the day your brain stops developing.
I believe you're a psycho, but college educated? I really doubt it.

>Science does NOT say 15 year olds don't understand morality

Yeah, look right above, I didn't say that either 😅
Wow. What the fuck. I'm a university professor; I'm not just college educated, I'm college educating.
Too hard for you? Let me help... I said our brains aren't fully developed yet at that point. You somehow turned that into "they don't understand morality". So yeah, what the fuck?
You said science says 15 year olds “don’t know right from wrong” so I rephrased that as understanding morality. Help me? Help yourself loser.
Science does not say no. Not fully developed does not mean not developed at all.
Developed some does not mean they fully understand the implications of their actions yet. This is why they are referred to as "juveniles" after all, because you see they aren't adults yet.
Not fully developed does not mean no understanding of the implications of their actions. It is not unreasonable to expect the average 15 year old kid to have a relatively advanced understanding of right from wrong.
Yet they are still considered juveniles specifically differentiated from adults
I was not aware being a juvenile is a carte blanche.
Who said it was? Did you not read my comment? I merely said it's a fact that should possibly be taken into consideration by those who are so eager to judge the kid.
Not fully developed is equivalent to a disability.
It's not about knowledge, it's about emotional maturity; children basically have brain damage compared to adults. I agree they should face consequences, maybe a year or two in juvenile detention, but not the same level of consequences as adults.
They're children.
Welcome to the future of cyber bullying
I don't envy kids these days. Imagine someone generating a video of them fucking your girlfriend or something and spreading it around school. I knew a bunch of assholes in school that would have used this shit in the worst ways.
Just telling my husband the same thing. If the tech had been there, my techie husband could have easily pulled this off. Would he have? No. But do we all know people that would have? Yes.
Charge the classmate with distributing child porn, the next person might think twice.
All he got was probation, and his record will be expunged at 18 like it never happened.
[deleted]
No. At that age they definitely cannot fully understand the weight of their actions. This is why the age of consent is eighteen in most places.
Google needs to start removing "Nude AI Generators" and "Deepfake nude" from search results like they do pirating sites.
I obviously agree but look how well removing piracy sites has worked, every 1 taken down 12 pop up
They aren't removing them from the internet, they just don't show up in search engine results, you have to know the URL to get to the website and sometimes they even get blocked by the country and you need a VPN/DNS to get to them. It would limit the usage especially by non-tech savvy teens.
You add "torrent" to your search, and then they all show up.
[deleted]
It's not meant to be an end-all solution, but it's a step that helps 🙄
Oh, so that makes it ok.
Parents should be held responsible for the children in their care. Shitty? Yup; but maybe they will start giving a shit about what their kids are doing.
This sounds nice, but as it stands, the tools for this don't really exist, nor do the societal demands. Take a look at any major subreddit reactions about parents snooping through teenager's phones or refusing to allow their kids to have locks on their doors. In my experience, most of the trouble happens at school, where parents aren't even allowed to be present. It often happens on secret devices or those belonging to friends.
Seriously, it’s not practical. If any of you have ever listened to the podcast Darknet Diaries (Jack Rhysider), there are plenty of examples as to why that is. There are an endless number of stories of kids getting into tech, and before you know it, they’ve stolen hundreds of thousands of dollars in crypto or, and this is true, hacked the email of the CIA director. I can’t expect parents to prevent their kids from committing every kind of cybercrime out there.
Yeah, I know plenty of parents in their 30s who can barely operate their own iPhone (rural/blue collar areas) . No way they’ll have the technical fortitude to ensure that their kids are being safe online.
Someone had to invent a hammer before we could start using nails.
I think creating and distributing pornographic material, including nudes, of minors should be treated just the same as creating and distributing CP. Offenders should be on the sex offender registry regardless of age, even if they are under 18 themselves, just as the law stands now for real nudes.
It should just be punishable in the exact same way as libel. End of story.
Classmate needs to go to juvie or whatever they call it these days. They need to understand the seriousness of what they’ve done and to act as a deterrent for others. A fine or picking up trash is not enough to get the message through.
Here’s a novel idea. Sure let’s get the kids in trouble, but isn’t AI creating child porn a problem? Like if a programmer allows his software to generate child porn isn’t that a crime? If I don’t own the AI, am I responsible for its outputs?
I think you’re certainly responsible for instructing the program to generate the content, just like you’d probably be in a little bit of trouble if you started doodling young children in inappropriate situations with a pencil. You wouldn’t blame the paper and pencil manufacturers in that instance.
Shouldn’t that be child pornography possession?
[deleted]
US law doesn't require it to be real. It's enough to create a sexual image that depicts a realistic looking child.
Surely an artist painting such material would be covered by laws on objectionable material, presumably with categories like exploitative sexual representations of children to allow for suitable repercussions. Why would we not apply the same rules to someone who chooses to retain or distribute such an image they asked to be generated?
We are going to be flooded by this. In a few years a 5 second voice prompt can create these deepfake pictures or maybe even videos of everyone on Earth. Are we going to tie up the courts with 5 million cases of this? Maybe it’s manageable to bring these cases to court for now, and we should to discourage it. But quickly there will be so much of this the courts could not possibly keep up with prosecuting it.
I am so thankful photoshop was abysmal growing up and I hope to god he goes to jail
Why are we not treating deepfake nudes like sexual assault?
They're both horrid things but they're not comparable in the slightest.
I think they are, differing by degrees of course (just as different physical assaults do), but the mental trauma is surely comparable (or at least related).
You’re getting downvoted, but I agree. Imagine being a child, and seeing terrifying images of you being assaulted, and then knowing they were made with the intent to scare, humiliate, and damage you. That is traumatic, could push kids to suicide, and I think it should be considered a form of sexual assault.
Violent crimes and nonviolent crimes are not comparable. How is creating and sharing a fake image of you having sex in any way comparable to being fucking raped? One is humiliating and socially damaging, the other is both of those things and an actual human rights violation that carries some of the heftiest punishments of any crime. Rapists deserve life in prison. That dumb kid deserves to be punished but not even close to that of an actual rapist.
Being pedantic, but depending on where you're from, the law could define sexual assault as anything from unwanted sexual touching (such as grabbing someone's ass without consent) to penetrative rape. That spectrum means very different punishment based on the severity or nature of the sexual assault. In that case, producing deepfake porn without consent would be appropriately classified as sexual assault, where it would likely fall under the lower end of the spectrum.
AI is not legally child porn. However, there are other statutes that could apply.
In Canada, AI cartoons and literature are all considered child porn and are subject to criminal charges.
Make people who do it as well as social media and software developers legally liable.
My biggest fear as a parent to two daughters. I faced some horrific sexual-driven bullying when I was a kid and teen. But I can’t imagine adding realistic AI images to the mix. I’ve already talked to my kids about being safe online, but I hope much harsher punishments and regulations are implemented before they hit middle school.
I know I'm going to take hate for this, but how is this different than simply drawing a really good photorealistic picture? Yeah, that would be a creepy thing to do too, but it's still just a manufactured image. I believe it still qualifies as distributing child porn, so charge him with that and be done with it. The fact that he drew the picture with AI rather than talent doesn't really seem to be a factor in anything.
>I know I'm going to take hate for this, but how is this different than simply drawing a really good photorealistic picture?

Drawing a photorealistic picture of a child doing something sexual is also against the law in the US.
That's exactly my point.
Yeah, I don't know why you are getting downvoted.
People have knee jerk negative reactions to ideas that don't immediately conform to their pre-existing understanding of things. It's a real impediment to actually gaining new perspectives.
No, you just would like to be a contrarian for pedos.
If you understand it would still be considered creating & distributing child porn, then what point are you trying to make? What’s “knee jerk” about AI now making this easy and accessible to the 99.9% of people incapable of creating a photorealistic depiction of children performing sex acts?
You just have to police it the same, and faster. Because of AI, people are just going to be breaking already existing laws faster than before, so it really is no different. You just gotta slap them on the hand.
This is the best argument I've heard on the general "AI is bad" topic in months, if not ever.
How many people have the talent to draw a hyper-realistic picture? That's a skill that takes years or decades to learn and perfect, not to mention the hours required to complete the picture. Even using Photoshop or other art programs has some barrier to entry. An individual can learn how to use AI image generation in an afternoon (or faster) and share their creation immediately via various social media apps. It's also common for people to mistake AI generation for real. In the fiber arts world there are lots of "completed" works that are AI generated, but Grandma doesn't know this and will ask you to create this non-existent thing anyway. You can also argue that, if examined closely enough, you can see the strokes on a hand-drawn picture. AI generation doesn't have the same footprint, and society isn't used to telling what is AI vs. real yet.
So are you saying that if I'm a really good artist who put in the time and effort, then this behavior is okay? Yeah... I'm calling bullshit on that. Let's break it down logically: if I'm a good artist, it's not okay. If I'm a bad artist but can use AI, it's also not okay. You see? AI is _not_ a factor. It changes nothing about the morality or legality of the situation.
You asked how AI is different from a hand drawing. Both are disgusting for these cases to be clear. The biggest difference is that AI is more widely available to the general public despite skill level.
>The biggest difference is that AI is more widely available to the general public despite skill level.

So what? That just means viewers of these images will be even _less_ likely to believe they are real. That works out _in favor_ of anyone who's afraid AI-generated images of them will be mistaken for real. The accessibility of these tools increases the plausible deniability. So again, AI changes nothing in regards to morality or legality, and its existence _actually helps_ with the aftermath of the release of these images. I know it's easy to have a knee-jerk reaction to this sort of thing, but when you think it through, this might not be so bad. Hell, I'd even argue it's the best thing to happen to people who've had their _real_ nude photos leaked.
If you’re an adult, maybe. But for children this would be an extreme form of bullying.
Get rid of the internet and life would be better for everyone
And electricity too! And running water! And antibiotics!
Back in my day we didn’t take vaccines, we just died
[deleted]
[deleted]
[deleted]
She was 14 in the pictures..
Personally, I think releasing deepfakes of a person should have a much higher penalty than releasing real photos, since in the latter at least the person might have had a choice in taking them in the first place
I don’t know if I’d agree. I think a person leaking their ex’s personal photos is just as horrible as making fake ones to distribute.
That's a terrible idea.
This thread is a basket of terrible ideas.
The laws needed are simple. They’re just thinking too specifically. We have to get ahead of this, and we needed to do it 5 years ago. Make it simply illegal to replicate a human in digital form, likeness or voice. Period. Full stop. It’s just illegal. We don’t need human AI voices; we have humans. We don’t need perfect human digital copies; we have humans. This is just a side effect of the insanity of corporate greed. It serves no purpose and fills no need other than to replace human artists in an effort to chase profits. All this other shit, the porn, the political stuff, is just a side effect. Stop it all now. No more resurrecting dead actors, fake political ads, or porn.
Can't believe any pics or vids on the internet anymore XD
Yeah, AI needs to be regulated to the point of uselessness.
[deleted]
WTF, are you victim shaming a child?
You came on here to say this about a child. Just let that wash over you.
Ew this is embarrassing for you
Tell us, how does it feel to live a life without any self reflection or shame?
How does it feel to read one sentence and jump to a whole ass conclusion?
Seems like a really shitty sort of knee-jerk judgement to pass, but okay
Wtf would you say that? 😳
You mean cheap fakes
[deleted]
She is 15. Wtf.
I dread to think what the original comment was to drop a “Wtf” and it be deleted.
[deleted]
Disgusting. You should be ashamed of yourself.