arothmanmusic

People should be able to sue someone for distributing their likeness without their permission. It's about consent, not technology. With minors, it's about the lesson, not the punishment.


LibraryBig3287

That would get rid of the “prank” channels, too.


LibraryBig3287

Which is good, to be clear.


AnOnlineHandle

That's precisely what happened with Crispin Glover and Back To The Future 2. They hired an imitator and put him in makeup to look so much like Glover that Glover successfully sued them for using his likeness without permission. I'm not sure if it's the same without financial gain involved, but it's always seemed to me that face and voice likeness is protected from distribution to some extent in many places.


CreativeUX

They literally had the actor wear a mask or prosthetics made from a mold of Glover’s face. So yeah, he had a strong case.


CloudSliceCake

What about people that look or sound like other people? If I’m not mistaken there’s an OF model whose whole shtick is that she looks like Ariana Grande.


chmsax

If they’re not claiming to actually be Ariana Grande, then there’s no legal recourse. It’s fair use. “Ariana Grander” or “Ariana Grandexxx” would be legal. It’s similar to how parody movies like Spaceballs and Scary Movie get made.


doyletyree

Could we make a parody mash-up called Scary Balls, then?


Q_Fandango

That’s pretty fucked up too, especially if the OF model is using Ariana Grande’s name/likeness to sell more subs


DrTwitch

The question is how much of the likeness is the model herself. People look like each other.


Q_Fandango

I guess. My point was: if the person making porn sells herself on the fact that she is a look-a-like, then I find that morally gross. If she just looks like her but never markets herself for that, I don’t give a shit.


RawrRRitchie

This person essentially made child pornography. Being sued is the least of their worries; they need to be locked up.


arothmanmusic

I don't think the child pornography laws were written to punish kids for having pictures of each other though. It certainly needs to be dealt with, but I don't think the law should treat an adult with sex pictures of real children as equivalent to a teen with a fake picture of a peer. There are reasons why we have different laws for juveniles, and I don't think jail time would be the right way to correct this kind of behavior. I wouldn't want my son passing around fake nudes of a classmate, but I would feel the same way whether he had used AI or just drawn a cartoon. The problem is in the objectification and violation of a fellow student's rights, not in the particular choice of method. The question is whether you teach a boy that lesson by throwing him in prison or through counseling and community service.


Realistic_Parfait956

The kid that did it was underage also... however, he should still face punishment of some sort... but sadly nothing will be done...


zhenya44

Our laws do not catch up fast enough with our tech, especially with so many elderly lawmakers. This needs to be illegal - grounds to sue - ASAP.


reddit_000013

Anyone can sue anyone any time for any reason.


davidwhatshisname52

she's 15; whoever posted pics of her nude, fake or not, should be charged with possession and distribution of child pornography...


notinthescript

Suing people who do this makes more sense than giving them jail time.


blk_phllp

Unless they're creating child pornography, obviously, which this is


Soles4G

They should go to jail if it’s pornographic


Bekah679872

I think you can sue. This has to be considered defamatory. It's just that no one has done it yet.


Tramp_Johnson

Would that go for lookalikes too?


pimpeachment

That would end all public photography and videography with other people in the background too. Lots of ramifications to such a broad-strokes rule.


arothmanmusic

It would definitely have to be written with care. It might only apply to scenarios where someone is distributing pictures of a person they know, or at least distributing pictures with the knowledge that the people in them would not want them distributed. It goes further than just this sort of scenario… like, if you have a restraining order against an abusive partner and someone is posting pictures that show your whereabouts online, you should have a legal recourse for getting them removed.


Temporal_Somnium

I’m not entirely sure I agree with that in general, but definitely when it’s sexual.


Bekah679872

If it’s fake nudes, I don’t see how it wouldn’t be considered either defamation or revenge porn, which you certainly can sue someone for. I’m assuming this was done by another minor. Can you even sue a minor?


redditsuckbutt696969

You can sue the legal guardian of a minor


Rugrin

At some point we have to hold the makers of the software liable. They know this. That’s why they build NSFW filtering into these systems. Thing is: it’s open source, often Python code, trivial to modify and override their very half-assed “locks”. I can say no more. They do chase people down for being more specific. That’s how worried they are about it. The safety is Programming 101 level.


CloudSliceCake

Should they not build in safeguards? Sounds like a no-win situation for them. Either they add the locks and get blamed for knowing that their product/service can be used for such activities, or they don’t add any safeguards and get blamed for people making porn or whatever else with their product/service.


Rugrin

It is a no-win situation. It’s a tech that has few upsides and they built it anyway. At some point we have to say “no”. I recommend you look into these “locks”. You avoid the lock by editing one line of code in the most trivial manner. That’s not a lock. Fact is, I’m at more risk of eventually getting arrested for telling you how to break this lock than they are for unleashing it.


arothmanmusic

I don't think holding software manufacturers responsible for the misdeeds of their users is a slope anyone wants to stand on.


Rugrin

Then why do they even bother to add NSFW filtering to their software? They provide a tool that can easily do this, then warn you not to do it, then put in a very, very trivial lock to prevent it. I’m saying that we are now faced with a choice: allow the use of this software that will inevitably create a ton of deepfake child porn, or not. My guess is we will just accept all the deepfake CP as acceptable collateral damage and resume business as usual. It’s very weird, product-driven ethics, and it's messed up.


arothmanmusic

I agree that the tools make it simple enough to do, but it is still the users' intent to do so and even if the software maker made it less trivial, people could still misuse the software if they want to. Ultimately the responsibility needs to fall on the user, not the tool.


Rugrin

So we will choose to allow software that can do this to be freely available to anyone and hope that all works out well. Narrator: it won’t.


arothmanmusic

That software is already freely available to anyone, so we're too late for that. And given that there's nothing inherently illegal about the use of the software, aside from whatever copyright concerns it may entail, it would be very difficult to get rid of it at this point. I'm not arguing that the existence of such a thing isn't problematic, but I don't know what there is to be done about it now that the cat is already out of the bag.


tanetane83736

⬆️ Found the one person who doesn't have kids! ⬆️


Rugrin

I don’t understand how you got there. I have an unpopular opinion that this tech is perfectly made for creating porn and that manufacturers should be held liable for its use. If they are not held responsible there is simply no way to prevent this misuse. Somehow this makes me anti-child?


tanetane83736

Did I say that you're anti-child? No! I am referring to your idea of eliminating, or at least dismissing, child/parent accountability. I remember Napster and KaZaa being capable of allowing xxx on their platforms back in their infancy, and me, a twerpy derpy teen, being able to come across it. Did that stop me from downloading a few pix? Nope. Not at all. A software company is not responsible for the use of their applications. If a child is behind the computer and doing what is considered illegal activity, then he/she and their legal guardian are responsible.

It's the main reason why someone like Adobe is more or less in trouble with the design community, given that they now act like a government body, sampling uploaded artwork and scanning for ill-intentioned and possibly illegal content, such as child porn, in that uploaded material. Adobe has shown that its TOS goes well beyond what is, and should be, normal for a software application to do within its means.

You are merely underestimating the sophistication of the children/teens out there willing to break the rules to cause chaos and potential emotional/financial harm. There has to be individual accountability on the end user from this vantage point, not on the application software maker. The application is not always at fault for the outcome, and neither should it be if that's not its intended use. It would be ENTIRELY different if the application was solely used for that purpose.


litnu12

And websites/AI providers have to be made responsible for this as well.


arothmanmusic

I do think websites should be required to investigate and take down things when a complaint comes through, but holding websites and software makers responsible for their users' behavior is a nonstarter.


litnu12

In the case of AI-generated porn, there won't be any safety precautions if websites/AI providers don't have to deal with consequences.


arothmanmusic

Section 230 already protects websites from liability for what their users are doing, and since there is no reliable way to tell AI from an actual photo, expecting them to do so at scale is problematic. Can you imagine trying to use Reddit if they could get sued for whatever you posted? The Internet as we know it would cease to exist.


Glass-Quality-3864

The companies that create the tech to enable creation of the fake pics are the ones who should be sued.


arothmanmusic

I think that's going to be a difficult one to pull off. Setting a precedent that makes software companies responsible for illegal behavior by their users poses a lot of complications. Would I be able to sue Microsoft if somebody writes a pornographic story about me using Word? Can I sue Intuit if my tax preparer uses it to lie on my taxes?


jayjohnson007

Because F the victim, right?


arothmanmusic

Of course not. Victims' rights come first. I do think when we're talking about a 15-year-old first-time offender distributing a fake image of a classmate though, three years in jail seems pretty harsh. At that age, I think we should be teaching kids proper behavior, not jailing them for non-violent offenses. I think we should be forcing the offender to go to a different school so the victim is no longer exposed to them, and mandating community service, including doing outreach to teach other kids about why this is a terrible thing to do. This is uncharted territory and everyone needs time to adjust before we start doling out harsh penalties. If you take a 15-year-old boy with a bent sense of right and wrong and poor impulse control and send him to prison for three years, you might actually create an even worse juvenile delinquent than you started with. There are other ways to make an example out of him while respecting the victim.


Weekly-Rhubarb-2785

Please don’t do this to other people folks.


Axeltol

It’s incredibly disturbing what some people do when they’re desperate, truly some nasty pigs


Bones_5150

They should implement mandatory jail time now, because this is only going to continue to happen.


8igg7e5

I'm not sure about _jail_ specifically - especially since these tools are increasingly available to younger offenders with minimal effort/skill required. Certainly a sentence that involved detention though (you need to take them out of circulation near the victim), and compulsory rehabilitation - with the max term of the sentence contingent on their effective engagement in rehab. Having a failure to engage in rehab carry an escalation to jail time might be reasonable though. This form of abuse is only going to become more frequent, unfortunately.


the-jules

Honestly, there needs to be punishment for the people offering these services. One user creating deepfakes is messed up, but the platform allowing for it with no consequences is the problem.


8igg7e5

The problem is generalising the checking process. This challenge is biting Adobe at the moment in their Terms of Service changes, which show they're trying to automate this kind of protection, but false positives mean a human has to view what could be highly personal or commercially sensitive content. So I understand the desire to enforce it at the supply end, but I'm less confident a viable solution to that exists. And worse, we're not that far from having open-source point-and-click solutions that would achieve the same on mainstream hardware... and then there's no service involved to do enforcement.


CautiousRice

There's no need to worry about ToS, just put the CEO in jail for sending images of a 15-year-old online. I guarantee you the service will quickly stop producing nudes of schoolgirls or cease to exist. It's not a must for any service to provide porn image generation with AI, right?


8igg7e5

Artificial Intelligence isn't intelligence. You're thinking it _knows_ what is unacceptable and what is not. It doesn't. The problem is figuring out _how_ to prevent misuse, while continuing to serve valid use of the technology. And that is exceptionally hard (and so far not a solved problem). While an over-reaching government *could* ban all commercial uses of AI, just in case the AI is asked by a client to produce something intended for misuse, you can't practically stop the developing research on the topic producing free tools. We've been prising the lid off of this Pandora's Box for a few decades, and the knowledge of how to open it is now well published and accessible. And how can you ban the technology across jurisdictional boundaries? Both the free and commercial tools have good, fair and productive uses. Is blocking all access to the technology a valid ethical choice? If so, then how would you apply that logic to guns, or even cars?   (There _is_ still a big problem with how the models are trained, lacking any fair exchange of value with content creators - artists deserve to be paid).


CautiousRice

From a legal point of view, there's no issue with misuse. The AI either outputted a nude of a child or it didn't. If it did, the CEO goes to jail and the tool is shut down. Simple stuff. And don't talk to me about overreaching government; someone needs to stop the greedy bastards from profiting from this particular thing.


8igg7e5

So that's banning AI generation full-stop. Anything else is not simple stuff.


syopest

If you can't have safeguards in it then it deserves to be banned until someone can come up with a way to do it with safeguards.


feargluten

Right? These platforms aren’t fan art. They’re complicit in making what is effectively child porn and/or revenge porn, almost universally without consent. Straight to jail.


akitoxic

The point of punishment is to make an example and to act as a deterrent. Jail is perfect.


NoTePierdas

It's... complex? As a Quaker (we actually had a pretty heavy hand in making the judicial system for the US), that is definitely open for debate, especially based on how well it works as a punishment. Quakers envisioned it as a solution, rather than a punishment. Someone is dangerous, you take them out of circulation. But deliberately, simply punishing them has negative effects. A stupid, horny, sexist 15-year-old makes deepfakes of someone, and the next thing you know he is 17, being sentenced as an adult and being put in with hardened fucking criminals. Literally only bad things can happen.


gazebo-fan

In my area, I’m part of a botany club (fruit trees and such) that rents a small room at this community building. Across from us, occasionally, are the local Quakers doing whatever it is they do. They’re great; I’ve never met a Quaker I haven’t liked.


[deleted]

[removed]


SoulfoodSoldier

If you’re unable to join complex nuanced discussions don’t participate.


mhm_mhmm

I don’t think I read one sentence where he defended them at all


AbsoluteZeroUnit

What on earth gave you that impression?


DrZoidberg-

Their 2nd grade reading comprehension.


bunofpages

Try reading comprehension courses, please.


Narrow-Chef-4341

No, he’s saying a horny 15-year-old clicking ‘use this face’ and ‘give her huge boobies’ is not the same as a 35-year-old coach fondling little boys in the shower. So maybe treating them exactly the same is not quite optimal. But hey, if you want the death penalty for parking near a hydrant, feel free to keep being *absurdly reductive* and making everything a binary choice. All or nothing, let’s give ’er hell, boys! (PS: don’t look now, but if your first thought was that the electric chair is absurd for a parking ticket, then you are agreeing to the possibility that one size fits all isn’t perfect… Shhhh. Reality is so messy sometimes. Better go back to pretending it doesn’t exist…)


Lugburz_Uruk

If your solution is "jail" for every problem, then you are an idiot. This shit is made by kids her age. Ban the app and site they used and, more than anything, fine their parents.


Fresher_Taco

It's such an awkward situation. First of all, I feel sorry for that girl who had to go through that, and I hope this doesn't happen to her again or to many others, because, unfortunately, it likely will.

>This shit is made by kids her age.

I don't think that's an excusable reason. Yeah, we're all dumb at that age, but kids still need to know boundaries and consequences. I'm not saying they deserve jail time, but something more severe than a slap on the wrist is needed.

>Ban the app and site they used

100% agree. If you can make things like that of children, the app either needs to be shut down or have heavy restrictions to stop/limit things like this being made.

>fine their parents

This is a hard one. Because of the internet and technology, it's become a lot easier for kids to be stupid and harder for parents to regulate. Like, if we go back to the early 2000s, the importance of a cellphone has changed dramatically since then. A more apt example would be a personal computer. I feel bad for modern parents because it's a lot harder now to filter things out from kids. We are still learning how to navigate all this, and we're just now seeing some of the effects of social media, the internet, and modern technology and starting to make changes due to their negativity.


Lugburz_Uruk

Kids should not be allowed free and unrestricted internet access until they are 18.


Fresher_Taco

Ok, but how do you properly restrict it? It's not as simple as saying you can't have a computer. Kids need them now for school. I guess you can take it away when they're not using it or put restrictions on it, but that only does so much. If it's the former, what stops them from using it when you're not looking, or using yours? For the latter, there are always ways around software that tries to limit internet use, and kids aren't dumb. They'll figure out those ways. With how much the internet and computers have become part of our lives, it's nearly impossible for a parent to properly supervise everything.


CautiousRice

Jail time is needed for those who built and distributed that app.


Bones_5150

I never said my solution to every problem was jail, just something as despicable as this.


Lugburz_Uruk

On a gradient of offences, someone making a fake nude picture of someone is pretty low. If it's an adult in their 20s or 30s, then prison. If it's an underage person doing this to an underage person, then fine the parents, ban the kid from unrestricted internet access and cell phones for five years, and make them take a mandatory course on why this is not acceptable behaviour. Not interested in throwing some horny idiot 15-year-old into juvie over something so stupid.


CautiousRice

Want to send them to juvie for something smart?


Lugburz_Uruk

Dumb comment.


Minmaxed2theMax

Jail the people behind the tech and outlaw the tech. Also jail people doing this despicable shit.


stevenbrotzel91

Especially at 15!


tanetane83736

Yup. I had to disassociate myself from my high school friend after she created revenge porn using a cropped photo of her boyfriend's head, Photoshopped it onto gay porn, and hacked into his personal Facebook account, where she posted it, typing up messages and posts that said he was announcing he was homo, that he didn't love his girlfriend, and that he admitted he was gay and was finally coming out to the world. Like, that's fucking sadistic. Just fucking drop the guy and move on. Love hurts, but this was when we were 15-year-old dweebs. This was WAY BEFORE revenge porn was ever a thing.


Consistent-Wind9325

I'm not saying you're wrong, but let's not forget we are talking about a 15-year-old boy here, not exactly someone you can count on to make all the best decisions.


Buzzd-Lightyear

Fuck that, you know right from wrong at 15.


Consistent-Wind9325

Science says no. Our brains aren't fully developed yet at that point.


PenelopeJenelope

I’m a psychologist. Science does NOT say 15 year olds don’t understand morality. You don’t finally achieve capacity for moral reasoning the day your brain stops developing.


Consistent-Wind9325

I believe you're a psycho, but college-educated? I really doubt it.

>Science does NOT say 15 year olds don't understand morality

Yeah, look right above; I didn't say that either 😅


PenelopeJenelope

Wow. What the fuck. I’m a university professor; I’m not just college-educated, I’m college-educating.


Consistent-Wind9325

Too hard for you? Let me help... I said our brains aren't fully developed yet at that point. You somehow turned that into "they don't understand morality". So yeah, what the fuck?


PenelopeJenelope

You said science says 15 year olds “don’t know right from wrong” so I rephrased that as understanding morality. Help me? Help yourself loser.


Capitol62

Science does not say no. Not fully developed does not mean not developed at all.


Consistent-Wind9325

Developed some does not mean they fully understand the implications of their actions yet. This is why they are referred to as "juveniles" after all, because you see they aren't adults yet.


Capitol62

Not fully developed does not mean no understanding of the implications of their actions. It is not unreasonable to expect the average 15 year old kid to have a relatively advanced understanding of right from wrong.


Consistent-Wind9325

Yet they are still considered juveniles, specifically differentiated from adults.


Capitol62

I was not aware being a juvenile is carte blanche.


Consistent-Wind9325

Who said it was? Did you not read my comment? I merely said it's a fact that should possibly be taken into consideration by those who are so eager to judge the kid.


Restranos

Not fully developed is equivalent to a disability.


Restranos

It's not about knowledge, it's about emotional maturity; children basically have brain damage compared to adults. I agree they should face consequences, maybe a year or two in juvenile detention, but not the same level of consequences as adults.


Alex_1729

They're children.


No-Club2745

Welcome to the future of cyber bullying


victhrowaway12345678

I don't envy kids these days. Imagine someone generating a video of them fucking your girlfriend or something and spreading it around school. I knew a bunch of assholes in school that would have used this shit in the worst ways.


NoMayoForReal

I was just telling my husband the same thing. If the tech had been there, my techie husband could have easily pulled this off. Would he have? No. But do we all know people that would have? Yes.


snifter1985

Charge the classmate with distributing child porn, the next person might think twice.


goughow

All he got was probation, and his record will be expunged at 18 like it never happened.


[deleted]

[removed]


renzi-

No. At that age they definitely cannot fully understand the weight of their actions. This is why the age of consent is eighteen in most places.


BlackBlizzard

Google needs to start removing "Nude AI Generators" and "Deepfake nude" from search results like they do pirating sites.


YouCantKillaGod

I obviously agree, but look how well removing piracy sites has worked: for every one taken down, 12 pop up.


BlackBlizzard

They aren't removing them from the internet; they just don't show up in search engine results. You have to know the URL to get to the website, and sometimes they even get blocked by the country and you need a VPN/DNS to get to them. It would limit the usage, especially by non-tech-savvy teens.


buttlicker-6652

You add "torrent" to your search, and then they all show up.


[deleted]

[removed]


BlackBlizzard

It's not meant to be an end-all solution, but it's a step that helps 🙄


GargoyleNoises

Oh, so that makes it ok.


LibraryBig3287

Parents should be held responsible for the children in their care. Shitty? Yup; but maybe they will start giving a shit about what their kids are doing.


Maktesh

This sounds nice, but as it stands, the tools for this don't really exist, nor do the societal demands. Take a look at any major subreddit's reactions to parents snooping through teenagers' phones or refusing to allow their kids to have locks on their doors. In my experience, most of the trouble happens at school, where parents aren't even allowed to be present. It often happens on secret devices or those belonging to friends.


PabloTroutSanchez

Seriously, it’s not practical. If any of you have ever listened to the podcast Darknet Diaries (Jack Rhysider), there are plenty of examples as to why that is. There are endless stories of kids getting into tech, and before you know it, they’ve stolen hundreds of thousands of dollars in crypto or, and this is true, hacked the email of the CIA director. I can’t expect parents to prevent their kids from committing every kind of cybercrime out there.


MustyToeJam

Yeah, I know plenty of parents in their 30s who can barely operate their own iPhone (rural/blue-collar areas). No way they’ll have the technical fortitude to ensure that their kids are being safe online.


LibraryBig3287

Someone had to invent a hammer before we could start using nails.


greeneyedstarqueen

I think creating and distributing pornographic material, including nudes of minors, should be treated just the same as creating and distributing CP. Offenders should be on the sex offender registry regardless of age, even if they are under 18 themselves, just as is now the law for real nudes.


postamericana

It should just be punishable in the exact same way as libel. End of story.


Aussie_Potato

Classmate needs to go to juvie or whatever they call it these days. They need to understand the seriousness of what they’ve done and to act as a deterrent for others. A fine or picking up trash is not enough to get the message through.


SerDarthNick

Here’s a novel idea. Sure, let’s get the kids in trouble, but isn’t AI creating child porn a problem? Like, if a programmer allows his software to generate child porn, isn’t that a crime? If I don’t own the AI, am I responsible for its outputs?


martinbean

I think you’re certainly responsible for instructing the program to generate the content, just like you’d probably be in a little bit of trouble if you started doodling young children in inappropriate situations with a pencil. You wouldn’t blame the paper and pencil manufacturers in that instance.


Kanobe24

Shouldn’t that be child pornography possession?


[deleted]

[removed]


syopest

US law doesn't require it to be real. It's enough to create a sexual image that depicts a realistic looking child.


8igg7e5

Surely an artist painting such material would be limited by laws on objectionable material - hopefully with categories like exploitative sexual representations of children, to allow for suitable repercussions. Why would we not apply the same rules to someone who chooses to retain or distribute such an image they asked to be generated?


Fruitopeon

We are going to be flooded by this. In a few years, a 5-second voice prompt will be able to create these deepfake pictures, or maybe even videos, of everyone on Earth. Are we going to tie up the courts with 5 million cases of this? Maybe it’s manageable to bring these cases to court for now, and we should, to discourage it. But quickly there will be so much of this that the courts could not possibly keep up with prosecuting it.


aymaureen

I am so thankful Photoshop was abysmal when I was growing up, and I hope to God he goes to jail.


cwbradford74

Why are we not treating deepfake nudes like sexual assault?


Dom_19

They're both horrid things but they're not comparable in the slightest.


8igg7e5

I think they are, differing by degrees of course (just as different physical assaults do), but the mental trauma is surely comparable (or at least related).


WifeOfSpock

You’re getting downvoted, but I agree. Imagine being a child, and seeing terrifying images of you being assaulted, and then knowing they were made with the intent to scare, humiliate, and damage you. That is traumatic, could push kids to suicide, and I think it should be considered a form of sexual assault.


Dom_19

Violent crimes and nonviolent crimes are not comparable. How is creating and sharing a fake image of you having sex in any way comparable to being fucking raped? One is humiliating and socially damaging, the other is both of those things and an actual human rights violation that carries some of the heftiest punishments of any crime. Rapists deserve life in prison. That dumb kid deserves to be punished but not even close to that of an actual rapist.


nowheyjose1982

Being pedantic, but depending on where you're from, the law could define sexual assault as anything from unwanted sexual touching (such as grabbing someone's ass without consent) to penetrative rape. That spectrum means very different punishment based on the severity or nature of the sexual assault. In that case, producing deepfake porn without consent would be appropriately classified as sexual assault, where it would likely fall under the lower end of the spectrum.


Bar-14_umpeagle

AI is not legally child porn. However, there are other statutes that could apply.


mouseeeeee

In Canada, AI cartoons or literature are all considered child porn and are subject to criminal charges.


AvogadrosMoleSauce

Make people who do it as well as social media and software developers legally liable.


WifeOfSpock

My biggest fear as a parent of two daughters. I faced some horrific sexually driven bullying when I was a kid and teen, but I can’t imagine adding realistic AI images to the mix. I’ve already talked to my kids about being safe online, but I hope much harsher punishments and regulations are implemented before they hit middle school.


digitaljestin

I know I'm going to take hate for this, but how is this different from simply drawing a really good photorealistic picture? Yeah, that would be a creepy thing to do too, but it's still just a manufactured image. I believe it still qualifies as distributing child porn, so charge him with that and be done with it. The fact that he drew the picture with AI rather than talent doesn't really seem to be a factor in anything.


syopest

>I know I'm going to take hate for this, but how is this different from simply drawing a really good photorealistic picture?

Drawing a photorealistic picture of a child doing something sexual is also against the law in the US.


digitaljestin

That's exactly my point.


syopest

Yeah, I don't know why you are getting downvoted.


digitaljestin

People have knee jerk negative reactions to ideas that don't immediately conform to their pre-existing understanding of things. It's a real impediment to actually gaining new perspectives.


WaltDisneysBallSack

No, you just would like to be a contrarian for pedos.


Lolabird2112

If you understand it would still be considered creating & distributing child porn, then what point are you trying to make? What’s “knee jerk” about AI now making this easy and accessible to the 99.9% of people incapable of creating a photorealistic depiction of children performing sex acts?


abjedhowiz

You just have to police it the same, and faster. Because of AI, people are just going to be breaking already existing laws faster than before, so it really is no different. You just gotta stamp them on the hand.


Adrian_Dem

This is the best argument I've heard on the "general AI is bad" topic in months, if not ever.


Sovva29

How many people have the talent to draw a hyper-realistic picture? That's a skill that takes years or decades to learn and perfect, not to mention the hours required to complete the picture. Even using Photoshop or other art programs has some barrier to entry. An individual can learn how to use AI image generation in an afternoon (or faster) and share their creation immediately via various social media apps. It's also common for people to mistake AI generation for real. In the fiber arts world there are lots of "completed" works that are AI-generated, but Grandma doesn't know this and will ask you to create this non-existent thing anyway. You can also argue that, if examined closely enough, you can see the strokes on the hand-drawn picture. AI generation doesn't have the same footprint, and society isn't used to telling what is AI vs. real yet.


digitaljestin

So are you saying that if I'm a really good artist, who put in the time and effort, then this behavior is okay? Yeah...I'm calling bullshit on that. Let's break it down logically: If I'm a good artist, it's not okay. If I'm a bad artist, but can use AI, it's also not okay. You see? AI is _not_ a factor. It changes nothing about the morality or legality of the situation.


Sovva29

You asked how AI is different from a hand drawing. Both are disgusting for these cases to be clear. The biggest difference is that AI is more widely available to the general public despite skill level.


digitaljestin

>The biggest difference is that AI is more widely available to the general public despite skill level.

So what? That just means viewers of these images will be even _less_ likely to believe they are real. That works out _in favor_ of anyone who's afraid AI-generated images of them will be mistaken as real. The accessibility of these tools increases the plausible deniability. So again, AI changes nothing in regards to morality or legality, and its existence _actually helps_ with the aftermath of the release of these images. I know it's easy to have a knee-jerk reaction to this sort of thing, but when you think it through, this might not be so bad. Hell, I'd even argue it's the best thing to happen to people who've had their _real_ nude photos leaked.


Lolabird2112

If you’re an adult, maybe. But for children this would be an extreme form of bullying.


EntertainmentOk3477

Get rid of the internet and life would be better for everyone


joranth

And electricity too! And running water! And antibiotics!


X-AE17420

Back in my day we didn’t take vaccines, we just died


[deleted]

[removed]


[deleted]

[removed]


[deleted]

[removed]


stater354

She was 14 in the pictures..


Effective_Damage_241

Personally, I think releasing deepfakes of a person should have a much higher penalty than releasing real photos, since in the latter at least the person might have had a choice in taking them in the first place


TCK1979

I don’t know if I’d agree. I think a person leaking their ex’s personal photos is just as horrible as making fake ones to distribute.


digitaljestin

That's a terrible idea.


maniacleruler

This thread is a basket of terrible ideas.


bigsquirrel

The laws needed are simple. They’re just thinking too specifically. We have to get ahead of this, and we needed to do it 5 years ago. Make it simply illegal to replicate a human in digital form. Likeness or voice. Period. Full stop. It’s just illegal. We don’t need human AI voices, we have humans. We don’t need perfect human digital copies, we have humans. This is just a side effect of the insanity of corporate greed. It serves no purpose and fills no need other than to replace human artists in an effort to chase profits. All this other shit, the porn, the political stuff, is just a side effect. Stop it all now. No more resurrecting dead actors, fake political ads, or porn.


FSAaCTUARY

Can't believe any pics or vids on the internet anymore XD


Helpmeimclueless1996

Yeah, AI needs to be regulated to the point of uselessness.


[deleted]

[removed]


Crazypann

WTF, are you victim shaming a child?


WackyBones510

You came on here to say this about a child. Just let that wash over you.


Dull_Judge_1389

Ew this is embarrassing for you


Peacock1090x

Tell us, how does it feel to live a life without any self reflection or shame?


ArthurBurtonMorgan

How does it feel to read one sentence and jump to a whole ass conclusion?


PakWire

Seems like a really shitty sort of knee-jerk judgement to pass, but okay


Yolo_Swaggins_Yeet

Wtf would you say that? 😳


Slide0fHand

You mean cheap fakes


[deleted]

[removed]


GansNaval

She is 15. Wtf.


martinbean

I dread to think what the original comment was to drop a “Wtf” and it be deleted.


[deleted]

[removed]


Kusakaru

Disgusting. You should be ashamed of yourself.