Deep fakes in general are a very scary concept because they have gotten so realistic. Even getting away from the sexual stuff, deep fakes can still ruin someone’s mental and reputation with how real they have gotten. What is extra scary is how good some private tts companies have gotten at copying peoples voices
I discovered I was deepfaked from this whole situation, and honestly even though I'm hurt I also feel for the sex workers who had their work stolen too. Both me and those SW who they put my face onto were sexually exploited and sold without our consent by someone we don't even know. Thanks for covering the situation, hopefully more people with be careful to make sure all of the sexual material they use was consensual.
@@yabo5131 excuse me? so you’re saying that women should be made into deep fake porn without their consent and they’re not allowed to complain about it because they are “actual” sa vicitms? yeah dude both these people can exist. if your idea of “consent” only relates to physical sexual assault that’s your problem
I was very innocent in high school. I never taken a nude or even made out with anyone. Never sexted. Yet, there were multiple people that were convinced I sent someone a nude. Turns out, it was some other girl that looked similar to me. The guy lied about who sent it. I have no idea why. It is an awful feeling to be judged for sending out sexual content when you never have. It is such a violation. I couldn't imagine being in the situation as the streamers. What they are going through is in a much larger scale. Truly horrific.
It shouldn't be illegal to publish it as long as it is very clearly marked as deepfaked and not real. If everyone knows its just an edited image then it's no different than creepy fanart, it's when everyone's convinced it's real that it's a problem.
@@Raptorworld22 No, it's currently classified as revenge porn in the areas it's currently banned, and I think this is a fair classification. There's a few reasons this is the case: 1.) Well-made deepfakes are indistinguishable from real photographs, and this is being further solidified by Dreambooth technology. It's worth noting that CP "art" gets charged as actual possession if it's modeled after real children or is too realistic. 2.) "If everyone knows it's just an edited image" requires every single point on the distribution pipeline to clearly mark the image/video as deepfake. This origin can wind up obscured or lost, either accidentally or intentionally, resulting in people believing it's real. 3.) The intent of deepfake porn is to obtain sexual content of someone who hasn't produced that content. You are bypassing their willingness to take naked photos or record videos of them having sex 4.) This is a thing men don't really understand, but most women are straight up HORRIFIED by the prospect of being seen nude by people they aren't intimate with. I for one had an incident where an ex landlord put a camera in my bathroom, and 6 years later I still have severe anxiety and trust issues from it. 5.) A common victim-blaming tactic men have towards women who have their nudes leaked is "You shouldn't have taken them." With deepfake porn, a person's choice to have nude photos exist of them is completely removed. 6.) Women have to deal with constant sexism and sexual harassment in their lives. Our ability to choose who can see us at our most intimate is one of the ONLY things we can control in that regard, and it's being taken from us.
@@cargyllion 1) Didn't know that, but if this stuff really is so accurate to real life now, then I am changing my stance and agree it should be illegal. 2) I never said my idea was practical, but in a perfect world that is what would be done. 3) This point doesn't make sense to me, as it's no different than creepy fan art in that case. Do you also think we should make drawing people without consent in explicit context illegal? Actually curious what your stance is there. 4) I don't know what men you're talking to, but every man I know would be equally horrified in this context, where everyone knows/thinks it's their body and it's without their consent. 5) Isn't applicable since my original point was if *everyone knows it's fake* 6) Having been a woman for more than half of my life, I know you are talking complete BS in that first half, unless you live in Saudi Arabia. Overall, I am changing my mind and agree that this very convincing deepfake stuff should be illegal. Different opinion on fanart and terrible photoshop jobs.
@@Raptorworld22 The current situation with Dreambooth deepfaking is that anyone with an Nvidia gpu made in the past 6 years and 20 or so images of a target individual can generate a model capable of replicating that person's appearance. It can generate you nude, in an SS uniform, wearing a Jace Beleren cosplay, whatever you want as long as you fed it reference images. And it's scary accurate. Literally the only saving grace right now is the current training models can't parse complicated tattoos and facial piercings. However, there's a large community of people working to fix these issues for the expressed purpose of generating AI porn.
someone made a deepfake of my cousin committing sexual acts on a minor and it almost completely ruined his life, it was sent from his facebook to all of his friends it was awful
Aside from the tremendously creepy aspect, it's good that you brought up how deepfake porn could potentially be damaging to the careers of the people being deepfaked. Think about it; the wrong people see something like that, like advertisers or investors, they might get fooled by it and decide to cut business ties with the people getting deepfaked. In short, just imagine your favorite content creators getting blacklisted due to sexual content that they never even produced; that's horrible.
This is a good concern, but I'm actually annoyed that companies even care so much about people having sexual content. As long as it's somewhere children won't accidentally stumble across it, there really shouldn't be a problem.
@@impishlyit9780 100% agree with ya. Just because someone's made or performed in pornographic content/media doesn't mean they should be blacklisted. It's a paycheck. Nothing more, nothing less. So long as all the people viewing the adult content are of-age themselves, everything's fine. Man, I got sidetracked on this one xD, but it's an important topic too.
Even if they aren't tricked by it personally a brand could probably be convinced to drop someone just because it exists at all especially if it gets popular enough. Real or not a brand that's promoting say kids content or trying to have a "family friendly" image may not want to be associated with that. I don't understand how it's legal anywhere given these companies should have no right to monetize your likeness anyway. Really gross situation and technology keeps advancing faster than we can plan for the ramifications of it.
@@josephschultz3301 agreed. Here where i live, a female teacher broke up with her husband and he, as revenge, shared online some porn videos they made together (that were private, they never posted them anywhere before, it was private videos for themselves and pics he took of her naked). Turns out she got fired from working as a teacher and blacklisted from every teaching career in our country, while he only got like 5 months of picking up trash as community service for doing something illegal. AFter those 5 months he can get back with his life (of course HE isnt getting fired nor blacklisted from any working job even if he was in those videos too, only women gotta be slandered for sexual things) meanwhile her life is ruined. Some idiots still insist on defending him and blaming her. Our country had a lot of protests about this.
What scares me the most is the potential for cyber bullying and the fact that I can predict with great certainty, that this will be used by school students to bully other kids
Only for this generation, the next generation of kids will grow up not trusting video / audio. Out of all the dangers AI will bring, this is one of the things you should be least worried about.
This been around for decades bruh.. back in the 00's it was photoshopped celebs. They've always found ways around it & unless actual federal laws are made, this ridiculous stuff won't stop
the biggest shock for me with this whole situation is that this wasn’t already illegal. there’s already laws and defamation lawsuits surrounding using someone’s likeness in anything without their consent, but for this kind of material is sickening.
Exactly! And then you have guys like d*stiny frying their brains out to try to understand why deepfakes are bad 🤦🏼♀️ when is just another form of defamation!
@@llama6394 it should be, lol. There have been cases in which celebrities sue companies because they use their image without their consent and/or against their will 🤷🏼♀️ but idk, the internet has normalized trasspassing personal boundaries to an extent where people are just mere objects...
@@llama6394 current law, sort of? Best example is with the old man “hide the pain Harold” or whatever that meme’s name is. His image, especially for those photos, are under copyright like any other stock image. So we do know that memes can be under copyright and can’t be used for profit or undermine other’s work. There’s also celebrities suing games, animated shows, etc. for using their likeness without their knowledge. However those are higher profiles figures, and common streamers would have a harder time fighting that in court. There’s also AI involved in deepfakes, which we still have no laws governing, and other issues of giving streamers protection from these websites. Honestly, one of the better things to combat it might be a libel/defamation lawsuit, since this basically is faking a porn tape for someone. And that’s outright stated under defamation as an injury of character and easily would at least have a trial for it.
It’s because the laws have to consistently adapt to technology so they will always be behind. We unfortunately won’t be able to experience a society we’re the laws will be ahead.
yall never had a hot friend that u wanted to sleep with but couldnt? idk why everyone is freaking out over this. everyone has their own private fantasies or curiosities. ive looked at some really really crazy stuff online just because. its not like atrioc was personally creeping over them, it was a solo curiosity that he accidentally showed.
The depressing truth to all of this AI deepfaking is that there’s creeps on the internet that can easily make porn of popular minor celebrities and that alone sends chills down my spine
Even if he found the website randomly which is very unlikely, the fact that he didn't notify his fellow coworkers and kept it to himself is pretty stupid of an excuse.
@knockout Well in order for him to know it was a deepfake site, it would have to be advertised as such, and presumably it would also advertise being able to fake his coworkers. Also I'm pretty sure the hub doesn't allow deepfake content so it makes no sense they would allow those kinds of ads.
To his credit, that’s a pretty awkward convo to have. “Yo bro, just letting u know I found this online service where it deepfaked you onto porn.” “How did you find it” “yeah, I was down bad and clicked on some ads, paid for some website, and beat my meat to it. Sorry for not telling you sooner.” “Bro wtf” that’s only one possibility of course, but ya see where I’m going with this? After the action is done, kinda hard to bring it up casually.
I remember less than 10 years ago, local stories about how a teacher lost her entire teaching career because when she was younger (and she was still young when they came out), she did adult content that some of her students eventually somehow stumbled upon online. That was from someone doing adult content for real. Now imagine if highschool or college students have the power to just deepfake a few convincing pictures of their teachers in adult content. They could ruin their career and reputation because even if it's proven fake, school establishment don't like to be anywhere near that kind of stuff and the content will probably remain available somewhere.
@@experimentalghoul3540 I’m not sure what you mean. Like anything else it exists because there’s a market for it people will pay for a service that lets them see their favorite celeb in adult scenes.
Remember back in high school when rumors would spread about people? This is the ungodly, coked-out, and roided-up evolution of that concept. This is fucking terrifying beyond comprehension.
Yup. The ramifications from deep fakes are large. Between mental and financial, you got legal issues. Think about how someone can be deep faked to cover up a serious crime
The ethics of AI technology and deep faking are very concerning, and not just this particular situation. Not being able to distinguish reality versus fake reality is a slippery and dangerous slope.
you know as a woman i’m so glad to see such a prominent male figure in the community speaking out against this. i’ve seen a lot of women being disregarded and belittled for saying the exact same thing and as shitty as it is, it’s nice to have someone to back us up.
Agreed, charlie and this community are what often restore my faith in humanity to be honest. Sometimes I'm really scared if no one in this world actually have decent clarity and conscience for everything that I've seen in my life. And it is MEANS A LOT for me to know there are peoples like charlie and his followers out there with sane minds and willness to speak up for those who need it. Charlie if you see this, you are impactful and honorable. Thank you 🙏
I mean when every girl i see often does sexual stuff, it got me really desensitized... i know it feels bad but when it's like pokimane and all, i cant even flinch
HealthyGamer_GG did a deep dive on this today that goes into some science about how nonconsensual porn affects not just the victims, but the consumers, and one of the chilling parts of it was that the deepfake stuff is behind paywalls because it's more addicting, because they're attached to an existing powerful parasocial relationship. It was a powerful take and I hope everyone sees it.
Isn’t all porn non consensual? The people participating in the act consented but there’s no way that they consent with everyone who’s watching, and as a passive observer i doubt that anyone is mentally checking off if it’s consensual or not anyway. As far as the individual is concerned they are partaking without participating and that is often a part of the harm done by porn.
@@sparks6177 people who make porn willingly and post it publicly are also consenting to having pretty much anyone view it. It’s comparable to UA-camrs and actors where they sign on to have anyone who can watch their content, it’s part of the deal. Steamers do not consent to having porn made of them because, well, it’s pretty obvious why. They are not at ALL the same.
it’s sad that it has to be pointed out that this type of technology can be also used to frame people for sexual assault or other heinous crimes for people to actually see an issue with AI being used in such a way.
Fun fact this won't happen since the ai that creates the deep fake is also able to detect deep fake as it trains two ai one for creating fake ai and one for detecting fake ai to improve it. So no it's not going to be possible that it would be used for blackmail
i think the crazy shit no one else or charlie touched on, is the idea that they can use deepfakes of minors and young celebrities or actors. theres probably already some floating around on telegram i bet. dont doubt it one bit. smh
I am really worried about this, especially in terms of crimes and trials. It has become so easy to frame someone of doing something gross which can be used during a divorce or just something illegal. Even if some IT specialists can prove that a "photo" or a "video" is a deep fake, it still can be so devastating for a person even if he didn't lose a trial. Drama usually stays for eternity in one's reputation. What I'm trying to say is AI faking action of real people can run out of control extremely easy.
So the reason why thst won't be a huge problem ris because it need thousand of hours of video and sound to make it even kinda believable so no normal person would be able to be deep fake only people in front of cameras often and if you in front of a camera often you have easier time to prove your innocent
@@bobberry1463 even if that’s true, anyone who has hours of content of them - with, or WITHOUT their knowledge - even just a few solid minutes of footage / other stuff and you can still do this. there’s been deepfake porn of people who are pretty chaste online (like billie eilish). and i’m sure people who are looking for that kind of thing won’t care if it’s poor quality or not as long as it’s close enough for sure it would be harder for people with less footage of them up / less public jobs, but even still, it’s better to protect everyone - silly example but genuinely: but does flo from progressive not deserve the same amount of privacy as any of us? just something to think about. 100% always better to make something more safe than less safe.
@@justasmltwngir1732 get the fuck outa here. You're gonna tell me you never flicked the bean to anyone besides the pornstar you're watching? Oh what you don't watch porn? Naw let's not even stop there you don't even tug em out right? Do you hear yourself. Everyone that masterbates has their "inspiration" cause you sure as hell ain't jerking it to literally nothing, regardless of whether or not you watch porn. And my reply to op, I would know it's not someone I loved, so no it wouldn't bother me one bit. That said in my opinion deepfakes should be labeled as a deepfake and not tried to be passed off as real.
@Edward Deepfake porn has gotten so advanced that it's quite hard to distinguish real from fake. Without proper evidence to disprove the deepfake, it can easily be used to slander. Although I do agree that deep fake porn of celebrities and social media personalities will always exist, that's just the price of fame. But what is disgusting is that it was of his friends and even his friends girlfriend.
@Edward everybody has fantasized if even a little about somebody attractive in their life but not everyone seeks deepfake porn of them. That shit is definitely weird man
im glad that someone with a large platform is talking about this issue. ive seen this exact thing happen to many female streamers, but because they were smaller, nobody said anything
This is technology that humanity simply isn't mature enough for. Imagine videos of crimes and the like being doctored with deepfakes in order to falsely imprison people. That's one of the first things that came to my mind when I was first introduced to it.
It just makes me sick how so many miserable people spend so much time of their day making fake videos and cheap deepfake animations on people. They deliberately do it to harm and Destroy peoples reputations.
And I don't get why they are so desperate to see somebody uncovered in the first place, acting like middle schoolers who have never seen a woman's body before
it was 100% clear from the second the technology started to exist. look at games from 1998 and today. it was inevitable that this technology is advanced as this.
I was once recorded without my consent in a sexual manner and was blackmailed. Even though it was my body I completely empathize with all of these streamers. Because I was never physically hurt, but just the act of seeing me (and in their case, their likeness) being shown without my consent was the most damaging shit ever.
@@Elfenlied8675309 that’s a logic fallacy just because something has been the case for a long time doesn’t mean that there’s evidence stating it should continue to be
@@Celeborn93 Absolutely good comparison. We have to think of black and white in these situations. Is this good or bad. It is bad. There cannot be middle ground especially when those whose likeness is being reproduced explicitly and implicitly stated they do not want to be used in that way.
Deepfakes have always irked me since I first heard of them a few years back, and knowing the technology for it is only getting better, that bad feeling isn’t going away. There’s so many ways it can be used in the wrong hands- blackmail, revenge porn, straight cruelty by sending it to friends/family, getting people fired, using CHILDREN, etc. It deeply, deeply unsettles me. The response to this whole streamer situation greatly concerns me, as well. These women are being openly objectified, having their privacy invaded in one of the worst ways possible, and some folks out here are so quick to brush it under the rug. It’s a shame.
You’re 100% right here!tbh technology is scary af nowadays between the ai art the deepfakes and the ai that emulate people’s voices it’s definitely risky af and could lead to a lot of horrible shit happening and it sucks seeing people being taken advantage of and having this horrible shit happening to em men women and children alike can be victim to ai deepfakes of all kinds including adult content. The thought alone makes me feel sick! I can’t imagine what the victims in this must be feeling and going thru tbh
I remember this one adult actress who had been affected by DeepFakes being applied on their content speaking out about this, it was something along the lines of “You’re taking something that I have produced for people to enjoy and you’re twisting it into a situation where the other party has literally no consent in the matter.” Adult DeepFake content is just such a strange thing to me, on the one hand I can to an extent admire the technology, but we really should be drawing the line at DeepFake memes for comedic value, cause when you cross into the territory of things being practically undetectable to the naked eye, it can genuinely ruin people’s lives.
Honestly it's hard to say where to draw the line. Comedic value is subjective and there can be stances within memes that people don't want associated with their faces. Even the film industry has morality issues about deepfaking. Good quote btw.
@@MammalianCreature I think it was in one of Shane Dawson’s old conspiracy videos on Deepfakes, long before all The weird shit about him came out. i believe it was his video “Conspiracy Theories with Shane Dawson” from 2019. I’d say the DeepFake section of that video was probably the last interesting thing he made tbh.
@@craigyeah1052 I say in memes for instance like what Corridor Digital does, they had that Keanu Reeves stops a robbery deepfake that was quite impressive, or like when people do stuff like deepfake nicholas cage into random scenes. I agree though, it’s kind of hard to tell where EXACTLY to draw the line with this kind of stuff.
@@iloveplasticbottles Yea just because you think deepfakes of you aren't a big deal, doesn't mean it's the same for others. People have varying opinions about this kinda stuff. And it's best to be on the safe side and never make deepfake porn in the first place. And if you plan to do so, just make some of yourself 😒
It's so crazy to me that some of these guys are defending it exactly why other men look bad. I'm not really surprised that most of the victims are women. They abuse the use freedom of speech and the amendments rights to justify the most digenerate thing ever.
It's weird to me that there are some folks out that you have to explain how morally unacceptable it is to exploit people on the internet with deep fake porn. Is their moral compass broken? This is about respect towards other people and consent.
@@aurea. I feel sometimes like the whole internet is full of soulless sociopaths, but then I remember a lot of people on the internet are teens thinking it’s cool to “not care,” if you know what I mean.
It's what happens when viewing tons of porn is considered normal especially for guys so there's nothing stopping them other than themselves in how far deep they go
"consent" Do you complain about comics making fun of specific people? They certainly don't get "consent" to rail on about people. This is so delusional, because it assumes that these celebrities are actually being affected (exploited!!) by the deepfake industry. I like how everyone has just ignored that this has literally flown under the radar until this weird incident by a streamer. So much proof of it having a real tangible affect on celebrities in general that it took a random giant streamer to have clicked on an ad to suddenly bring this to internet wide attention. There is real criticisms to be had like the companies labeling the porn as REAL!!! and it being charged for, IE, making money off of someones likeness/copyright without their consent, but the idea that porn deepfakes are inherently themselves *evil* is actual karen behavior. These do not have actual effects on anyone, except in the small exclusive cases of direct blackmail, and even THAT situation is bullshit because there's also tech that detects deepfakes that is constantly evolving alongside AI tech, so any faked blackmail is usually very quickly detected as fake and thrown out. Charlies example of "muh companies pull the trigger like winnie hut junior" is not a problem of deepfakes, but companies being pussies, and one that should be ironed out. I'm sure a few cases of potential "leaks" turning out to be fake resulting in triggerhappy companies looking like dumbasses will curb this behavior.
Dude. My biggest concern is how this could be used to frame innocent people for crimes. The state or a malicious person could take someone down by just making up video evidence of them committing crimes they never did. If this happens a lot, we could end up in a world where video evidence can no longer be used in court (unless we find a way to better identify AI deep fakes).
Security camera software could add features to grant access to footage (where it is viewed on its official website/app), that way it can't be fabricated
I could be wrong but wouldn't you still be able to use metadata? Phone/camera manufacturers could use some sort of verification system to confirm whether a video was real. Obviously a lot of videos/images would no longer be valid but I don't think its entirely a lost cause.
i had a similar experience. i was only 16 and one of my friends (20 at the time) spread around fake screenshots of me saying sexual things to him. it was all for a “joke” but people genuinely thought it was real, and he played into it. it’s such a disgusting feeling to avoid touching anything remotely sexual as a woman, only for men to decide you don’t get to choose whether or not you’re considered sexual. they turn you into an object for their own pleasure so easily and it’s terrifying. it makes you feel like you don’t have a right to your own body anymore.
It's scary because anyone could suffer from this, imagine you have a crazy ex who wants to ruin your life and deepfakes you in that type of scenario and sends It to everyone who knows you
@@llamarelish4701 cool it with the misandry, being a man or not has nothing to do with it, im a man and i dont agree with him lets not generalize a gender
My biggest concern is when people begin deep faking individuals doing illegal things. Someone doesn't like someone and so they deep fake them doing something horrible to a child and it's hard to disprove in court. On the flip side of that, once it becomes well known enough, deep faking becomes a serious issue mudding photographic evidence, suddenly it's hard to prove someone who is guilty based on photographic evidence because they can claim it was deep faked.
"deep faking becomes a serious issue mudding photographic evidence" Yeah, but that's not necessarily a bad thing. There's a lot of innocent people in prison, so having more prosecutions fall apart because of the dubiousness of video evidence could be a positive thing.
Already happening. We have seen "algorihmicaly enhanced" footage admitted in court trials. Even if it's not purposeful deepfakes, it's still extrapolated data, something that never actualy existed
i'm glad charlie understands how disturbing it actually is, because i've seen some youtubers say that the only problem in this situation is that the guy watching deepfake porn didn't do enough privacy measures and that watching this stuff is okay if you're not talking about it publicly.
I think the nuance that it only matters now that it is more realistic, where as it was more humorous (at least the ones he had of him) before is a bit wrong though. It is absolutely worse for the person being faked, as it's harder to deny and refute that it's them, and would make them feel far more invaded. But, for the creator, and the consumer it is exactly the same. The intent is the same whether it looks a bit off or not. They are still trying to create an adult video that the person did not consent for.
i mean...think about it...if nobody knew, what would the ethical issue be? Like, for example, would it be unethical for someone to draw a picture of a celebrity naked, and then jerk off to it? I feel like people just love shaming others when it comes to anything sexual cause they're desperate for people to think that they're not the weird one, when in fact, we're all weird.
It’s gross, especially cause it’s not even limited to celebrities, influencers, streamers, etc. Even average every day people are getting deepfaked and it’s especially sad because with celebrities, we can assume it’s not real, but these every day people having their name and faces connected to these videos- recruiters and hiring managers aren’t going to go that extra step to see if it’s legitimate. It usually isn’t until people get sent links to the porn videos that they learn what’s being done.
I mean, I agree deepfakes are pretty messed up. But don't people draw stuff like that too? How come people haven't been talking about that? Sure it isn't the real person, but their likeness is still being used and they're still being objectified
I mean… the idea of even having those kinda of tabs open while streaming is mind blowing, it’s like going to a public bathroom without locking the door.
It's like going into a public bathroom stall to pull down your pants , insert a banana, then bend over and leave the door open expecting no one is gonna see that goofy shit.
When I do stream (rarely), I make sure every bit of software that isnt related to streaming is closed, not minimized. I'm not ashamed or into anything illegal, just like, even accidentally flashing OBS bothers me. then after stream I do a full restart to make sure obs itself is off. Too safe? maybe, but better than this guy
thank you for talking about this aspect of the issue. I've seen so many absolutely horrible takes of people saying "its not their bodies" "normal people would get over it in a week theyre upset for clout" "its weird and creepy but not harmful" and its absolutely infuriating so hopefully some people will see this and understand the gravity of whats been done. Like Charlie said: imagine you posted a selfie once in your life and now someone took that and put it on a body--not some animated 3D thing, a real convincing looking body--thats not yours and is its doing something extremely pornographic. A full sex tape. Now imagine your parents have seen it. Your friends. Your significant other. Your kids if you have them. Your boss and all your coworkers. If someone looks for your name that video comes up. People might ask you to do porn or say explicit things to you now. You are treated differently. All of this, completely without your knowledge or consent, just because someone thought your selfie was hot and felt entitled to make porn of you to sell for a profit and for their own pleasure. Edit: I want to clarify that the example I used here is a hyperbole that I only used to try and help relate the situation to a more "normal" level. It takes much more than a selfie to replicate and deepfake a face so I don't want anyone to panic or be fearful, that was never my intention and I'm sorry if it came off that way. I was simply hoping to provide that perspective to help some people who might not fully grasp how seriously this can harm someone who doesn't do sexual content and did not consent to it
@@protomato6427 Difference is, porn actors signed into it. They're aware this is happening or could happen to them because they're literally the ones in the videos doing it. If you're not a porn actor, you wouldn't want to be treated like one out of nowhere. Of course, there are still some stuff that people say and do towards porn actors and sex workers in general that they shouldn't have to put up with.
This is such a disturbing idea. And there's also the idea of "the Internet is forever." Even if every site and video hosting this content is blasted, there will be a few videos and images saved and reuploaded infinitely, or just some that get missed. It really shows people cannot stand having no access to every single part of a person, even the most intimate things. It's so gross how if you aren't sexual, there may be someone out there DETERMINED to make you so, or at least in ways you are not comfortable with. No man, woman, or whatever should have such intimate content made of them without permission. Thanks for talking about the general topic.
Almost as if the internet was always full of weirdos and it never was a good idea to share personal information online. Luckily for them, social media threw all that caution out of the window.
It's actually more amazing how quickly information decays on the internet. Efforts by ArchiveTeam and Internet Archive show just how true this is. On random hard drives, sure, but as far as the actual internet... no. "The internet is forever" only happens with a lot of effort.
Internet + narcissistic Humans = I hope this current journey of my soul ends decades BEFORE AI rises and possibly (?) obliterate Human existence as we know it. **my soul needs a fresh Human vessel to circuitously witness Planet Earth's end times.
The Lost Media community knows how untrue this adage really is. Most of the time, deleting a video off the internet pretty much removes it from human knowledge, no matter how much people try to get it back.
I always thought AI would take over by human looking robots, not deepfake porn. It's seriously sad that this shit happens. It's happening on Instagram too. I think this could also shed some light on porn addiction and the dangers of it. What I'm really worried about is creeps using this with minors.
Glad you mentioned the dangers of porn addiction; I'm an addict myself (hentai) and I'm trying to find some productive ways to distract myself from feeding it.
@@soaringsquid0.129 good luck. I know other porn addicts that had success with crocheting, chainmail (metal not emails), and miniature roses as house plants. Just need to keep your mind and hands productive.
@@soaringsquid0.129 Well, you better not take any looks at AI porn made with stable diffusion and stuff... now you can produce hentai shit with just a click
This comment section is making me both enraged and dissapointed by the second. There’s plenty of sane people here, sure, but why can’t everyone understand that other people don’t want to be depicted in sexual situations? These kinds of disgusting slime are better off dead, along with pedophiles and other kinds of non-human filth that think consent is just a suggestion.
@@jerome1lm It's no longer a "fantasy" if you choose to make that fantasy into a material entity (deepfaked video) which can be seen by anyone, not just the fantasizer. Seek help.
Was just talking about this with my boyfriend. I can’t imagine finding P of myself online without consent, or how it would make me feel. Everyone calling these women sensitive for being in this situation is heartless.
But they're NOT "pics of you" they're an AI generated guess. Either way, you shouldn't be sending pornographic images to other people if you don't want the chance of other people getting a hold of it
it's scary how much AI has progressed. if people can do this, I don't want to imagine what we can do in 20 years. you could frame someone for terrible things without taking a step out from your home.
Just a couple of years ago I remember trying some program that could say a few sentences in the voice of Homer Simpson, Squidward etc and it was still way off, now in 5 minutes I can make Oderus from GWAR recite bible passages and the realism is only determined by what audio clips I use as reference and by playing with the speech settings. I hate to see where this will be in just another 6 months.
IMO it'll reach the point where all videos and recordings will stop being considered as legal proof for anything because of how easy it is to fake them. I doubt you could end up sending someone to jail with a deepfake, but it'll still be problematic in terms of reputation and public opinion as people are easily swayed in social media.
When I was younger, I wanted to become a programmer and create stuff that's exactly like deepfake. Now that people have done it before I even had a chance, and I've seen what people have and maybe will use it for, I wish that it never had existed. Man, I sure love technology.
I actually had never known this was even a thing. This is kinda crazy with how it can’t be distinguished easily and can be used to ruin peoples reputations
Honestly, down the line it could get so bad at some point that even when managers do end up finding deepfaked porn of the person they’re about to hire-real or not-they probably just won’t feel inclined to give a shit. Like, it’ll happen so many times to anyone that they’re about to hire that it doesn’t even faze them anymore and they’re like “ah, ok, let’s just take a look at their resume” If it does happen, revenge porn would basically lose its purpose and would just go on to be a regular nuisance that barely anyone bats an eye at or cares about because “yeah, yeah, they’re not the only one, y’know”, that itself would be kinda satisfying Either that, or officials are probably gonna find some way to make deepfake porn production into an executionable offense
dont share info on the internet lol its your fault when people take advantage of the info YOU uploaded. Your face is PUBLIC when you put it in PUBLIC space. Id hate it if it happened to me so i didn't put my info or face on the fucking internet.
@@Lewisrobbie02 At the very least, it's apparently "convincing". When years ago it was "obvious", the problem is how it'd look a decade from now. Better act now before it gets to that point
I’m not even popular on the internet (at all), and someone did this shit to me just by gathering photos by following my instagram. They tried sending the fake photos or videos (I never opened them obviously) to all kinds of people I knew, including my own mother, just to blackmail me into sending them money. Luckily, I managed to spread awareness to everyone to simply delete the chat if this person messaged them. From what I know, no one saw they sent. At least no one currently in my life, thank god. This shit is a serious problem.
charlie hit the nail right in the ass. The biggest issue is not just the fact that its a hug violation, but it also has horrible implications to be used as blackmail and harassment. Not to mention the potential for deep faked images of children/underage streamers. It is absolutely sickening.
@oh no cringe detector In case you are being manually this pathetic or you read responses to your pathetic bot, stop. You're not going to achieve anything by doing this.
You can use software to prove that a video is deepfake with 100% accuracy, you'd have to be a idiot to get blackmailed with the video equivalent of photoshop
That's always been a thing though, this is just an evolution of photoshop and previous visualisation methods. Before the camera people were saying the same thing about using sketch artists for Police work, "what if they could draw it wrong and convict the wrong person" Make it illegal to do things that will cause international incidents like deepfaking the President or CEO of companies, the same as it is to fraud or slander. But to make it illegal to whack off to someone without their consent....? Naive and entirely unenforceable and not at all MORAL anyway, nobody can tell you what is right or wrong to think about, a hundred years ago it was "MORALLY WRONG" according to almost everyone for people to be homosexual.
It's only a matter of time until some teenager creates a revenge porn deepfake of someone they don't like at school. In that situation, a person's life will probably drastically change forever, and it's hard to say if the truth would ever come out. It would also create a whole new weird legal problem. Would a teenager deepfaking their classmate into porn be classified as a sex crime against a minor? I lean towards yes, but I also have a feeling that it will take almost a decade for widespread legal precedent to be established.
Dang I never thought of it like that, like I thought most comments were blowing this out of proportion. But your comment really opened my eyes honestly thank you
Something like that actually happened to me. I managed to stop the spread of the fake content at least before the people currently in my life could see them.
@@Cruz474 Photoshops are still vile and those should be banned too. Stop thinking from the perspective of someone tugging their meat, and start thinking from the perspective of the victims
@@hithisthers7214 I'm thinking from an omniscient perspective. I was just looking to see for your consistency. I see it, so I have no quarrel with you. Respect.
I was so naive when deep fakes first dropped. I was just thinking, "oh man, now I can make myself look like a superhero or a movie star or something". Sucks that this is the reality of the situation.
Oh, very innocent. Teens (and the cia too tbh) will use it to make fake sex tapes to blackmail others, criminals will use it to frame others, courtrooms are already using deepfake experts.
@cats one of the industries that most rapidly attempts to adopt content related technology is the pornography industry, so its not really that hard to predict if you've given it some thought
i thought the same thing. but that’s sadly the very surface and optimistic way of using deepfake. we all know it’s going to be used for much more devious and evil intentions
Meh, e-celeb fanfiction has existed for years. It's exactly the same thing - fantasizing about having a sexual parasocial relationship with your favorite celebrity. People who are into that are definitely creepy weirdos, but complaining that it's "unethical", especially when some of these e-celebs are encouraging parasocial relationships in the first place, is pretty hypocritical. There's of course the whole other angle of passing it off as real, but there are fake news about e-celebs all the time. This is something you cannot police. Best you can do is take responsibility, not encourage parasocial relationship with your fans (especially sexual ones like OnlyFans), and make sure you're not associated with it. Nobody is looking up DeepFakes of Michael Stevens from Vsauce.
IT will ruin someone getting caught but definitely will not ruin the one being deepfaked, unless they chose to be ruined. (because people would just comfort and give sympathy to the indicated victims) this is digital harassment in the making.
It’s easy af to prove a video is deepfaked, no matter how realistic it looks. There is software that can determine deepfakes based on a number of factors and it’s 100% accurate
This stuff pisses me off so much. The responses to it like “they posted pics of their face online they knew this could happen” or “so now it’s illegal to find someone attractive or fantasize about them” or genuinely one of the worst “all women should be ok with this because only fans exists”. It’s genuinely infuriating to see response like this
Lawmakers really need to completely ban this. Imagine if they deepfake a persons face who is under age and then put it on the body of someone of age, and say it’s “ok” because the persons body is of age. Very horrific slippery slope here. Disgusting!
Human brains aren't adapted to seeing ourselves doing sexual things that we don't remember doing. In this case it's because we didn't actually do it, but the point stands. It's deeply psychologically disturbing. Dr. K on HealthyGamerGG did a video on this situation as well, it's really insightful.
@@AkiRa22084 We're not adapted to that either, but it's one frame and we can, over time, convince ourselves that it's fake and we didn't do that. I'm not saying it's no big deal - editing anyone into sexual stuff that they didn't consent to is wrong. But video is a new level that we're not adapted to as a culture, let alone as a species. Our culture still considers video to be a standard of evidence. Even leaving alone that seeing yourself do something you'd never do is damaging to your mental health, it's still damaging to professional reputation and personal relationships, not to mention potential criminal investigations. The CIA & FBI have already used slander and framing to take down people they decided to, people that we now consider to be on the right side of history. I have no doubts they'll do it again and worse.
Also my friend just brought up fake revenge porn, which leads to "evidence" of cheating to break up relationships, or maybe non-sexual content that becomes "evidence" that you should be written out of your parents' will, for example.
this has already happened with children and the government has done NOTHING. if any good can come out of this situation, i think it’s that people are gaining awareness for just how scary the ease of these actions are and can actually lead to some change to prevent them
Problem is.. how do you outlaw it? This tech is open source so its too late to ban. It’s already illegal to use the tech on children, in theory, but it’s kind of impossible to prevent.
To me, the biggest thing that disgusts me that nobody is talking about is how he brought this deep family struggle and streamed it for everyone. These streamers are sick in the head for bringing so much personal shit on stream. Deepfakes are stupid and weird, but bringing your wife on stream to cry in a very awkward way is weird as well.
that's what i also dont get. before looking it up i thought she caught him doing it, and forced him to come out, but nah, it was his chat, so why is she there? and also why is this so public. if he need to excuse, do it to the offended party. not an open video that is only going to make it worse for everyone involved.
@@marcosdheleno He said he didnt sleep at all the entire night it happened. So not only was he on lack of any sleep, this is his entire career that he has been building up for years and he probably saw it flashing before his eyes that everything could be gone so he felt he had to do something. Hindsight is 20/20, not everyone can make perfect decisions all the time. Its easy to see why what he did was wrong after it happened.
Dude wanted a fast fap and found no good material so he created some himself. Is that really that bad? I hope he releases the full material, I am getting bored
I feel like it was done on purpose so more people would try to do this and get into it. People that had no idea deep fakes existed. Now that it is public knowledge, watch everything disturbing that happens (political figure gaffes etc) now can just be excused as deepfakes
even if the person faked does make there own sexual content, there own business gets undermined by bootleggers basically, and even if someone consents to be photographed nude in there own content, that doesnt mean they consent to being deepfaked. thanks for talking about this charlie, ur a stand up guy
@frogmen it literally is illegal to sell fake jordans. That's counterfeit product Edit: also glad to know that women are yet another product. Maybe try to see women as human being and not sexual products to produce "cheap knockoffs" of
@@quixoticvalkyrie Lol no selling fake jordans is only illegal if you try to sell them as though they are real, knowing that theyre fake. If these deep fakes are being advertised as fake, then thats it. And drop the woman are objects nonsense, same applies to mens deepfakes. As charlie has stated this has also happened to him.
Yup. I'm sure laws will be made here federally soon because as of now, not many can do anything at all except get them removed. In the internet though... Things always make its way back
I feel kinda sad to feel the need to say this, but charlie, thank you for using your platform to show people how to be a decent person. You're a good man. Truly. I know he is your friend, but I appreciate you making sure this problem is brought to people's attention. I dont think a lot of people would do what you do, so just, thank you
The comments section here is a breath of relief compared to the replies I’ve seen on twitter about this. Lots of porn addicted men saying “well it’s fake get over it” or “being famous comes with these type of consequences”. Lots of trolls just typing “lol” on one girl’s video where she’s talks about being a victim and is clearly upset about this whole thing. These lowlifes have no empathy. It’s a symptom of being terminally online and addicted to porn.
The AI art situation had a lot of these apathetic tech bros call artists "aspiring McDonalds workers", saying art wasn't a real job and that people should lose their jobs because that's what happened with tech advances in the past. You'd think living in an advanced society would have people try to work things like that out instead of letting AI run rampant like this, but here we are.
@@Mrhellslayerz There will always be work for real artists, AI technology is a boon for businesses, instead of hiring several hundred workers you can instead hire a dozen and have them work alongside the AI. Even deep fakes have their uses in business. I would love to see a young William Shatner have a surprise appearance in a Star Trek TV show or movie. Deep fake porn needs to be illegal unless you get consent from the person who's face you are using and the sex worker you aren't giving credit too.
Sorta off topic but Charlie seems like one of those friends who would tell you your screwing up before it’s too late. It’s honestly refreshing to see someone be so blunt and honest about something when it’s about a friend. I’m sure it’s a very difficult situation but I think that it’s pretty inspiring to see that how he’s willing to stand by his morals and say it’s wrong despite having reservations, not wanting to add negativity to the situation and caring about the other person.
I also think that not only just not having consented to being deep-faked (which is still extremely horrible), someone that these women thought was their friend paid for that content and imagined about having s3x with them. Personally, that would make me extremely uncomfortable.
The deep fake thing is definitely weird and creepy but the idea Atrioc/basically any straight male hasn’t fantasized at least a little about basically every vaguely attractive woman in their life does seem to show a lack of understand of how male sexuality actually works. They just keep it to themselves. It’s the actions based on the thoughts that are the issue not really the thoughts themselves.
@@soccrplayr232 thinking about it is one thing, humans are sexual creatures, but acting on it is so weird. this is a different kind of creepy than just fantasizing about someone you think is attractive. especially bc it was people he knew personally??? nah that’s demented. this is a really good way to gauge someone’s morality and i’m genuinely surprised so many people are stupid enough to think it’s okay. i’m like genuinely shocked this even happened 💀
Wait a minute... You think your male friends don't fantasize about doing unspeakable things to you? That's a staple fantasy right next to planning hiw to fight different animals.
@@juno3281 I mean honestly I wouldn't do it and its a little messed up but not sure it was all that big of a deal. The dude selling the stuff is the actual issue not so much Atrioc looking at it and the reaction against him feels a bit exaggerated since he's the only real face to put to it and people are angry it could happen. Also not sure it was so clear he was there looking up people he knew even if they were on the site, I could easily buy he was going to do a segment about AI/deep fakes soon and went down a weird rabbit hole and got caught with it on his computer because he never closes tabs. Still weird just not sure he needs to quit streaming over it.
There’s going to be a lot of “deep fake“ related issues that come from this. Whether it comes from streamers influencers business owners and even politicians. This whole scenario opens up a lot of “plausible deniability” for people that partake in these predatory activities. That is why I am most upset.
I like to see where this takes... since humanity couldnt help their balls to always make advance tech for no reason, i know it is always been wrong but i like to see how far extent human beings will learn the hardway
You’re late deep facts have been around for a while and many websites have been shutdown , Phub was forced to remove every deep fake video back in like 2015-2016
what I'm concerned most about is this being used as a way to bully people. Primerily kids in schools, this can ruin their lives and follow them forever.
As someone who grew up on the internet this is 100% going to be misused with teens. There is a reason why so many young teens used snapchat and it's not innocent at all.
You can't deepfake a person unless you feed the AI tens or hundreds of thousands of photos of someone's face, hence why it's only streamers/celebrities who are being targeted. If you're not a high profile public figure then it's currently impossible to deepfake with current technology. AI can't just magically know what your face looks like from every angle. Maybe in the future that will change but for now I'd say school kids are safe.
This has already happened, a MOM deepfaked videos some girls that her daughter had a problem with, of them smoking, drinking, and having sex, in an attempt to get them booted off the cheerleading squad.
@@seekittycat but wouldn't the same young teens know that it was fake, so they wouldn't fall for it, and instead look at whoever posted it sideways? But maybe the damage of association to that image is great enough to leave a permanent scar in their minds. But haven't kids done that since forever? Especially female bullies, since they resort to other things than violence, like reputation destruction? I seem to remember the one girl that committed suicide but made a video where she flipped through cards before she did it? I forgot the name, but it was a while ago. I remember finebros reaction channel did a video on it.
I don't think you know how consent works in conjunction with being an online celebrity. Porn stars give their consent to have their bodies shown online in any form as they have sold their body. Streamers consent to having their face be used outside their stream indirectly through free media practices.
The "apology video" was the worst idea ever: 1.) More attention to the AI thing 2.) More stuff for drama channels 3.) More attention to the AI thing due to drama channels 4.) Bringing your wife into for 0 reason 5.) Literally will never go away due to the 15 min video
well, the alternative is to say nothing and then people call him out and draw attention because he won't apologize and well... the apology or lack thereof didn't matter much, it was gonna blow up.
@Zanaki a written apology and let it blow up for a while (which it will do no matter what). This just escelated it way more and made it way more drama and clip friendly. The only two sides that gained something from the video are drama channels and AI stuff producers.
Before I say what I’m about to say, I don’t condone Atriocs actions whatsoever. Just friendly discourse of the subject :) I think points 1-3 are the same point; I don’t think the apology stream or whatever it was he did would bring much more outside attention than the original incident. With more coverage from more channels,and news sites, and drama sites and so on, any attention created by those outlets are ultimately out of his control. Bringing his wife on beside him could definitely just be a comfort thing, but I agree it’s out of place and the guy is basically just a business oriented psychologist so I wouldn’t be surprised if it served a different purpose. Yeah, none of the fucked up deepfakes will ever stop unfortunately because there will always be piece of shit losers who could care less about what other people feel. I did hear the original website was taken down, but things like this never really go away in an environment where anyone can do anything almost entirely anonymously. I used to watch Atrioc from time to time to fill silence so it’s possible that I could have some subconscious bias, but still it’s deeply saddening to see that he would use his friends like that in such an abhorrent way. I learned it’s just important to remember that UA-camrs and Streamers will only ever show you the part of themselves that they want you to see. Oh my god I wrote a fucking essay again sorry about that; anyway that’s all
There are also now advanced deepfaked voices, I've heard them and they sound very very real. Not choppy, not stuttering. All you need is a snippet of someone speaking with no music behind it. This is getting INSANE. I guess a "positive" side is it can only do american accents well.
i’m not sure, the way technology is progressing it is getting harder to control and catch up to. i guess one way is to have these adult sites be monitored better. maybe every video should have written consent with footage before being able to be uploaded and confirmed.
Nope lol, deepfakes will keep on being created which is the sad reality. Mostly to appeal for the creeps that wish they saw someone in specific without clothes
dont share info on the internet lol its your fault when people take advantage of the info YOU uploaded. Your face is PUBLIC when you put it in PUBLIC space. Id hate it if it happened to me so i didn't put my info or face on the fucking internet.
Our government could get off their ass and pass laws making it illegal to make or distribute falsified porn using someone’s likeness without their consent. It would be an ongoing battle, just like it is to have revenge porn or CP removed from sites
Damn, felt it when he had to call someone he used to be friends with a porn addicted loser. He almost held back a little bit but he came in strong and fought his own urge to sugar coat someone he wishes didnt make such a horrible mistake.
It started with faking porn of celebrities. The next big thing under that is content creators who are internet celebrities. Kinda makes sense that this was going to happen. It was just a question of "when will it happen". Also before AI generated, it was people who looked similar but not exactly identical to famous people who were doing porn and branding themselves as "fake X famous person" but the titles were always "X famous person doing Y" So again. It was just a matter of time. Doesn't mean it is ok though.
The really terrifying thing is that it probably won't be long until this kind of tech is easily accessible, like chat GPT. Its gonna become a huge problem when some creepy fuck can just use someone's instagram to generate this kind of content...
@@RubiixCat It's going to be easily accessible, almost impossibly un-enforceable at a large-scale, and going to get more finer tuned and powerful as learning sets and designs improve. Not a question of if, which is scary.
and then after that, it's gonna be normal people, and people are gonna use it not just for porn and shit like that but also to fake incriminating evidence in court that could ruin people's lives
I was wondering when Charlie would cover this topic, and I’m very glad that he did so in the way he did. He was respectful and did his best to make everyone aware of what was going on without making a huge deal out of it in a way that would draw even more attention to it.
The interesting plot point of this whole story is that not only was the ad not malware but was a legitimate site with a paywall and then the traffic exploded on the site to the point it was taken down
I can't imagine going through something as serious as having intimate images of myself spread on the internet, fake or not. Stunts like this harms people. Real people! It's unacceptable!
@@krotchlickmeugh627 As Charlie mentioned, deepfake technology is apparently so advanced that you wouldn't know if it's real or not. Anyone could make deepfake porn of you, me, our grandparents, or anyone we could think of
I like Charlie because he is chill. He sees porn of himself he didnt know existed and laughs, but immediately understands how dangerous it is. Dude can laugh and be serious at the same time.
Really appreciate your language use here because it feels intentional and important; referring to the women he looked at as "colleagues" and "coworkers". I think something that seems a bit lost in this discussion due to it being on Twitch is that this is a man who fully looked at porn of the people he works with. In any other "real world" scenario, if your coworker got caught with porn of you and your colleagues (regardless of the gender) on his work computer there would be an immediate understanding about how absolutely disgusting and straight up creepy that is, and how rightfully so in that situation, that colleague would most likely lose their job because of sexual harassment. I hope the women who are victims of this really have more support not just from their user base, but from Twitch directly, because this is just unacceptable behavior that's eerily being described as "normative".
@@lioreubm677 Yes and no. (Basically) all men have a sex drive wired like this, not all men act on it. Many men are good people and dont indulge in their darker side like this
@@eligedzelman5127 “darker side”? The way you’re describing men makes it sound like all men have dark, horrible, twisted urges, but the ones who don’t act on those are good. That’s not how people work, and it’s not okay to generalize men in that way. Some people do have bad thoughts and bad urges, while some don’t. And yes, it is scientifically proven that males have more libido than females, but that doesn’t make them inherently creepy. It’s okay to have sexual urges; most people do.
Thank you for explaining the relationship between none sexual content creators and how hard they’ve tried to stray away from that. I feel like a lot of people are just blind to this and confused on why people like pokimane cheers her only fans friends on yet she’s upset about this. Their takes are sooo wrong and I hope they listen to you and understand this. You explained it well.
I hate the fact that they're people in this world who willingly do this to other people. Deepfakes can be funny for a meme but when that technology gets into the wrong hands it has devastating consequences. Thank you for speaking on this
@@Unknown_Genius I get what you're saying. Memes are usually lighthearted like deepfakes of Nicholas Cage I've seen. Porn is a whole other level, it can be detrimental to the victim's wellbeing, safety and public perception of them.
@@iino07 Deepfaking someone in a meme-y context is still taking their image and using it in some way they don’t necessarily consent to. And memes can also mess with someone’s life: the person in the Bad Luck Brian meme is a good example. Its still exploitative, and should be morally wrong: just because the intent is less sinister doesn’t mean crossing the boundary should be ok. This whole situation is upsetting but is to some extent ubiquitous. People have been being exploited since people started pulling their phones out every time someone or something slightly catches their attention. There’s already very little consideration given to how the subjects feel or what they consent to, so it’s unsurprising that something like this whole situation is deemed ok by a shockingly large amount of people.
@@Neo2266. Someone could use that blackmail you, fuck with your personal relationships, cast doubt on your credibility. More reasons besides. It’s like someone having a you-seeking missile. Even if they never use it, it’s disconcerting that they have the ability to hurt you any moment they choose.
This is really upsetting. I’d feel so vulnerable. You can’t do anything to stop it. I’d hate to have their family to see that or be sent to them. Really sick stuff.
if someone made deepfake pron of me and sent it to my family, I think that would be hilarious and I doubt my family would care once they knew what deepfakes were.... stop being a victim and caring so much lmfao
theres ai deepfake of voices too, you can legit make someone say anything you want politics and literally everything's about to go more downhill than it ever has
@@captainmycaptain8334 reminds me of the movie Tango & Cash, where a cop (Stallone & Russell) got deepfake voice recording that caused them to go to prison
@@crypticslayer69 the issue isnt any of us being gullible its other people that dont know its fake, you can say its fake but the tech is so good it wouldnt convince everyone, look at the state of the media now where obviously doctored/staged shit ends up making the news and dividing people, can you imagine stuff that isnt so obvious. Not to mention the population of gen x and boomers who believe everything they fucking see on fb and twitter. trying to convince them its fake would be like trying to convince a fish to crawl out the water
You turning this video from the "Fault of Atrioc" to "The Bigger Issue at Hand" is a very smart way of tackling this. Hopefully something can get done about this, whether it be more regulation or something.
I've been aware of this for a long while now, but only seen it with big name celebrities. I genuinely think this needs to be classified as revenge porn.
Nah revenge porn is something different, people don't make deepfakes to take revenge on people. It's mostly people in relationship that break up and then the one of the person takes revenge on them by leaking their nudes
@@johnrenwelmauro2387 I don't get it either. I'm a single adult man and use incognito on my password protected phone religiously if I'm looking at porn, for no other reason than I just don't want weird adds/search suggestions, etc. These guys career are on the net, the stakes are infinitely higher for them. Fucking lunacy
I don’t know why I’m so paranoid but I got another app that is protected by Face ID and cannot be opened with my password if someone somehow guessed It
This genuinely scares the shit out of me where it’ll be in another 10 years. The Snapchat crying filter is already wild enough. Once this becomes widespread and people are able to use it maliciously. The future is really quite scary
The ramifications of this are gut-wrenching. Just one tech-savvy person can ruin someone's entire life over something they didn't do. Even Inside Edition did a report on something like this. A student had a video deepfaked of her vaping and the video was sent all over including to her parents, just because the student was a cheerleader and some other cheerleader didn't like her. It's unfathomable, to have one's life ruined over something they never even thought of doing, it could be anything, porn, a crime, anything compromising or embarrassing really. I just can't put this into words.
@@tedhaidydei8261 this is something a random person will be able to do tho and destroy someone’s life. It’s scary man. And it’s only gonna get more advanced
@@SagaciousTv Ok. Any random person can kill anyone as well. What im trying to say is that things like these are preventable. We can prevent them by making them illegal and severely punishing people who commit them . Just like murder, assault and pretty much anything else.
Deep fakes in general are a very scary concept because they have gotten so realistic. Even getting away from the sexual stuff, deep fakes can still ruin someone’s mental and reputation with how real they have gotten. What is extra scary is how good some private tts companies have gotten at copying peoples voices
Especially if it's That vegan teacher's latest shorts.
waltah🫵
And as another bad side effect, it gives possible deniability to REAL evidence of people doing bad things. It's all-around awful.
at least watch the video first
Watch the video first weirdo
I discovered I was deepfaked from this whole situation, and honestly even though I'm hurt I also feel for the sex workers who had their work stolen too. Both me and those SW who they put my face onto were sexually exploited and sold without our consent by someone we don't even know. Thanks for covering the situation, hopefully more people with be careful to make sure all of the sexual material they use was consensual.
So sorry that had to happen to you
dang, honestly deep faking is getting way to real, its going too far
Unga bunga unga bunga unga bunga
@@yabo5131 excuse me? so you’re saying that women should be made into deep fake porn without their consent and they’re not allowed to complain about it because they are “actual” sa vicitms? yeah dude both these people can exist. if your idea of “consent” only relates to physical sexual assault that’s your problem
@@yabo5131 she never said anything about SA. Someone selling your image, without your consent, for sexual purposes is sexual exploitation not SA
I was very innocent in high school. I never taken a nude or even made out with anyone. Never sexted. Yet, there were multiple people that were convinced I sent someone a nude. Turns out, it was some other girl that looked similar to me. The guy lied about who sent it. I have no idea why. It is an awful feeling to be judged for sending out sexual content when you never have. It is such a violation. I couldn't imagine being in the situation as the streamers. What they are going through is in a much larger scale. Truly horrific.
that is so horrible oh my god. i am so sorry you had to deal with that
Nah lol them girl streamers are fine they'll just break ties with dude an go on about there life in my opinion
@@whitemamba0089 "in my opinion" girl ur opinion doesnt impact anything no offence
@@9t4nek Did you just call him a girl? Shame on you
Sure Courtney. Your parents might believe you but I don't.
this shouldn't be a 'ugh this is so weird' situation
publishing or selling ai deepfake porn should be illegal, a legitimate punishable offense
How about making all porn illegal
It shouldn't be illegal to publish it as long as it is very clearly marked as deepfaked and not real. If everyone knows its just an edited image then it's no different than creepy fanart, it's when everyone's convinced it's real that it's a problem.
@@Raptorworld22 No, it's currently classified as revenge porn in the areas it's currently banned, and I think this is a fair classification. There's a few reasons this is the case:
1.) Well-made deepfakes are indistinguishable from real photographs, and this is being further solidified by Dreambooth technology. It's worth noting that CP "art" gets charged as actual possession if it's modeled after real children or is too realistic.
2.) "If everyone knows it's just an edited image" requires every single point on the distribution pipeline to clearly mark the image/video as deepfake. This origin can wind up obscured or lost, either accidentally or intentionally, resulting in people believing it's real.
3.) The intent of deepfake porn is to obtain sexual content of someone who hasn't produced that content. You are bypassing their willingness to take naked photos or record videos of them having sex
4.) This is a thing men don't really understand, but most women are straight up HORRIFIED by the prospect of being seen nude by people they aren't intimate with. I for one had an incident where an ex landlord put a camera in my bathroom, and 6 years later I still have severe anxiety and trust issues from it.
5.) A common victim-blaming tactic men have towards women who have their nudes leaked is "You shouldn't have taken them." With deepfake porn, a person's choice to have nude photos exist of them is completely removed.
6.) Women have to deal with constant sexism and sexual harassment in their lives. Our ability to choose who can see us at our most intimate is one of the ONLY things we can control in that regard, and it's being taken from us.
@@cargyllion 1) Didn't know that, but if this stuff really is so accurate to real life now, then I am changing my stance and agree it should be illegal.
2) I never said my idea was practical, but in a perfect world that is what would be done.
3) This point doesn't make sense to me, as it's no different than creepy fan art in that case. Do you also think we should make drawing people without consent in explicit context illegal? Actually curious what your stance is there.
4) I don't know what men you're talking to, but every man I know would be equally horrified in this context, where everyone knows/thinks it's their body and it's without their consent.
5) Isn't applicable since my original point was if *everyone knows it's fake*
6) Having been a woman for more than half of my life, I know you are talking complete BS in that first half, unless you live in Saudi Arabia.
Overall, I am changing my mind and agree that this very convincing deepfake stuff should be illegal. Different opinion on fanart and terrible photoshop jobs.
@@Raptorworld22 The current situation with Dreambooth deepfaking is that anyone with an Nvidia gpu made in the past 6 years and 20 or so images of a target individual can generate a model capable of replicating that person's appearance. It can generate you nude, in an SS uniform, wearing a Jace Beleren cosplay, whatever you want as long as you fed it reference images. And it's scary accurate.
Literally the only saving grace right now is the current training models can't parse complicated tattoos and facial piercings. However, there's a large community of people working to fix these issues for the expressed purpose of generating AI porn.
Charlie is slowly transitioning into a being of pure black shirt energy
@Red Star 🅥 🤓
it’s like watching rick grimes go from officer friendly to season 5 rick
It's his villan arc
Cap also fake check mark
Almost as good as black air force energy
someone made a deepfake of my cousin committing sexual acts on a minor and it almost completely ruined his life, it was sent from his facebook to all of his friends it was awful
That is super fucked up, I hope that he's since doing better.
That's crazy and messed up.
jesus…
That’s so fucked. I’m curious how defenders of this AI pron would rationalise this.
Who did that
When moist barely uses any metaphors to explain the situation, you know he's completely serious.
@Auracle this is not just Photoshop for videos dude
@Auracle would you like to be put in a porn vid without your consent?
You’re acting like photoshop can’t be used to destroy people’s lives either.
@Satchelhandle fr
@Auracle I think you didn’t watch the video did you?
Aside from the tremendously creepy aspect, it's good that you brought up how deepfake porn could potentially be damaging to the careers of the people being deepfaked. Think about it; the wrong people see something like that, like advertisers or investors, they might get fooled by it and decide to cut business ties with the people getting deepfaked. In short, just imagine your favorite content creators getting blacklisted due to sexual content that they never even produced; that's horrible.
This is a good concern, but I'm actually annoyed that companies even care so much about people having sexual content. As long as it's somewhere children won't accidentally stumble across it, there really shouldn't be a problem.
We just need to make discrimination on a professional level completely illegal
@@impishlyit9780 100% agree with ya. Just because someone's made or performed in pornographic content/media doesn't mean they should be blacklisted.
It's a paycheck. Nothing more, nothing less.
So long as all the people viewing the adult content are of-age themselves, everything's fine.
Man, I got sidetracked on this one xD, but it's an important topic too.
Even if they aren't tricked by it personally a brand could probably be convinced to drop someone just because it exists at all especially if it gets popular enough. Real or not a brand that's promoting say kids content or trying to have a "family friendly" image may not want to be associated with that. I don't understand how it's legal anywhere given these companies should have no right to monetize your likeness anyway. Really gross situation and technology keeps advancing faster than we can plan for the ramifications of it.
@@josephschultz3301 agreed. Here where i live, a female teacher broke up with her husband and he, as revenge, shared online some porn videos they made together (that were private, they never posted them anywhere before, it was private videos for themselves and pics he took of her naked). Turns out she got fired from working as a teacher and blacklisted from every teaching career in our country, while he only got like 5 months of picking up trash as community service for doing something illegal. AFter those 5 months he can get back with his life (of course HE isnt getting fired nor blacklisted from any working job even if he was in those videos too, only women gotta be slandered for sexual things) meanwhile her life is ruined. Some idiots still insist on defending him and blaming her. Our country had a lot of protests about this.
What scares me the most is the potential for cyber bullying and the fact that I can predict with great certainty, that this will be used by school students to bully other kids
I give it 3 years
Damn.........................
Only for this generation, the next generation of kids will grow up not trusting video / audio. Out of all the dangers AI will bring, this is one of the things you should be least worried about.
It could also be used to blackmail or discredit people, disrupt criminal investigations, create fake alibis...
Elon was right about AI taking over monkaS
I can only hope deepfake detection evolves as rapidly as deepfakes are
Proper anti-cheat in games is easily bypassable, taking that as an example, deepfake detection has no chance.
Schaffrillas when he watches The Owl House instead of paying attention to the road. 🚗💥
The technology used to make deepfakes is what’s used to detect it as well so the advancement of it should stay 1:1
This been around for decades bruh.. back in the 00's it was photoshopped celebs. They've always found ways around it & unless actual federal laws are made, this ridiculous stuff won't stop
@Mikonson your shoe size is 6, taking that example, you have very small package
We did it, guys. We have lived to see man-made horrors beyond our comprehension.
I comprehend them.
Black Mirror is mandatory as far as I'm concerned. Sleep walking into an AI dystopia
I think there's been a few things in history that were a little more horrible
I shall study these horrors so I may one day comprehend them
It's just photoshop but automatic. Ya'll freak out way too easy.
the biggest shock for me with this whole situation is that this wasn’t already illegal. there’s already laws and defamation lawsuits surrounding using someone’s likeness in anything without their consent, but for this kind of material is sickening.
Exactly! And then you have guys like d*stiny frying their brains out to try to understand why deepfakes are bad 🤦🏼♀️ when is just another form of defamation!
It is sickening but does that also mean it's illegal to edit someone's face into a meme without their consent?
@@llama6394 it should be, lol. There have been cases in which celebrities sue companies because they use their image without their consent and/or against their will 🤷🏼♀️ but idk, the internet has normalized trasspassing personal boundaries to an extent where people are just mere objects...
@@llama6394 current law, sort of? Best example is with the old man “hide the pain Harold” or whatever that meme’s name is. His image, especially for those photos, are under copyright like any other stock image. So we do know that memes can be under copyright and can’t be used for profit or undermine other’s work. There’s also celebrities suing games, animated shows, etc. for using their likeness without their knowledge.
However those are higher profiles figures, and common streamers would have a harder time fighting that in court. There’s also AI involved in deepfakes, which we still have no laws governing, and other issues of giving streamers protection from these websites.
Honestly, one of the better things to combat it might be a libel/defamation lawsuit, since this basically is faking a porn tape for someone. And that’s outright stated under defamation as an injury of character and easily would at least have a trial for it.
It’s because the laws have to consistently adapt to technology so they will always be behind. We unfortunately won’t be able to experience a society we’re the laws will be ahead.
Charlie with a black shirt really has the energy of a dad explaining why something is wrong.
Its not good sign.. idk what happened.. maybe sold his soul? Replaced by clone?
Schaffrillas when he watches The Owl House instead of paying attention to the road. 🚗💥
It’s basically like Spider-Man with the symbiote, too powerful of a force to be reckon with.
@@furryprideworldwide816 L
@@furryprideworldwide816 Loser!
The worst part is that the people being deep faked were personal friends. That is a level of down bad I never wish to witness, let alone experience.
She baked his wedding cake 😭
absolutely down horrendous and creepy bro
Schaffrillas when he watches The Owl House instead of paying attention to the road. 🚗💥
And he’s married
yall never had a hot friend that u wanted to sleep with but couldnt? idk why everyone is freaking out over this. everyone has their own private fantasies or curiosities. ive looked at some really really crazy stuff online just because. its not like atrioc was personally creeping over them, it was a solo curiosity that he accidentally showed.
The depressing truth to all of this AI deepfaking is that there’s creeps on the internet that can easily make porn of popular minor celebrities and that alone sends chills down my spine
You and I both know that this is very much a real thing already. What’s worse is that I doubt that there will ever be an end to it
@@Lunar_Capital it’s an unfortunate realty we live in, that’s what makes this scarier
Apparently there’s already been AI CP being made and it’s absolutely disgusting
@@suburban5
Just hoping lawmakers see this more as a real problem.
If this really scares you that much, I hope you never discover just how messed up the world really is.
Even if he found the website randomly which is very unlikely, the fact that he didn't notify his fellow coworkers and kept it to himself is pretty stupid of an excuse.
Let’s be real, It 100% wasn’t an ad at all.
@@bonoboboy821 ofc, I mean that's way too obvious anyway
@knockout Well in order for him to know it was a deepfake site, it would have to be advertised as such, and presumably it would also advertise being able to fake his coworkers. Also I'm pretty sure the hub doesn't allow deepfake content so it makes no sense they would allow those kinds of ads.
@@mezdemundi7115 why wouldnt pornhub allow it? they have ads for brazzers which is a competitor.
To his credit, that’s a pretty awkward convo to have. “Yo bro, just letting u know I found this online service where it deepfaked you onto porn.” “How did you find it” “yeah, I was down bad and clicked on some ads, paid for some website, and beat my meat to it. Sorry for not telling you sooner.” “Bro wtf” that’s only one possibility of course, but ya see where I’m going with this? After the action is done, kinda hard to bring it up casually.
I remember less than 10 years ago, local stories about how a teacher lost her entire teaching career because when she was younger (and she was still young when they came out), she did adult content that some of her students eventually somehow stumbled upon online.
That was from someone doing adult content for real.
Now imagine if highschool or college students have the power to just deepfake a few convincing pictures of their teachers in adult content.
They could ruin their career and reputation because even if it's proven fake, school establishment don't like to be anywhere near that kind of stuff and the content will probably remain available somewhere.
Yeah it really feels like opening Pandora’s box, there’s no telling how far it’ll go
Why does deep fake even exist to begin with?
@@experimentalghoul3540 So you can elect puppets into office.
@@experimentalghoul3540 I’m not sure what you mean. Like anything else it exists because there’s a market for it people will pay for a service that lets them see their favorite celeb in adult scenes.
Regulations are being put in place in 2023.
Remember back in high school when rumors would spread about people? This is the ungodly, coked-out, and roided-up evolution of that concept. This is fucking terrifying beyond comprehension.
And considering how even adults still act like high school kids despite being in professional positions...
I am in high school rn I am a freshman and I never hear rumors about people in my school
@@luckyoshi you're just lucky. If you genuinely believe high school isn't drama ridden you're naive as fuck
i like to see more of it...
Yup. The ramifications from deep fakes are large. Between mental and financial, you got legal issues. Think about how someone can be deep faked to cover up a serious crime
The ethics of AI technology and deep faking are very concerning, and not just this particular situation. Not being able to distinguish reality versus fake reality is a slippery and dangerous slope.
Everything on the internet is fake.
Oh stop living in the past. /s
you know as a woman i’m so glad to see such a prominent male figure in the community speaking out against this. i’ve seen a lot of women being disregarded and belittled for saying the exact same thing and as shitty as it is, it’s nice to have someone to back us up.
Agreed. Pretty friggin' sad how many men are downplaying this.
Agreed, charlie and this community are what often restore my faith in humanity to be honest. Sometimes I'm really scared if no one in this world actually have decent clarity and conscience for everything that I've seen in my life. And it is MEANS A LOT for me to know there are peoples like charlie and his followers out there with sane minds and willness to speak up for those who need it. Charlie if you see this, you are impactful and honorable. Thank you 🙏
Yes 100000%
True
right
extremely well said. it’s devastating how desensitized our society has become to advancements like this.
Well said yourself.
The societies in the west are pretty much fucked, no surprise there
Kind of cool I'm glad I know of it now 😃
wow
I mean when every girl i see often does sexual stuff, it got me really desensitized... i know it feels bad but when it's like pokimane and all, i cant even flinch
HealthyGamer_GG did a deep dive on this today that goes into some science about how nonconsensual porn affects not just the victims, but the consumers, and one of the chilling parts of it was that the deepfake stuff is behind paywalls because it's more addicting, because they're attached to an existing powerful parasocial relationship. It was a powerful take and I hope everyone sees it.
Definitely will dig into that, Dr. K has amazing, insightful, compassionate content. Thanks!
Thanks for the rec
Isn’t all porn non consensual? The people participating in the act consented but there’s no way that they consent with everyone who’s watching, and as a passive observer i doubt that anyone is mentally checking off if it’s consensual or not anyway. As far as the individual is concerned they are partaking without participating and that is often a part of the harm done by porn.
@@sparks6177 I mean they’re essentially consenting to all consuming by consenting to have to video publicly displayed.
@@sparks6177 people who make porn willingly and post it publicly are also consenting to having pretty much anyone view it. It’s comparable to UA-camrs and actors where they sign on to have anyone who can watch their content, it’s part of the deal. Steamers do not consent to having porn made of them because, well, it’s pretty obvious why. They are not at ALL the same.
it’s sad that it has to be pointed out that this type of technology can be also used to frame people for sexual assault or other heinous crimes for people to actually see an issue with AI being used in such a way.
I can see governments and corrupt law enforcement abusing this
@@jess648 very true! and that’s terrifying to think about
@@alyssaaaaa04 yeah…
Fun fact this won't happen since the ai that creates the deep fake is also able to detect deep fake as it trains two ai one for creating fake ai and one for detecting fake ai to improve it. So no it's not going to be possible that it would be used for blackmail
i think the crazy shit no one else or charlie touched on, is the idea that they can use deepfakes of minors and young celebrities or actors. theres probably already some floating around on telegram i bet. dont doubt it one bit. smh
I remember when deepfakes meant putting Nic Cage in all the movies, ah good times.
Good ol’ times
Now it's about putting something in nic cage
The sheer embarrassment of this situation feels so tangible, I don't know how one could live with that
bots. on a serious topic video.
MrBeast IS THE ĀŅŤŀ christ
He's definitely gonna be known as the Deepfake guy for the rest of his days
@@radicallybeanbots on every video
I’m better than Penguinz0 my content is better😎😎
I am really worried about this, especially in terms of crimes and trials. It has become so easy to frame someone of doing something gross which can be used during a divorce or just something illegal. Even if some IT specialists can prove that a "photo" or a "video" is a deep fake, it still can be so devastating for a person even if he didn't lose a trial. Drama usually stays for eternity in one's reputation.
What I'm trying to say is AI faking action of real people can run out of control extremely easy.
So Disney is not capable of doing s proper Deepfake in one shot but we supposed to believe this can be done in court videos.
De-lu-sio-nal.
So the reason why thst won't be a huge problem ris because it need thousand of hours of video and sound to make it even kinda believable so no normal person would be able to be deep fake only people in front of cameras often and if you in front of a camera often you have easier time to prove your innocent
@@bobberry1463 even if that’s true, anyone who has hours of content of them - with, or WITHOUT their knowledge - even just a few solid minutes of footage / other stuff and you can still do this. there’s been deepfake porn of people who are pretty chaste online (like billie eilish). and i’m sure people who are looking for that kind of thing won’t care if it’s poor quality or not as long as it’s close enough
for sure it would be harder for people with less footage of them up / less public jobs, but even still, it’s better to protect everyone - silly example but genuinely: but does flo from progressive not deserve the same amount of privacy as any of us?
just something to think about. 100% always better to make something more safe than less safe.
We need to stop believing video evidence is infallible, that's the main takeaway here.
@@heavenwaits "does flo from progressive not deserve the same amount of privacy as any of us?"
I don't see how it's a violation of privacy.
If you're still trying to defend it, just imagine if it was someone you loved.
@Edward don’t try to say “EVERYONE”. Don’t lump other people in with your gross behavior.
@@justasmltwngir1732 get the fuck outa here. You're gonna tell me you never flicked the bean to anyone besides the pornstar you're watching? Oh what you don't watch porn? Naw let's not even stop there you don't even tug em out right? Do you hear yourself. Everyone that masterbates has their "inspiration" cause you sure as hell ain't jerking it to literally nothing, regardless of whether or not you watch porn.
And my reply to op, I would know it's not someone I loved, so no it wouldn't bother me one bit. That said in my opinion deepfakes should be labeled as a deepfake and not tried to be passed off as real.
@Edward Deepfake porn has gotten so advanced that it's quite hard to distinguish real from fake. Without proper evidence to disprove the deepfake, it can easily be used to slander. Although I do agree that deep fake porn of celebrities and social media personalities will always exist, that's just the price of fame. But what is disgusting is that it was of his friends and even his friends girlfriend.
I won’t lie there are more horny motherfuckers than acceptable here. People really will sexualise anything, even the most wholesome humans imaginable.
@Edward everybody has fantasized if even a little about somebody attractive in their life but not everyone seeks deepfake porn of them. That shit is definitely weird man
The guy just got married in August, he had to admit he paid for those deep fakes. I'd die of embarrassment.
daaaaaaammnnn omg
Plenty of dudes buy porn now a days and hide it from their wives
and his bestfriend's girlfriend who made his wedding cake for free is also on the deepfake website. disgusting behavior.
anyone who pays for porn is too far gone tbh
@@yxnnah yikes, i knew it was bad but didn't know it was THIS bad. What was bro thinking when he made that decision
im glad that someone with a large platform is talking about this issue. ive seen this exact thing happen to many female streamers, but because they were smaller, nobody said anything
I love sweet Anita deepfakes
Cry more
@@THIRTEENTH13TH bruh what
@@Justin-rt7yp get out
@@Justin-rt7yp 😐
It’s honestly surreal that deepfaking has reached such a point to where it’s starting to become an actual concern.
This is technology that humanity simply isn't mature enough for. Imagine videos of crimes and the like being doctored with deepfakes in order to falsely imprison people. That's one of the first things that came to my mind when I was first introduced to it.
It just makes me sick how so many miserable people spend so much time of their day making fake videos and cheap deepfake animations on people. They deliberately do it to harm and Destroy peoples reputations.
And I don't get why they are so desperate to see somebody uncovered in the first place, acting like middle schoolers who have never seen a woman's body before
It’s been a “concern” since 2015 it’s not really new nowadays
it was 100% clear from the second the technology started to exist.
look at games from 1998 and today. it was inevitable that this technology is advanced as this.
I was once recorded without my consent in a sexual manner and was blackmailed. Even though it was my body I completely empathize with all of these streamers. Because I was never physically hurt, but just the act of seeing me (and in their case, their likeness) being shown without my consent was the most damaging shit ever.
Ya thats messed up, hopefully you got away from that person.
Absolutely insane that people have to come out and say "please don't put my face on a random naked woman's body"
the internet is beyond messed up
This has been happening for literally decades. It will continue to happen.
@@Elfenlied8675309 that’s a logic fallacy just because something has been the case for a long time doesn’t mean that there’s evidence stating it should continue to be
@@Elfenlied8675309 So has murder. But you don't defend that.
@@Celeborn93 nice comparison
@@Celeborn93 Absolutely good comparison. We have to think of black and white in these situations. Is this good or bad. It is bad. There cannot be middle ground especially when those whose likeness is being reproduced explicitly and implicitly stated they do not want to be used in that way.
Deepfakes have always irked me since I first heard of them a few years back, and knowing the technology for it is only getting better, that bad feeling isn’t going away. There’s so many ways it can be used in the wrong hands- blackmail, revenge porn, straight cruelty by sending it to friends/family, getting people fired, using CHILDREN, etc. It deeply, deeply unsettles me. The response to this whole streamer situation greatly concerns me, as well. These women are being openly objectified, having their privacy invaded in one of the worst ways possible, and some folks out here are so quick to brush it under the rug. It’s a shame.
deepfaking should probably be illegal idk
There were people claiming deep fakes were coming that you can't tell from reality 12 years ago.
Yea it is
You’re 100% right here!tbh technology is scary af nowadays between the ai art the deepfakes and the ai that emulate people’s voices it’s definitely risky af and could lead to a lot of horrible shit happening and it sucks seeing people being taken advantage of and having this horrible shit happening to em men women and children alike can be victim to ai deepfakes of all kinds including adult content. The thought alone makes me feel sick! I can’t imagine what the victims in this must be feeling and going thru tbh
holy shit you're right people can make actual cp off it damn
I remember this one adult actress who had been affected by DeepFakes being applied on their content speaking out about this, it was something along the lines of “You’re taking something that I have produced for people to enjoy and you’re twisting it into a situation where the other party has literally no consent in the matter.”
Adult DeepFake content is just such a strange thing to me, on the one hand I can to an extent admire the technology, but we really should be drawing the line at DeepFake memes for comedic value, cause when you cross into the territory of things being practically undetectable to the naked eye, it can genuinely ruin people’s lives.
Honestly it's hard to say where to draw the line. Comedic value is subjective and there can be stances within memes that people don't want associated with their faces. Even the film industry has morality issues about deepfaking. Good quote btw.
If you remember who, drop their name. I want to see their full side of the story.
@@MammalianCreature I think it was in one of Shane Dawson’s old conspiracy videos on Deepfakes, long before all The weird shit about him came out. i believe it was his video “Conspiracy Theories with Shane Dawson” from 2019. I’d say the DeepFake section of that video was probably the last interesting thing he made tbh.
@@craigyeah1052 I say in memes for instance like what Corridor Digital does, they had that Keanu Reeves stops a robbery deepfake that was quite impressive, or like when people do stuff like deepfake nicholas cage into random scenes.
I agree though, it’s kind of hard to tell where EXACTLY to draw the line with this kind of stuff.
@@MammalianCreature 🤨📷
Crazy how we've all forgotten about empathy and the golden rule. If you dont want this done to you, why do it to others?
I truly dont care if someone made deep fakes of me, id even say they are real if they looked good
@@iloveplasticbottles Yea just because you think deepfakes of you aren't a big deal, doesn't mean it's the same for others. People have varying opinions about this kinda stuff. And it's best to be on the safe side and never make deepfake porn in the first place. And if you plan to do so, just make some of yourself 😒
It's so crazy to me that some of these guys are defending it exactly why other men look bad. I'm not really surprised that most of the victims are women. They abuse the use freedom of speech and the amendments rights to justify the most digenerate thing ever.
@@yabrofenkoI think you were refering to @drakeweddner.
It's weird to me that there are some folks out that you have to explain how morally unacceptable it is to exploit people on the internet with deep fake porn. Is their moral compass broken? This is about respect towards other people and consent.
I wonder the exact same thing. The lack of empathy astounds me.
@@aurea. I feel sometimes like the whole internet is full of soulless sociopaths, but then I remember a lot of people on the internet are teens thinking it’s cool to “not care,” if you know what I mean.
It's what happens when viewing tons of porn is considered normal especially for guys so there's nothing stopping them other than themselves in how far deep they go
"consent" Do you complain about comics making fun of specific people? They certainly don't get "consent" to rail on about people.
This is so delusional, because it assumes that these celebrities are actually being affected (exploited!!) by the deepfake industry. I like how everyone has just ignored that this has literally flown under the radar until this weird incident by a streamer. So much proof of it having a real tangible affect on celebrities in general that it took a random giant streamer to have clicked on an ad to suddenly bring this to internet wide attention.
There is real criticisms to be had like the companies labeling the porn as REAL!!! and it being charged for, IE, making money off of someones likeness/copyright without their consent, but the idea that porn deepfakes are inherently themselves *evil* is actual karen behavior. These do not have actual effects on anyone, except in the small exclusive cases of direct blackmail, and even THAT situation is bullshit because there's also tech that detects deepfakes that is constantly evolving alongside AI tech, so any faked blackmail is usually very quickly detected as fake and thrown out. Charlies example of "muh companies pull the trigger like winnie hut junior" is not a problem of deepfakes, but companies being pussies, and one that should be ironed out. I'm sure a few cases of potential "leaks" turning out to be fake resulting in triggerhappy companies looking like dumbasses will curb this behavior.
There is no such thing as a moral compass.
Dude. My biggest concern is how this could be used to frame innocent people for crimes. The state or a malicious person could take someone down by just making up video evidence of them committing crimes they never did. If this happens a lot, we could end up in a world where video evidence can no longer be used in court (unless we find a way to better identify AI deep fakes).
Video, audio, everything. A post truth world.
Yh that true or used to create more diversion by creating like fake hate crime ect u never know with western govornents
Security camera software could add features to grant access to footage (where it is viewed on its official website/app), that way it can't be fabricated
I could be wrong but wouldn't you still be able to use metadata? Phone/camera manufacturers could use some sort of verification system to confirm whether a video was real. Obviously a lot of videos/images would no longer be valid but I don't think its entirely a lost cause.
It just should be illegal in all honesty
i had a similar experience. i was only 16 and one of my friends (20 at the time) spread around fake screenshots of me saying sexual things to him. it was all for a “joke” but people genuinely thought it was real, and he played into it. it’s such a disgusting feeling to avoid touching anything remotely sexual as a woman, only for men to decide you don’t get to choose whether or not you’re considered sexual. they turn you into an object for their own pleasure so easily and it’s terrifying. it makes you feel like you don’t have a right to your own body anymore.
Cry more.
@@Justin-rt7yp Whoa there buddy, bad day today huh?
@@Justin-rt7yp hope the fbi finds your secret hard drive
Men vs Women? You are as bad as the guy honestly. Be a human.
@@Justin-rt7yp Step away from the underage girls
It's scary because anyone could suffer from this, imagine you have a crazy ex who wants to ruin your life and deepfakes you in that type of scenario and sends It to everyone who knows you
Seems like the issue is the crazy ex not the deepfakes
@@eugenelevin9809 only a man could type this holy shit
@@llamarelish4701 cool it with the misandry, being a man or not has nothing to do with it, im a man and i dont agree with him lets not generalize a gender
@@llukeglanton no it’s true, literally ONLY a man could say something as stupid as that. im sorry you cant realize that.
@@alyssaaaaa04 the misandry is insane ngl
My biggest concern is when people begin deep faking individuals doing illegal things. Someone doesn't like someone and so they deep fake them doing something horrible to a child and it's hard to disprove in court. On the flip side of that, once it becomes well known enough, deep faking becomes a serious issue mudding photographic evidence, suddenly it's hard to prove someone who is guilty based on photographic evidence because they can claim it was deep faked.
"deep faking becomes a serious issue mudding photographic evidence"
Yeah, but that's not necessarily a bad thing. There's a lot of innocent people in prison, so having more prosecutions fall apart because of the dubiousness of video evidence could be a positive thing.
no. photographic evidence will become meaningless
Already happening. We have seen "algorihmicaly enhanced" footage admitted in court trials.
Even if it's not purposeful deepfakes, it's still extrapolated data, something that never actualy existed
@@peacemaster8117 there is more criminals getting away than innocent people in jail. The justice system is already a joke. Its not a good thing at all
Its never gonna get better when all the top streamers just softball him...
i'm glad charlie understands how disturbing it actually is, because i've seen some youtubers say that the only problem in this situation is that the guy watching deepfake porn didn't do enough privacy measures and that watching this stuff is okay if you're not talking about it publicly.
I think the nuance that it only matters now that it is more realistic, where as it was more humorous (at least the ones he had of him) before is a bit wrong though. It is absolutely worse for the person being faked, as it's harder to deny and refute that it's them, and would make them feel far more invaded.
But, for the creator, and the consumer it is exactly the same. The intent is the same whether it looks a bit off or not. They are still trying to create an adult video that the person did not consent for.
@@PaulFilmer If anything, it's now easier for anyone to do anything, because they can blame it on deepfakes
@@protomato6427 This is irrelevant to this situation as a whole, and this comment specifically, but thanks for the insight.
i mean...think about it...if nobody knew, what would the ethical issue be? Like, for example, would it be unethical for someone to draw a picture of a celebrity naked, and then jerk off to it? I feel like people just love shaming others when it comes to anything sexual cause they're desperate for people to think that they're not the weird one, when in fact, we're all weird.
@@nGUNNARp you would feel very different if it was you having sex with a minor to F your life up! Yea… this is real bad.
It’s gross, especially cause it’s not even limited to celebrities, influencers, streamers, etc. Even average every day people are getting deepfaked and it’s especially sad because with celebrities, we can assume it’s not real, but these every day people having their name and faces connected to these videos- recruiters and hiring managers aren’t going to go that extra step to see if it’s legitimate. It usually isn’t until people get sent links to the porn videos that they learn what’s being done.
Cry more
@@Justin-rt7yp seems like somebody's cranky
I mean, I agree deepfakes are pretty messed up. But don't people draw stuff like that too? How come people haven't been talking about that? Sure it isn't the real person, but their likeness is still being used and they're still being objectified
What's the issue though? As long as everyone knows it's fake all this crying seems like white knighting.
It only became a issue when someone was getting paid.
this should be illegal, it's terrifying.
I mean… the idea of even having those kinda of tabs open while streaming is mind blowing, it’s like going to a public bathroom without locking the door.
Ikr!! Dude has a serious problem lmao.
It's like going into a public bathroom stall to pull down your pants , insert a banana, then bend over and leave the door open expecting no one is gonna see that goofy shit.
Those doors usually don't have working locks
When I do stream (rarely), I make sure every bit of software that isnt related to streaming is closed, not minimized. I'm not ashamed or into anything illegal, just like, even accidentally flashing OBS bothers me. then after stream I do a full restart to make sure obs itself is off. Too safe? maybe, but better than this guy
@@Fox_Olive is it that hard to create another user account in your operating system? damn.
You can hear the sheer sadness in Charlie’s voice as he talks about his friend getting caught for doing these horrible things. So sad.
Sure, he knows well that he committed a mistake by not vetting the people he calls his friends
how is this horrible 😭😂
@@csolisr bro, come on, how do you vet your friends sexual porn interests... this isn't something you go out of your way to find out.
@@MegaMuff420 So you would be fine with people making deep fake porn of your gf or mom or sister?
why is it horrible he got caught? its way more horrible that he consumed that kind of malicious media in the first place 😐
thank you for talking about this aspect of the issue. I've seen so many absolutely horrible takes of people saying "its not their bodies" "normal people would get over it in a week theyre upset for clout" "its weird and creepy but not harmful" and its absolutely infuriating so hopefully some people will see this and understand the gravity of whats been done. Like Charlie said: imagine you posted a selfie once in your life and now someone took that and put it on a body--not some animated 3D thing, a real convincing looking body--thats not yours and is its doing something extremely pornographic. A full sex tape. Now imagine your parents have seen it. Your friends. Your significant other. Your kids if you have them. Your boss and all your coworkers. If someone looks for your name that video comes up. People might ask you to do porn or say explicit things to you now. You are treated differently. All of this, completely without your knowledge or consent, just because someone thought your selfie was hot and felt entitled to make porn of you to sell for a profit and for their own pleasure.
Edit: I want to clarify that the example I used here is a hyperbole that I only used to try and help relate the situation to a more "normal" level. It takes much more than a selfie to replicate and deepfake a face so I don't want anyone to panic or be fearful, that was never my intention and I'm sorry if it came off that way. I was simply hoping to provide that perspective to help some people who might not fully grasp how seriously this can harm someone who doesn't do sexual content and did not consent to it
I’d probably say that it’s a deepfake.
So you are saying life of a porn actor is a nightmare? Well, you are partially correct, but it's a problem of prudes, not porn.
@@protomato6427 Difference is, porn actors signed into it. They're aware this is happening or could happen to them because they're literally the ones in the videos doing it. If you're not a porn actor, you wouldn't want to be treated like one out of nowhere. Of course, there are still some stuff that people say and do towards porn actors and sex workers in general that they shouldn't have to put up with.
Think about victims of sexual assault? They could be revictimized by seeing that. It also takes away consent. That super hard on people mentally.
it'd be better imo if it was their body but just someone else's face.
Not everything that is legal is morally correct
Just like onlyfans
This is such a disturbing idea. And there's also the idea of "the Internet is forever." Even if every site and video hosting this content is blasted, there will be a few videos and images saved and reuploaded infinitely, or just some that get missed. It really shows people cannot stand having no access to every single part of a person, even the most intimate things. It's so gross how if you aren't sexual, there may be someone out there DETERMINED to make you so, or at least in ways you are not comfortable with.
No man, woman, or whatever should have such intimate content made of them without permission. Thanks for talking about the general topic.
Almost as if the internet was always full of weirdos and it never was a good idea to share personal information online. Luckily for them, social media threw all that caution out of the window.
It's actually more amazing how quickly information decays on the internet. Efforts by ArchiveTeam and Internet Archive show just how true this is. On random hard drives, sure, but as far as the actual internet... no. "The internet is forever" only happens with a lot of effort.
Internet + narcissistic Humans = I hope this current journey of my soul ends decades BEFORE AI rises and possibly (?) obliterate Human existence as we know it.
**my soul needs a fresh Human vessel to circuitously witness Planet Earth's end times.
@@SEEYAIAYE 401 comments on these channel jesus christ
The Lost Media community knows how untrue this adage really is. Most of the time, deleting a video off the internet pretty much removes it from human knowledge, no matter how much people try to get it back.
I always thought AI would take over by human looking robots, not deepfake porn. It's seriously sad that this shit happens. It's happening on Instagram too. I think this could also shed some light on porn addiction and the dangers of it. What I'm really worried about is creeps using this with minors.
Glad you mentioned the dangers of porn addiction; I'm an addict myself (hentai) and I'm trying to find some productive ways to distract myself from feeding it.
@@soaringsquid0.129 good luck. I know other porn addicts that had success with crocheting, chainmail (metal not emails), and miniature roses as house plants. Just need to keep your mind and hands productive.
@@soaringsquid0.129 Well, you better not take any looks at AI porn made with stable diffusion and stuff... now you can produce hentai shit with just a click
@@soaringsquid0.129 Good luck dude!
@@soaringsquid0.129 I wish you luck with your journey!!! Addictions are hard
This comment section is making me both enraged and dissapointed by the second. There’s plenty of sane people here, sure, but why can’t everyone understand that other people don’t want to be depicted in sexual situations? These kinds of disgusting slime are better off dead, along with pedophiles and other kinds of non-human filth that think consent is just a suggestion.
absolutely right
Are you gonna police peoples fantasies?
@@jerome1lmyes. Especially when it’s affecting other’s lives…
@@jerome1lm It's no longer a "fantasy" if you choose to make that fantasy into a material entity (deepfaked video) which can be seen by anyone, not just the fantasizer. Seek help.
@@cyanscrewdriver2092 amen
Was just talking about this with my boyfriend. I can’t imagine finding P of myself online without consent, or how it would make me feel. Everyone calling these women sensitive for being in this situation is heartless.
You went gently there with the "heartless", because I'd call them straight-out sociopaths.
Its not porn of anyone. You muppet. You have a girlfriend not a boyfriend for not catching you on this L take
you're lucky. i wish i had a bf. all i have is a gf & she has 2 bfs so it makes me even more jealous that i don't have a bf :(
I cant imagine how these girls feel
But they're NOT "pics of you" they're an AI generated guess.
Either way, you shouldn't be sending pornographic images to other people if you don't want the chance of other people getting a hold of it
What scares me is that people will start doing this with minors which will be ten times worse
Shit... you're right. What the hell has this world come to?
This can also be applied to AI generated art, it's all completely fucked. I sincerely hope this issue is made more prevalent to other people.
Already exists...
@@cutlassfury3393 no... this is nightmare you're lying. Lying is dishonorable
Supreme Court already ruled doing so to be CP
it's scary how much AI has progressed. if people can do this, I don't want to imagine what we can do in 20 years. you could frame someone for terrible things without taking a step out from your home.
Hopefully we’d also have a way to prove what’s real and what’s not.
Just a couple of years ago I remember trying some program that could say a few sentences in the voice of Homer Simpson, Squidward etc and it was still way off, now in 5 minutes I can make Oderus from GWAR recite bible passages and the realism is only determined by what audio clips I use as reference and by playing with the speech settings. I hate to see where this will be in just another 6 months.
Well some things are not fakeable
@@Holuunderbeere bet u in 10 year people will be able to fake someone stealing at walmart and using it as proof
IMO it'll reach the point where all videos and recordings will stop being considered as legal proof for anything because of how easy it is to fake them. I doubt you could end up sending someone to jail with a deepfake, but it'll still be problematic in terms of reputation and public opinion as people are easily swayed in social media.
When I was younger, I wanted to become a programmer and create stuff that's exactly like deepfake. Now that people have done it before I even had a chance, and I've seen what people have and maybe will use it for, I wish that it never had existed. Man, I sure love technology.
I actually had never known this was even a thing. This is kinda crazy with how it can’t be distinguished easily and can be used to ruin peoples reputations
Honestly, down the line it could get so bad at some point that even when managers do end up finding deepfaked porn of the person they’re about to hire-real or not-they probably just won’t feel inclined to give a shit. Like, it’ll happen so many times to anyone that they’re about to hire that it doesn’t even faze them anymore and they’re like “ah, ok, let’s just take a look at their resume”
If it does happen, revenge porn would basically lose its purpose and would just go on to be a regular nuisance that barely anyone bats an eye at or cares about because “yeah, yeah, they’re not the only one, y’know”, that itself would be kinda satisfying
Either that, or officials are probably gonna find some way to make deepfake porn production into an executionable offense
dont share info on the internet lol its your fault when people take advantage of the info YOU uploaded. Your face is PUBLIC when you put it in PUBLIC space. Id hate it if it happened to me so i didn't put my info or face on the fucking internet.
@@Jiffy-Liffy7714 They can't.
Honestly it really isn’t that genuine they look obviously faked
@@Lewisrobbie02 At the very least, it's apparently "convincing". When years ago it was "obvious", the problem is how it'd look a decade from now. Better act now before it gets to that point
I’m not even popular on the internet (at all), and someone did this shit to me just by gathering photos by following my instagram. They tried sending the fake photos or videos (I never opened them obviously) to all kinds of people I knew, including my own mother, just to blackmail me into sending them money. Luckily, I managed to spread awareness to everyone to simply delete the chat if this person messaged them. From what I know, no one saw they sent. At least no one currently in my life, thank god. This shit is a serious problem.
Woah that’s some pretty heavy (and illegal) shit. Do you know who it was?
Can't you like sue someone for this?
It's likely fake like when you get an email that says they recorded you through your webcam and to pay them.
@@Donovarkhallum That wa s a big skype tactice.
That is awful... sorry
charlie hit the nail right in the ass. The biggest issue is not just the fact that its a hug violation, but it also has horrible implications to be used as blackmail and harassment. Not to mention the potential for deep faked images of children/underage streamers. It is absolutely sickening.
@oh no cringe detector In case you are being manually this pathetic or you read responses to your pathetic bot, stop. You're not going to achieve anything by doing this.
You can use software to prove that a video is deepfake with 100% accuracy, you'd have to be a idiot to get blackmailed with the video equivalent of photoshop
That's always been a thing though, this is just an evolution of photoshop and previous visualisation methods. Before the camera people were saying the same thing about using sketch artists for Police work, "what if they could draw it wrong and convict the wrong person"
Make it illegal to do things that will cause international incidents like deepfaking the President or CEO of companies, the same as it is to fraud or slander. But to make it illegal to whack off to someone without their consent....? Naive and entirely unenforceable and not at all MORAL anyway, nobody can tell you what is right or wrong to think about, a hundred years ago it was "MORALLY WRONG" according to almost everyone for people to be homosexual.
Exactly. The people who can't see this have stones for brains
@oh no cringe detector Shut up, bot
You've gained quite a bit of respect from me, for calling out someone even though you consider them a friend. Takes guts
Huge amount of respect for Charlie with how he handled this video.
🤖🤖🤖🤖🤖🤖🤖🤖
🦿🦿🤖🦾🤖🦿🦾🤖🦾🦿🦾🤖🤖🤖🤖🤖🤖🤖
I agree. Charlies the Man
ironic coming from a bot that has a sexualized image in the profile to make discord mods click on it
@@not_so_anon_person_on_the_8250 Does that mean they copied the comment?
It's only a matter of time until some teenager creates a revenge porn deepfake of someone they don't like at school. In that situation, a person's life will probably drastically change forever, and it's hard to say if the truth would ever come out. It would also create a whole new weird legal problem. Would a teenager deepfaking their classmate into porn be classified as a sex crime against a minor? I lean towards yes, but I also have a feeling that it will take almost a decade for widespread legal precedent to be established.
Dang I never thought of it like that, like I thought most comments were blowing this out of proportion. But your comment really opened my eyes honestly thank you
Something like that actually happened to me. I managed to stop the spread of the fake content at least before the people currently in my life could see them.
@@thehale_ That's messed up. I'm sorry you went through that
@@thehale_ Jesus Christ what is wrong with people
bro u giving people ideas 😭
thanks for speaking up about this. the fact that people disregard this is insane.
They won't be disregarding it when it happens to them
@@candy-ninja who in the world would wanna see corn of an ugly loser who gets off on ignoring women's consent though? :/
@Edward booooooo 👎
Because it's hard to care about people who speak of morality when their community is rather depraved.
@Edward bro got pissy cuz i boo’d him 💀
This isn't just weird, it should be illegal.
It can destroy careers and lives
Meh, publishing/selling can make a case for making illegal. Don't see a problem in keeping it private.
@@Cruz474 It's a violation. It's inhumane, grotesque and criminal. Doesn't matter if it's private or not
@@hithisthers7214 Ok then why aren't photoshops banned.
@@Cruz474 Photoshops are still vile and those should be banned too.
Stop thinking from the perspective of someone tugging their meat, and start thinking from the perspective of the victims
@@hithisthers7214 I'm thinking from an omniscient perspective. I was just looking to see for your consistency. I see it, so I have no quarrel with you. Respect.
I was so naive when deep fakes first dropped. I was just thinking, "oh man, now I can make myself look like a superhero or a movie star or something". Sucks that this is the reality of the situation.
Oh, very innocent. Teens (and the cia too tbh) will use it to make fake sex tapes to blackmail others, criminals will use it to frame others, courtrooms are already using deepfake experts.
i bet you are excited for human-like robots to vacuum your house.
@cats one of the industries that most rapidly attempts to adopt content related technology is the pornography industry, so its not really that hard to predict if you've given it some thought
Oh you sweet summer child. Just wait till A.I. winter.
i thought the same thing. but that’s sadly the very surface and optimistic way of using deepfake. we all know it’s going to be used for much more devious and evil intentions
The whole deepfake thing is honestly scary. It can ruin someone's life super quick.
yep, but so far i think it's only ruined atrioc's.
Meh, e-celeb fanfiction has existed for years. It's exactly the same thing - fantasizing about having a sexual parasocial relationship with your favorite celebrity. People who are into that are definitely creepy weirdos, but complaining that it's "unethical", especially when some of these e-celebs are encouraging parasocial relationships in the first place, is pretty hypocritical.
There's of course the whole other angle of passing it off as real, but there are fake news about e-celebs all the time. This is something you cannot police. Best you can do is take responsibility, not encourage parasocial relationship with your fans (especially sexual ones like OnlyFans), and make sure you're not associated with it. Nobody is looking up DeepFakes of Michael Stevens from Vsauce.
IT will ruin someone getting caught but definitely will not ruin the one being deepfaked, unless they chose to be ruined. (because people would just comfort and give sympathy to the indicated victims) this is digital harassment in the making.
It’s easy af to prove a video is deepfaked, no matter how realistic it looks. There is software that can determine deepfakes based on a number of factors and it’s 100% accurate
Too be fair at the point it’s at now people can still tell if the video is fake or not
This stuff pisses me off so much. The responses to it like “they posted pics of their face online they knew this could happen” or “so now it’s illegal to find someone attractive or fantasize about them” or genuinely one of the worst “all women should be ok with this because only fans exists”. It’s genuinely infuriating to see response like this
Fan fiction always have been weird, especially when they are about real people. Turning them into images makes it even worse.
@@Quisl the thing is, you always knew that it was fanfiction and not real. But most people can’t even tell these pictures are fake.
@@Quisl
No it hasn't
@@bobstevenson3130 “Can’t tell they’re fake” like you aren’t accessing the content on a website that has “deepFAKE” in the title
@@SSchemeS they mean images might be posted and seen by people who dont know its fake
Lawmakers really need to completely ban this. Imagine if they deepfake a persons face who is under age and then put it on the body of someone of age, and say it’s “ok” because the persons body is of age. Very horrific slippery slope here. Disgusting!
Human brains aren't adapted to seeing ourselves doing sexual things that we don't remember doing. In this case it's because we didn't actually do it, but the point stands. It's deeply psychologically disturbing. Dr. K on HealthyGamerGG did a video on this situation as well, it's really insightful.
What about Photoshop?
@@AkiRa22084 We're not adapted to that either, but it's one frame and we can, over time, convince ourselves that it's fake and we didn't do that. I'm not saying it's no big deal - editing anyone into sexual stuff that they didn't consent to is wrong.
But video is a new level that we're not adapted to as a culture, let alone as a species. Our culture still considers video to be a standard of evidence. Even leaving alone that seeing yourself do something you'd never do is damaging to your mental health, it's still damaging to professional reputation and personal relationships, not to mention potential criminal investigations.
The CIA & FBI have already used slander and framing to take down people they decided to, people that we now consider to be on the right side of history. I have no doubts they'll do it again and worse.
Also my friend just brought up fake revenge porn, which leads to "evidence" of cheating to break up relationships, or maybe non-sexual content that becomes "evidence" that you should be written out of your parents' will, for example.
@@aliceh5289they did ? Genuinely curious
I can't imagine children will be safe from this. This needs to be outlawed and should be upheld in court.
Didn’t even think ab this. Ew
this has already happened with children and the government has done NOTHING. if any good can come out of this situation, i think it’s that people are gaining awareness for just how scary the ease of these actions are and can actually lead to some change to prevent them
@@DreamsOfLennox sad thing is i doubt it will reach enough of the people that need to hear it
Problem is.. how do you outlaw it? This tech is open source so its too late to ban.
It’s already illegal to use the tech on children, in theory, but it’s kind of impossible to prevent.
How could you possibility ban it?
To me, the biggest thing that disgusts me that nobody is talking about is how he brought this deep family struggle and streamed it for everyone. These streamers are sick in the head for bringing so much personal shit on stream. Deepfakes are stupid and weird, but bringing your wife on stream to cry in a very awkward way is weird as well.
that's what i also dont get. before looking it up i thought she caught him doing it, and forced him to come out, but nah, it was his chat, so why is she there? and also why is this so public. if he need to excuse, do it to the offended party. not an open video that is only going to make it worse for everyone involved.
@@marcosdheleno He said he didnt sleep at all the entire night it happened. So not only was he on lack of any sleep, this is his entire career that he has been building up for years and he probably saw it flashing before his eyes that everything could be gone so he felt he had to do something. Hindsight is 20/20, not everyone can make perfect decisions all the time. Its easy to see why what he did was wrong after it happened.
His wife was there because she also wanted to say something in the apology stream, I doubt Atrioc asked her to be there
Dude wanted a fast fap and found no good material so he created some himself. Is that really that bad? I hope he releases the full material, I am getting bored
I feel like it was done on purpose so more people would try to do this and get into it. People that had no idea deep fakes existed. Now that it is public knowledge, watch everything disturbing that happens (political figure gaffes etc) now can just be excused as deepfakes
even if the person faked does make there own sexual content, there own business gets undermined by bootleggers basically, and even if someone consents to be photographed nude in there own content, that doesnt mean they consent to being deepfaked. thanks for talking about this charlie, ur a stand up guy
Cheap knockoffs are cheap knockoffs bud. It aint illegal to sell fake jordans
@@HybridxProject Yes it is
@frogmen it literally is illegal to sell fake jordans. That's counterfeit product
Edit: also glad to know that women are yet another product. Maybe try to see women as human being and not sexual products to produce "cheap knockoffs" of
@@quixoticvalkyrie Lol no selling fake jordans is only illegal if you try to sell them as though they are real, knowing that theyre fake. If these deep fakes are being advertised as fake, then thats it. And drop the woman are objects nonsense, same applies to mens deepfakes. As charlie has stated this has also happened to him.
Go where the money is, if i don't want to be "undermined". Price match that shit, or cry a river.
We need some serious privacy laws. Stuff like this is only going to get worse because so much of our data is making other people money.
This should be treated the same as blackmail porn
Yup. I'm sure laws will be made here federally soon because as of now, not many can do anything at all except get them removed.
In the internet though... Things always make its way back
This is going to help the rule 34 community ngl
yes, more laws, more state intervention. that always helps
i’d be surprised if it doesn’t get outlawed at some point because i can already see people deepfaking politicians onto criminals like murders and shit
I feel kinda sad to feel the need to say this, but charlie, thank you for using your platform to show people how to be a decent person. You're a good man. Truly. I know he is your friend, but I appreciate you making sure this problem is brought to people's attention. I dont think a lot of people would do what you do, so just, thank you
Acknowledge my hatred of Whigglets
Oh look, another sycophant in the youtube comment section. Who'd have thought?
get a grip
@@zoomingby what advantage did they gain? UA-cam likes? Lmao
You're just looking for reasons to be angry
the dickriding is crazy
The comments section here is a breath of relief compared to the replies I’ve seen on twitter about this. Lots of porn addicted men saying “well it’s fake get over it” or “being famous comes with these type of consequences”. Lots of trolls just typing “lol” on one girl’s video where she’s talks about being a victim and is clearly upset about this whole thing.
These lowlifes have no empathy. It’s a symptom of being terminally online and addicted to porn.
Your talking about the same women that will whore themselves out for only fans but a deepfake is to much lmao cry more
Most of the comments are a breath of relief, but just look a the reply section to big comments, they're the opposite
The AI art situation had a lot of these apathetic tech bros call artists "aspiring McDonalds workers", saying art wasn't a real job and that people should lose their jobs because that's what happened with tech advances in the past. You'd think living in an advanced society would have people try to work things like that out instead of letting AI run rampant like this, but here we are.
@@Mrhellslayerz There will always be work for real artists, AI technology is a boon for businesses, instead of hiring several hundred workers you can instead hire a dozen and have them work alongside the AI. Even deep fakes have their uses in business. I would love to see a young William Shatner have a surprise appearance in a Star Trek TV show or movie.
Deep fake porn needs to be illegal unless you get consent from the person who's face you are using and the sex worker you aren't giving credit too.
Alot of men are lost mentally nowadays
Sorta off topic but Charlie seems like one of those friends who would tell you your screwing up before it’s too late. It’s honestly refreshing to see someone be so blunt and honest about something when it’s about a friend. I’m sure it’s a very difficult situation but I think that it’s pretty inspiring to see that how he’s willing to stand by his morals and say it’s wrong despite having reservations, not wanting to add negativity to the situation and caring about the other person.
I also think that not only just not having consented to being deep-faked (which is still extremely horrible), someone that these women thought was their friend paid for that content and imagined about having s3x with them. Personally, that would make me extremely uncomfortable.
Deepfake is no different than using your imagination. People fantasize about having sex with their friends, deepfake is just making it digital
The deep fake thing is definitely weird and creepy but the idea Atrioc/basically any straight male hasn’t fantasized at least a little about basically every vaguely attractive woman in their life does seem to show a lack of understand of how male sexuality actually works. They just keep it to themselves. It’s the actions based on the thoughts that are the issue not really the thoughts themselves.
@@soccrplayr232 thinking about it is one thing, humans are sexual creatures, but acting on it is so weird. this is a different kind of creepy than just fantasizing about someone you think is attractive. especially bc it was people he knew personally??? nah that’s demented. this is a really good way to gauge someone’s morality and i’m genuinely surprised so many people are stupid enough to think it’s okay. i’m like genuinely shocked this even happened 💀
Wait a minute... You think your male friends don't fantasize about doing unspeakable things to you?
That's a staple fantasy right next to planning hiw to fight different animals.
@@juno3281 I mean honestly I wouldn't do it and its a little messed up but not sure it was all that big of a deal. The dude selling the stuff is the actual issue not so much Atrioc looking at it and the reaction against him feels a bit exaggerated since he's the only real face to put to it and people are angry it could happen. Also not sure it was so clear he was there looking up people he knew even if they were on the site, I could easily buy he was going to do a segment about AI/deep fakes soon and went down a weird rabbit hole and got caught with it on his computer because he never closes tabs. Still weird just not sure he needs to quit streaming over it.
There’s going to be a lot of “deep fake“ related issues that come from this. Whether it comes from streamers influencers business owners and even politicians. This whole scenario opens up a lot of “plausible deniability” for people that partake in these predatory activities. That is why I am most upset.
I like to see where this takes... since humanity couldnt help their balls to always make advance tech for no reason, i know it is always been wrong but i like to see how far extent human beings will learn the hardway
the old that wasn't me, the video was faked argument is only going to get better because of deep fakes.
This has been around for YEARS. Nothing will change
You’re late deep facts have been around for a while and many websites have been shutdown , Phub was forced to remove every deep fake video back in like 2015-2016
what I'm concerned most about is this being used as a way to bully people. Primerily kids in schools, this can ruin their lives and follow them forever.
As someone who grew up on the internet this is 100% going to be misused with teens. There is a reason why so many young teens used snapchat and it's not innocent at all.
You can't deepfake a person unless you feed the AI tens or hundreds of thousands of photos of someone's face, hence why it's only streamers/celebrities who are being targeted. If you're not a high profile public figure then it's currently impossible to deepfake with current technology. AI can't just magically know what your face looks like from every angle.
Maybe in the future that will change but for now I'd say school kids are safe.
This has already happened, a MOM deepfaked videos some girls that her daughter had a problem with, of them smoking, drinking, and having sex, in an attempt to get them booted off the cheerleading squad.
@@seekittycat but wouldn't the same young teens know that it was fake, so they wouldn't fall for it, and instead look at whoever posted it sideways?
But maybe the damage of association to that image is great enough to leave a permanent scar in their minds. But haven't kids done that since forever? Especially female bullies, since they resort to other things than violence, like reputation destruction? I seem to remember the one girl that committed suicide but made a video where she flipped through cards before she did it? I forgot the name, but it was a while ago. I remember finebros reaction channel did a video on it.
@@RS-fy9hb Association can definitely change your life, no matter if you prove your innocence, sadly.
"It's not their real body, why do they care?" = "It's only a drawing/animation/cartoon, plus she's 5000 years-old! Why do they care?"
nahida reference?
I've berated and abandoned a friend or two who've tried to defend nahida r34. It is so disgusting
This whole situation show how many people don't understand how consent works and it's really disgusting
They know, they just don't care about it. Not knowing sounds like a shitty excuse
@@n1ppe That's actually a fair point and perspective!
no, they KNOW how it works, they just dont care for it.
@@GodhandPhemto based struggler
I don't think you know how consent works in conjunction with being an online celebrity.
Porn stars give their consent to have their bodies shown online in any form as they have sold their body.
Streamers consent to having their face be used outside their stream indirectly through free media practices.
The "apology video" was the worst idea ever:
1.) More attention to the AI thing
2.) More stuff for drama channels
3.) More attention to the AI thing due to drama channels
4.) Bringing your wife into for 0 reason
5.) Literally will never go away due to the 15 min video
It was great marketing
well, the alternative is to say nothing and then people call him out and draw attention because he won't apologize and well... the apology or lack thereof didn't matter much, it was gonna blow up.
@@jumhed994 Fuck marketing. There are some times where you need to value self-esteem more.
@Zanaki a written apology and let it blow up for a while (which it will do no matter what). This just escelated it way more and made it way more drama and clip friendly.
The only two sides that gained something from the video are drama channels and AI stuff producers.
Before I say what I’m about to say, I don’t condone Atriocs actions whatsoever. Just friendly discourse of the subject :)
I think points 1-3 are the same point; I don’t think the apology stream or whatever it was he did would bring much more outside attention than the original incident. With more coverage from more channels,and news sites, and drama sites and so on, any attention created by those outlets are ultimately out of his control.
Bringing his wife on beside him could definitely just be a comfort thing, but I agree it’s out of place and the guy is basically just a business oriented psychologist so I wouldn’t be surprised if it served a different purpose.
Yeah, none of the fucked up deepfakes will ever stop unfortunately because there will always be piece of shit losers who could care less about what other people feel. I did hear the original website was taken down, but things like this never really go away in an environment where anyone can do anything almost entirely anonymously.
I used to watch Atrioc from time to time to fill silence so it’s possible that I could have some subconscious bias, but still it’s deeply saddening to see that he would use his friends like that in such an abhorrent way.
I learned it’s just important to remember that UA-camrs and Streamers will only ever show you the part of themselves that they want you to see.
Oh my god I wrote a fucking essay again sorry about that; anyway that’s all
There are also now advanced deepfaked voices, I've heard them and they sound very very real. Not choppy, not stuttering. All you need is a snippet of someone speaking with no music behind it. This is getting INSANE. I guess a "positive" side is it can only do american accents well.
Schaffrillas when he watches The Owl House instead of paying attention to the road. 🚗💥
@@furryprideworldwide816 username checks out
Not for a lot longer as intern goes
I've seen a few of those as well for VTubers. It is terrifyingly accurate. You can't tell the difference without someone telling you prior.
@@mee091000 Oh my god no they're the perfect target, they have hours of voice samples that could be used
You see this could be even worse in the case of what if someone tried to deepfake an underage person
My thoughts exactly. This shit is dangerous
all this shit is horrendous and creepy, i feel bad for anyone affected. Is there or will there ever be a way to stop from this to happen?
i’m not sure, the way technology is progressing it is getting harder to control and catch up to. i guess one way is to have these adult sites be monitored better. maybe every video should have written consent with footage before being able to be uploaded and confirmed.
Nope lol, deepfakes will keep on being created which is the sad reality. Mostly to appeal for the creeps that wish they saw someone in specific without clothes
stop people from making fake images in photoshop, etc.?
lmao, dont be ridiculous, snowflake
dont share info on the internet lol its your fault when people take advantage of the info YOU uploaded. Your face is PUBLIC when you put it in PUBLIC space. Id hate it if it happened to me so i didn't put my info or face on the fucking internet.
Our government could get off their ass and pass laws making it illegal to make or distribute falsified porn using someone’s likeness without their consent. It would be an ongoing battle, just like it is to have revenge porn or CP removed from sites
Damn, felt it when he had to call someone he used to be friends with a porn addicted loser. He almost held back a little bit but he came in strong and fought his own urge to sugar coat someone he wishes didnt make such a horrible mistake.
I don’t think that comment was directed at him specifically
@Dawn A mistake can be something you've done deliberately, saying that it was a mistake does not on its own minimize the seriousness of the act.
@generic femboy how would you know that??? People are so stupid nowadays it’s crazy
@generic femboy That could be, this video is my only exposure to the streamer in question, I was just commenting about language.
It started with faking porn of celebrities. The next big thing under that is content creators who are internet celebrities. Kinda makes sense that this was going to happen. It was just a question of "when will it happen".
Also before AI generated, it was people who looked similar but not exactly identical to famous people who were doing porn and branding themselves as "fake X famous person" but the titles were always "X famous person doing Y"
So again. It was just a matter of time. Doesn't mean it is ok though.
The really terrifying thing is that it probably won't be long until this kind of tech is easily accessible, like chat GPT. Its gonna become a huge problem when some creepy fuck can just use someone's instagram to generate this kind of content...
@@RubiixCat It's going to be easily accessible, almost impossibly un-enforceable at a large-scale, and going to get more finer tuned and powerful as learning sets and designs improve. Not a question of if, which is scary.
and then after that, it's gonna be normal people, and people are gonna use it not just for porn and shit like that but also to fake incriminating evidence in court that could ruin people's lives
@@RubiixCat This is already easily possible. If you post a bikini pic, it can be turned into a nude by AI very easily.
I was wondering when Charlie would cover this topic, and I’m very glad that he did so in the way he did. He was respectful and did his best to make everyone aware of what was going on without making a huge deal out of it in a way that would draw even more attention to it.
The interesting plot point of this whole story is that not only was the ad not malware but was a legitimate site with a paywall and then the traffic exploded on the site to the point it was taken down
The whole "clicked an ad" excuse is such bs, phub does not allow deepfake ads so he went and searched for the videos and paid for it too
@@queefqueefington Is it possible he used phub as a generalization? That's how I took it, atleast
@@queefqueefington yeah uh pH also dosent allow cp but before the wipe they had a scary amount of cp
I can't imagine going through something as serious as having intimate images of myself spread on the internet, fake or not. Stunts like this harms people. Real people! It's unacceptable!
Fake or not!? Wtf? Theres a night and day difference
@@krotchlickmeugh627 As Charlie mentioned, deepfake technology is apparently so advanced that you wouldn't know if it's real or not. Anyone could make deepfake porn of you, me, our grandparents, or anyone we could think of
@@krotchlickmeugh627 did you even watch the video? Charlie explains exactly why it's harmful, real or not
@@krotchlickmeugh627 might want to re-watch the video, Charlie lays it out quite clearly.
Oh god, I can't believe there are critikal Stan accounts now
I like Charlie because he is chill. He sees porn of himself he didnt know existed and laughs, but immediately understands how dangerous it is. Dude can laugh and be serious at the same time.
The Charlie effect
Schaffrillas when he watches The Owl House instead of paying attention to the road. 🚗💥
Wish we had less insecure content creators and more Charlie's
No it's because he's a guy he doesn't care when it happens to him, but when women have it happen they break down.
@@WadeAlma not wanting porn of yourself on the internet is not only a woman thing guys dont want it too ☠️
Really appreciate your language use here because it feels intentional and important; referring to the women he looked at as "colleagues" and "coworkers".
I think something that seems a bit lost in this discussion due to it being on Twitch is that this is a man who fully looked at porn of the people he works with. In any other "real world" scenario, if your coworker got caught with porn of you and your colleagues (regardless of the gender) on his work computer there would be an immediate understanding about how absolutely disgusting and straight up creepy that is, and how rightfully so in that situation, that colleague would most likely lose their job because of sexual harassment.
I hope the women who are victims of this really have more support not just from their user base, but from Twitch directly, because this is just unacceptable behavior that's eerily being described as "normative".
TL;DR, close your porn tabs at work.
Starting to understand why my dad would tell me it really is all men
@@lioreubm677 Yes and no. (Basically) all men have a sex drive wired like this, not all men act on it. Many men are good people and dont indulge in their darker side like this
@@eligedzelman5127 “darker side”? The way you’re describing men makes it sound like all men have dark, horrible, twisted urges, but the ones who don’t act on those are good. That’s not how people work, and it’s not okay to generalize men in that way. Some people do have bad thoughts and bad urges, while some don’t. And yes, it is scientifically proven that males have more libido than females, but that doesn’t make them inherently creepy. It’s okay to have sexual urges; most people do.
Thank you for explaining the relationship between none sexual content creators and how hard they’ve tried to stray away from that. I feel like a lot of people are just blind to this and confused on why people like pokimane cheers her only fans friends on yet she’s upset about this. Their takes are sooo wrong and I hope they listen to you and understand this. You explained it well.
I hate the fact that they're people in this world who willingly do this to other people. Deepfakes can be funny for a meme but when that technology gets into the wrong hands it has devastating consequences. Thank you for speaking on this
Tyre Nichols deserved it because he shouldn't have ran from the pigs.
@@Unknown_Genius I get what you're saying. Memes are usually lighthearted like deepfakes of Nicholas Cage I've seen. Porn is a whole other level, it can be detrimental to the victim's wellbeing, safety and public perception of them.
@@iino07 Deepfaking someone in a meme-y context is still taking their image and using it in some way they don’t necessarily consent to. And memes can also mess with someone’s life: the person in the Bad Luck Brian meme is a good example. Its still exploitative, and should be morally wrong: just because the intent is less sinister doesn’t mean crossing the boundary should be ok.
This whole situation is upsetting but is to some extent ubiquitous. People have been being exploited since people started pulling their phones out every time someone or something slightly catches their attention. There’s already very little consideration given to how the subjects feel or what they consent to, so it’s unsurprising that something like this whole situation is deemed ok by a shockingly large amount of people.
How can y’all seriously be butthurt about people plastering someone’s face over someone else’s naked body? How far does your pettiness go?
@@Neo2266. Someone could use that blackmail you, fuck with your personal relationships, cast doubt on your credibility. More reasons besides. It’s like someone having a you-seeking missile. Even if they never use it, it’s disconcerting that they have the ability to hurt you any moment they choose.
This is really upsetting. I’d feel so vulnerable. You can’t do anything to stop it. I’d hate to have their family to see that or be sent to them. Really sick stuff.
@Patrick39i agree.ur videos r better
@Patrick39 1k comments on this channel 😂 you envy him
Technically if there aren't enough pics/vids of your face then you can't be deepfaked, yet another reason to remain anon
if someone made deepfake pron of me and sent it to my family, I think that would be hilarious and I doubt my family would care once they knew what deepfakes were.... stop being a victim and caring so much lmfao
@@maxtm3000 well that’s you . Some people live in poor religious countries and would literally get killed if their family saw it
Didn't even know that existed. How desperate can one person get....
Being able to manipulate a persons image is crazy. Tons of fake blackmail potential.
theres ai deepfake of voices too, you can legit make someone say anything you want politics and literally everything's about to go more downhill than it ever has
@@captainmycaptain8334 reminds me of the movie Tango & Cash, where a cop (Stallone & Russell) got deepfake voice recording that caused them to go to prison
If it's fake blackmail ignore it don't be so gullible
@@crypticslayer69 the issue isnt any of us being gullible its other people that dont know its fake, you can say its fake but the tech is so good it wouldnt convince everyone, look at the state of the media now where obviously doctored/staged shit ends up making the news and dividing people, can you imagine stuff that isnt so obvious. Not to mention the population of gen x and boomers who believe everything they fucking see on fb and twitter. trying to convince them its fake would be like trying to convince a fish to crawl out the water
Photoshop?
You turning this video from the "Fault of Atrioc" to "The Bigger Issue at Hand" is a very smart way of tackling this. Hopefully something can get done about this, whether it be more regulation or something.
I've been aware of this for a long while now, but only seen it with big name celebrities. I genuinely think this needs to be classified as revenge porn.
With technology improving it will become dangerous imo
Nah revenge porn is something different, people don't make deepfakes to take revenge on people. It's mostly people in relationship that break up and then the one of the person takes revenge on them by leaking their nudes
@pringln bot obv
@@user-cb9cd5qj6k go away bot
@@SansTheSkeleton_ tbh i could see ppl commissioning deepfakes of their exes to try and "get them back"
love how so many people are outing themselves rn
I’ll always be amazed by how casual people are with porn, just leaving it up on their browsers and forgetting about it
his job is to be watched whilst on a computer also... so stupid
They dont know the concept of "incognito"
That's why I would install two separate web browser applications for different sort of things for this reason.
@@johnrenwelmauro2387 I don't get it either. I'm a single adult man and use incognito on my password protected phone religiously if I'm looking at porn, for no other reason than I just don't want weird adds/search suggestions, etc. These guys career are on the net, the stakes are infinitely higher for them. Fucking lunacy
I don’t know why I’m so paranoid but I got another app that is protected by Face ID and cannot be opened with my password if someone somehow guessed
It
This genuinely scares the shit out of me where it’ll be in another 10 years. The Snapchat crying filter is already wild enough. Once this becomes widespread and people are able to use it maliciously. The future is really quite scary
The ramifications of this are gut-wrenching. Just one tech-savvy person can ruin someone's entire life over something they didn't do. Even Inside Edition did a report on something like this. A student had a video deepfaked of her vaping and the video was sent all over including to her parents, just because the student was a cheerleader and some other cheerleader didn't like her. It's unfathomable, to have one's life ruined over something they never even thought of doing, it could be anything, porn, a crime, anything compromising or embarrassing really. I just can't put this into words.
@@AVI-lh6rm just one random dude with a knife in their pocket can kill someone.
@@tedhaidydei8261 this is something a random person will be able to do tho and destroy someone’s life. It’s scary man. And it’s only gonna get more advanced
@@SagaciousTv Ok. Any random person can kill anyone as well. What im trying to say is that things like these are preventable. We can prevent them by making them illegal and severely punishing people who commit them . Just like murder, assault and pretty much anything else.
@Ted Haidydei Good luck. Piracy is illegal, yet its still widespread and goes unpunished.