The U.S. Wants to Arrest People Who Make Deepfakes
- Published Jun 18, 2024
- Watch The Full Philip DeFranco Show: ua-cam.com/users/defranco?sub_...
Subscribe for more Quickie News: ua-cam.com/users/defranco?sub_...
----------
Produced by: Cory Ray, Philip DeFranco
Edited by: James Girardier, Maxwell Enright, Julie Goldberg, Christian Meeks, Matthew Henry
Art Department: William Crespo
Writing/Research: Lili Stenn, Philip DeFranco
------------
#AI #TaylorSwift #JennaOrtega
Isn't it awesome how quickly laws can be developed to combat bad things but only when celebrities are the largest target
Even if the biggest names are celebrities, the fact is that if someone gets enough good-quality photos, they can make pornographic images or even videos of you. In a world where our bodies are private and should only be shown to those we trust, that SHOULD be illegal.
@@NIX-FLIX exactly. there have already been cases where this tech has been used to harm ordinary people
@@bighillraft im saying if this was a problem that only affected ordinary people, it'd be years before we saw the issue tackled at the level we're seeing today; it's only moving this fast because it targeted elites early on
@@bighillraft you have no examples of this happening, bot.
Yeah, not surprising, considering that the people most often deepfaked are the rich, famous, and powerful.
Yeah, I get the feeling that limiting international affairs in Israel is too hard, but focusing on absolutely meaningless, endless battles is easy
It's been used on normal people as well. Here in the Netherlands it happened to some girls. It's horrible, and whether someone is famous or not, it's still fcked up.
01:24
Yeah, it's still fucked up either way. Would you want a deepfake porn video of you spreading around the internet? Have empathy
@@HYv47 It just doesn't matter. People will see it, talk about it for a day or two, then completely forget it in a week. There is so much porn of hella people who didn't want the vid uploaded, yet no one actually knows who they are.
How about making all deep fakes involving unwilling participants illegal...
@@ggexgaming6020 Yeah. What if I made deepfake porn of myself? Could I be charged with it if I don't care? What about my friend who does porn? If I make a deepfake of them, would it be illegal? What's funny is that the folks who already do porn are actually in the clear for this type of stuff. You can already be seen naked, so no one has an incentive to deepfake you. And even then, 99% of people will never be deepfaked. But let's say at least 25% of that 99% is deepfaked; no one is gonna give a fuck.
@@ggexgaming6020 *highest?
Because if it's not pornographic it can technically fall within the rights of parody.
For example if you wanted to replace an entire movie's characters with other actors.
That sort of thing is a crime against humanity to ban.
Thankfully AI will become an underground thing accessible to people who seek it out. It's like ghost guns: you can't stop the proliferation of AI or keep it out of the hands of the general public.
How about no.
It's an interesting question, what should and shouldn't be allowed? I'm certainly on board with ruling out deepfake porn, but I'm not sure I see the harm in turning Nicholas Cage into Lois Lane. He might get annoyed, but I'm not sure it warrants legal action. Though, to be fair, I'm not the one getting turned into Lois Lane, so I might feel differently if the tables were turned.😂
The rich lobbying to protect themselves but not children
What a crazy world... Never did I think this would be a real headline. When I was a kid this kind of stuff was crazy sci-fi
It's a great thing at the end of the day, but I'm not surprised it only became a big deal once influential people started being the victims.
Deep fake porn doesn't even look real lol
That's the funniest part about it. Plus, how tf do you even track who made the image? You could make it and then post it from a ghost account at a Starbucks. The only way they'd know who it was is if they figured out the exact deepfake site, but even then, who's to say they used a site and not a custom AI they made, or a less-safeguarded image-to-image model.
@@SozioTheRogue genuinely, 99% of the time it's a local model, which you cannot track because it has no tracking built in. If you're running AUTO1111 on a local PC with a model like PicX, you're quite literally immune to this law. The only way to protect people is to go after the source, but that's way too far gone now given how popular it is. Take down AUTO1111 and 10 clones will appear; take down the PicX model from CivitAI and 10 more torrents for it will appear. So at best, you have to monitor social media at all times to make sure no one is sharing this stuff, and take it down if it looks AI-made.
For now. It will get much more realistic eventually.
The fact that a venture capital firm that invested in deepfakes is against this bill is wild. You would think they would be the front-runners in making sure their technology isn't abused. Something isn't adding up.
Simply ask yourself: for what reason does deepfaking exist? It's a primarily malicious tool.
People pay for deepfakes. It's not surprising but it is disgusting that lobbyists would be against this.
Why is this a thing? Oh, the people getting deepfaked.
How is this a bad thing, exactly? Y'all act like just because it's mainly celebrities getting hit, it shouldn't be illegal. Tf?
Yeah, good luck with that. The genie isn’t going back in the bottle.
There are plenty of foreigners who make AI images. They'll just keep doing it
In the US, a bill rarely passes. I don't know if this bill is just going to sit in limbo or if it's actually going to get somewhere.
Wait till they realize the people making and sharing these things do not even live on the same half of the earth as these laws
I missed this one. Ty for sharing, you do great work narrating and summing up situations. Keep doing diligent work 💪
I'm sorry, but if my favorite celebrity tried to sue over possession of videos, there would be problems.
My initial instinct when I saw the title was "GOOD", but now I have to wonder why it's Ted Cruz that wants this.
Cory Chase has a lot of content that people could easily use to create art/parody/satire/criticism with Cruz in it, so he doesn’t care about free speech.
Okay, so who blocked it and why?
How is this constitutional? Fake images created with Photoshop have been a thing for years now and can be considered art, criticism, satire, parody or other kinds of protected speech. AI created images are fundamentally no different than photoshop or even literally cutting and pasting with scissors and glue. Assuming the subjects of such fake images are adults, such a ban should appear prima facie to violate the 1st Amendment.
What are you really arguing *for* here? That it's criticism or satire if people deepfake their coworkers and neighbors and send it around the internet? You really think this expression is tolerable and acceptable? The "art" of exposing someone intimately deserves protection? Because it's what's happening right now. You should realize that this is a crime against a person no matter what the law states. The people that have this happen to them are endlessly victimized by the internet. How would you feel if a family member unalived themselves over this kind of harassment?
@@zachbryant8405 Harassment is already criminalized under many different statutes. You don't need a federal law that outlaws AI fakes and stifles free speech to prevent or prosecute legitimate harassment. Are you arguing in favor of the government imprisoning people for creating artistic content that it doesn't like? That's some scary authoritarian BS...
@@ForceSmart what we're really talking about here is enforcing the social contract in a digital age. You cannot forcibly parade someone around naked to a crowd, nor can you forcibly look at them naked as the only witness. Why should you get to do that digitally? No part of this fits in the social contract and honestly should become part of common law.
Yeah, no. I'm not for this. But I think certain intents and uses of deepfakes can be more easily classified under existing defamation law, and the government can ease the standards for civil action on it.
Exactly
Defamation requires false statements of fact. Rarely are nude deepfakes ever passed off as factual or truthful, nor would most reasonable people consider them such. Fundamentally, creating such images is protected speech and can be considered art, criticism, parody, satire, etc. if they do not meet the legal standards for defamatory speech. It doesn't matter if they're created with AI, Photoshop, or good old-fashioned scissors and glue.
I love how even something like this has its hands in the political system.
Damn they got billie ??
Link?
😩
Nah that’s crazy
ain't no way lil bro 💀
Dude, get your life in order. You're better than this...
Man no more AOC 😢😢😢
Sad news for doppelgangers everywhere :(
Okay but like, is it illegal to draw Taylor swift naked? Who gives a shit about this?
Yes. That’s freedom of speech. Fake images, whether created by AI, photoshop, or cut and pasted with scissors and glue, are all forms of protected speech (art, criticism, parody, satire, etc) and banning such content (especially if depicting adults) would be unconstitutional. It should be no surprise that lawmakers don’t really care about constitutional rights though. If such a bill does become law, it should be struck down by the courts.
@@ForceSmart I would argue that there is clear and present danger to making deep fakes as many people's first feelings when they find out are of suicide, especially in people who are not public figures. It is extreme emotional harm that decent people would not shrug off as tolerable. It is the same concept as revenge porn except it can be done to anyone, even you and your family. All someone needs is a picture of your mother's or sister's face and bam. That should not be something you don't mind happening without their consent.
@@zachbryant8405 There is no clear and present danger. You grossly misuse the term, in part because there is no inherent threat. It is also not society's ethical or legal duty to make sure people's feelings are never hurt. Free speech can hurt, but just because it might hurt does not make it illegal. That's called liberty. However, harassment is already criminalized under many different statutes. You don't need a federal law that outlaws AI fakes and stifles free speech to prevent or prosecute legitimate harassment. Are you arguing in favor of the government imprisoning people for creating artistic content that it doesn't like? That's some scary authoritarian BS...
@@ForceSmart if it's a gross misuse, then how does harassment not also violate the 1st amendment? Where is the clear and present danger if emotional harm and wellbeing do not qualify? Because that is the basis of harassment claims. From how you've phrased it, if harassment law is enough to cover this, then the new law would just be a bolstering of harassment laws. None of this changes the fact that nonconsensual AI porn is a complete and total violation of personhood. The ability to choose when others see you naked is an innate right, and it is cruel and unusual to lose that right. It's not just hurt feelings. It is lasting damage that propagates like a virus. As long as even one creep has the photo, someone can be found and harassed further. To drive my point home: if this violation of one's free will to control their nakedness is art to you, then do you consider rape to be performance art?
You can't stop the signal
This is 100% unimportant. Just the rich applying new rules to us but never themselves.
Lawmakers fail in general
If there's been a 464% increase in this porn, doesn't that also mean society is messed up to the point where the audience for it is growing at the same pace? I don't think this can be fixed at this rate. Good luck, people.
Rich people protecting rich people . . . Funny how quick they jump to protect the rich and powerful
Hey we’re not na- oh yeah those people? Fair enough.
The bear doesn't post nonconsensual porn of women online, leading to endless harassment spanning years, but men do. Not the comment section arguing muh first amendment is more important. These people are victims, and spreading porn of someone online is not parody or satire. The manufacturing and mass distribution of unwilling porn of nonpublic figures is a crime against humanity. The people of the world will adapt their laws accordingly.
And they should
Now I agree with Ted Cruz, Are you happy now AI people!?
Good! This crap can literally ruin people's lives. The only thing worse in my opinion is kiddy stuff. Why do you want to promote messing up other people's lives by subscribing to that crap?
Yeah good point I don’t honestly know why Christians are even allowed into this country. You do make a good point about ideological drivel being a bad thing for productive society. Really hard for me to figure out why more people don’t say the truth?
Good, it's about time, because someone making a deepfake of you could get you in real trouble
Sounds like an easily protected right? Maybe don’t let idiot corporations fire their employees for having more hustle than their slavers?
@@uncertaintytoworldpeace3650 youre a weird guy
Ted Cruz watches cousin pronz. That being said, yes, something needs to happen.
Excellent
"We say thing and then we fight and then nothing happens"
If I had a nickel...
They shall start with you
Good.
I wanna see some deepfakes of Ted Cruz now
💯 YES
I'm getting tired of all of these abbreviations.
Someday, a cat could walk across a keyboard and be guaranteed to type out an abbreviation that already exists
Yikes
Based
I find it interesting that we need a bill to “allow someone to sue” under a set of specific circumstances. Seems to me that one should be able to sue anyone for any perceived crime and that the justice system should be more streamlined and less punitive.
Yeah… I can see why that wouldn’t work. 😂
Also, the Senate may have been able to find bipartisan cooperation, but the House of Representatives is just a mess. I do hope that they come up with a meaningful bill that can be enforced, but I'm not holding my breath.
Hell yeah
The _Land of the Free._
Where you can fake being a citizen, but not fake an image of an actual citizen. 🤪
ur more than welcome to move :)...unless youre too broke :)
Good. Very good.
yes something needs to be done but frankly any bill made by this guy should be side eyed
I'm unwilling to agree with Cruz on anything, so I'll agree with the co-sponsors of the bill.
AI porn is fucking disgusting. It feels like digital rape. I can't imagine what I would think if that happened to me; I feel terrible for anyone who has been affected.
I feel any punishment more than being sued for defamation is overkill and does not fit the crime.
I see it as a violation of the First Amendment (in most situations)
Never mind, this is more focused on revenge p*rn, which should have heavier consequences. However, my previous comments on AI p*rn and recreational p*rn art in general still stand.
@@ggexgaming6020 I've been worry about deepfake stuff ever since a co worker showed me what he could do with it the ability to frame someone in my eyes is only going to improve as ai gets better. I hope it doesn't get to that point but people have been convicted on less evidence than a video with their face in it.
@@dragconen well yea obviously, using deep fakes with the intention of hurting someone should be illegal. I am merely arguing for recreational use of deepfakes for personal use. (like making AI p*rn for yourself or sharing it on dedicated xxx website as free content)
100% based
Better arrest fox news!!!
womp womp