It’s Getting Harder to Spot a Deep Fake Video

  • Published Jun 7, 2024
  • Fake videos and audio keep getting better, faster and easier to make, increasing the mind-blowing technology's potential for harm if put in the wrong hands. Bloomberg QuickTake explains how good deep fakes have gotten in the last few months, and what's being done to counter them.
    Video by Henry Baker, Christian Capestany
    Like this video? Subscribe: ua-cam.com/users/Bloomberg?sub_...
    Become a Quicktake Member for exclusive perks: ua-cam.com/users/bloombergjoin
    QuickTake Originals is Bloomberg's official premium video channel. We bring you insights and analysis from business, science, and technology experts who are shaping our future. We’re home to Hello World, Giant Leap, Storylines, and the series powering CityLab, Bloomberg Businessweek, Bloomberg Green, and much more.
    Subscribe for business news, but not as you've known it: exclusive interviews, fascinating profiles, data-driven analysis, and the latest in tech innovation from around the world.
    Visit our partner channel QuickTake News for breaking global news and insight in an instant.

COMMENTS • 5K

  • @business
    @business  3 роки тому +409

    We have some exciting news! We’re launching channel Memberships for just $0.99 a month. You’ll get access to members-only posts and videos, live Q&As with Bloomberg reporters, business trivia, badges, emojis and more.
    Join us: ua-cam.com/users/bloombergjoin

    • @52sees
      @52sees 3 роки тому +2

      Epic

    • @Amu_LEGEND
      @Amu_LEGEND 3 роки тому +5

      Okay
      👁️ 👁️
      👄

    • @lightningfun6486
      @lightningfun6486 2 роки тому +4

      What

    • @TheProfessor66
      @TheProfessor66 2 роки тому

      "Warfare campaign" that aged poorly with the Ukraine president deepfake.

    • @deleted9388
      @deleted9388 2 роки тому +2

      Anyone with 3 million subs is a COINTELPRO shill

  • @520lun
    @520lun 3 роки тому +14314

    2018: Deep fake is dangerous
    2020: DAME DA NE

  • @brianmchaney7473
    @brianmchaney7473 5 років тому +7584

    2008: Pics or it didn't happen.
    2018: Pics are a lie.

    • @AFCA-vn9bl
      @AFCA-vn9bl 5 років тому +343

      Brian McHaney pics have been a lie since Photoshop, but now even videos are a lie

    • @wolfenstien13
      @wolfenstien13 5 років тому +187

      Remember it or it didn't happen,
      Write it or it didn't happen,
      Paint it or it didn't happen,
      Print it or it didn't happen,
      Photograph it or it didn't happen,
      Record it or it didn't happen,
      Take a picture of it or it didn't happen,
      Video tape it or it didn't happen,
      What now?

    • @efloof9314
      @efloof9314 5 років тому +145

      White Coyote, pull out your brain, hook it up to a system, and show the memory of it happening

    • @Tofu3435
      @Tofu3435 5 років тому +14

      @@wolfenstien13 now? nothing happening.

    • @Dig_Duke_SFM
      @Dig_Duke_SFM 5 років тому +68

      @@DrumToTheBassWoop
      A literal human sacrifice to Satan to reveal the truth. Or it didn't happen.

  • @aguyonasiteontheinternet578
    @aguyonasiteontheinternet578 Рік тому +235

    The real terrifying thing about this video is that it was uploaded 4 years ago.

    • @audiobyamp4459
      @audiobyamp4459 Рік тому +14

      I'm starting to notice that all of the videos with startling information are usually old

    • @whizzerbrown1349
      @whizzerbrown1349 Рік тому +1

      So far the only deepfakes I've seen popping up have been videos of members of parliament playing sweaty League of Legends matches, so personally the doom and gloom of this video has started eroding lol

    • @963freeme
      @963freeme 11 місяців тому +1

      The newer videos of Trump look like deep fakes. In the Kanye West & Piers Morgan interview from last year, Kanye looked like a deep fake.

    • @jocogorenc7354
      @jocogorenc7354 7 днів тому

      Five :o

  • @myusernameis_pasword6860
    @myusernameis_pasword6860 3 роки тому +766

    I think rules and laws about deepfakes should be put in place before this gets any worse, because real harm can be done to people's reputations, and people can even claim innocence for crap they did say!

    • @river_acheron
      @river_acheron Рік тому

      How? Once something CAN be technologically done, there cannot be rules and laws to unlearn it. lol. Those that want to use deepfakes to scam are of course not going to listen to rules and laws against creating them!
      The ONLY solution here is to find a way to detect a deepfake from the real thing.

    • @tj_trout9855
      @tj_trout9855 Рік тому +14

      Surely no one will break the law!

    • @myusernameis_pasword6860
      @myusernameis_pasword6860 Рік тому +25

      @@tj_trout9855 Of course people will, but at least with rules in place people have the ability to take action in court

    • @yoursleepparalysisdemon1828
      @yoursleepparalysisdemon1828 Рік тому +5

      it’s just technology at this point. you don’t seem to understand why banning it would be hindering tech. don’t hate what you don’t understand.

    • @myusernameis_pasword6860
      @myusernameis_pasword6860 Рік тому +30

      @@yoursleepparalysisdemon1828 I'm not hating on it, I think it has cool applications. You misunderstand my point of view. What I'm saying is that there should be laws in place to defend those whose images are being used to say things they never said. I don't want to ban it, I just want to make sure that there is protection in place in case people misuse this technology.

  • @nufizfunslower3438
    @nufizfunslower3438 3 роки тому +11993

    Imagine developing advanced technology and people using it to make memes

    • @joonatanlepind3124
      @joonatanlepind3124 3 роки тому +401

      lmao I love that it's happening.

    • @diwakardayal954
      @diwakardayal954 3 роки тому +373

      that's the internet

    • @JA-yz8eq
      @JA-yz8eq 3 роки тому +53

      this is just movie technology released, I'm sure, for the pure purpose of high-level plausible-deniability tampering

    • @joonatanlepind3124
      @joonatanlepind3124 3 роки тому +40

      @@JA-yz8eq no it's for making callmecarson sing big time rush

    • @leleled6467
      @leleled6467 3 роки тому +27

      I guess people are doing this in an attempt to corrupt it before it's used for the worst

  • @MrPatpuc
    @MrPatpuc 5 років тому +13827

    This is terrifying.
    Imagine when deepfake videos can frame innocent people as guilty.

    • @22z83
      @22z83 5 років тому +2016

      Well soon videos can't be used as evidence because of this

    • @billybadbean9077
      @billybadbean9077 5 років тому +885

      @@22z83 but it can still ruin lives

    • @treerexaudi
      @treerexaudi 5 років тому +387

      Unless it is 16K res it isn't trustworthy xD. Even a simple mask in a robbery I saw can make it look like someone else, just because of the low-quality camera. It is silly and scary.

    • @MrSirFluffy
      @MrSirFluffy 5 років тому +143

      You can fake at 4K and then lower the resolution to make it impossible to know if it's fake.

    • @deitrickorullian505
      @deitrickorullian505 5 років тому +405

      False allegations can be made against people without any real evidence to support them and people believe them. I can't imagine what this will do.

  • @jadkleb2788
    @jadkleb2788 3 роки тому +121

    Other than the funny comments and memes this is actually extremely terrifying...

    • @LAkadian
      @LAkadian 2 роки тому +7

      Actually, those are terrifying too, for their unabashed idiocy.

  • @willdwyer6782
    @willdwyer6782 Рік тому +114

    Putting Tom Hanks as Forrest Gump into archival TV footage could be considered early deepfake video. They digitally manipulated the lips of the other people in the scenes to move in sync with an impersonator's voice.

    • @yoursleepparalysisdemon1828
      @yoursleepparalysisdemon1828 Рік тому +1

      isn’t a deepfake using ai or something? iirc it was done differently.

    • @sarah69420
      @sarah69420 Рік тому +5

      @@yoursleepparalysisdemon1828 A deepfake is the general idea of creating a fake video/audio/picture of or including someone not originally there, or altering those who are; AI is just a tool to get that done

    • @yoursleepparalysisdemon1828
      @yoursleepparalysisdemon1828 Рік тому +2

      @@sarah69420
      the definition is that it was digitally altered.
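
For readers wondering what the face-swap technique discussed in this thread actually looks like: the hobbyist deepfakes of this era were built around an autoencoder with one shared encoder and one decoder per identity. Below is a minimal PyTorch sketch of that idea; the layer sizes, the 64x64 input, and the training details are illustrative assumptions, not the implementation of any specific tool.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: maps a cropped face image to a common latent code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs one person's face from the latent code."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 64, 16, 16)
        return self.net(h)

encoder = Encoder()
decoder_a = Decoder()  # would be trained to reconstruct person A's faces
decoder_b = Decoder()  # would be trained to reconstruct person B's faces

# After training, the "swap" is simply: encode a frame of A, decode it as B,
# so B's face appears with A's pose and expression.
face_a = torch.rand(1, 3, 64, 64)     # stand-in for a cropped video frame of person A
swapped = decoder_b(encoder(face_a))  # shape (1, 3, 64, 64)
```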

  • @ChrisSche
    @ChrisSche 5 років тому +8221

    It won’t be long until video, photographs, or audio recordings are no longer considered evidence in a court of law.

    • @JP-sm4cs
      @JP-sm4cs 5 років тому +547

      Make public broadcasts carry a near-invisible cryptographic watermark that gets distorted by modifications? But yeah, phone-based evidence is screwed

    • @eddyavailable
      @eddyavailable 4 роки тому +297

      audio is very easily edited and manipulated nowadays.

    • @dennydarkko
      @dennydarkko 4 роки тому +29

      Far Altright how do you know they haven’t already? 😂

    • @therogue9000
      @therogue9000 4 роки тому +150

      Yes they will... Deepfakes are pointless on security cams; they are not at the right angle and the video is usually too low-res.

    • @ev.c6
      @ev.c6 4 роки тому +112

      @@JP-sm4cs SURE. Like you can't fake the watermark either. You seem not to understand how complex the AI behind these deepfakes is. If they can fake someone's facial expressions in a video like that, just imagine how easy it is to put some stupid watermark in a few frames.
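
The watermark idea raised in this thread has a practical cousin that is easier to reason about: signing footage cryptographically when it is captured or broadcast, so that any later edit fails verification. This is a toy sketch using only Python's standard library; the frame bytes and the shared secret key are placeholder assumptions, and real provenance schemes use public-key signatures and standardized metadata rather than a shared HMAC key.

```python
import hashlib
import hmac

SECRET_KEY = b"broadcaster-signing-key"  # placeholder; a real system would use asymmetric keys

def fingerprint(frames):
    """Chain a hash over every frame so editing, removing, or reordering any frame changes the digest."""
    digest = hashlib.sha256()
    for frame_bytes in frames:
        digest.update(hashlib.sha256(frame_bytes).digest())
    return digest.digest()

def sign(frames):
    """Produce the tag the broadcaster would attach to the clip."""
    return hmac.new(SECRET_KEY, fingerprint(frames), hashlib.sha256).hexdigest()

def verify(frames, signature):
    """Anyone holding the key can check whether the clip still matches its tag."""
    return hmac.compare_digest(sign(frames), signature)

# The untouched clip verifies; a clip with one swapped frame does not.
original = [b"frame-0", b"frame-1", b"frame-2"]
tag = sign(original)
print(verify(original, tag))                                # True
print(verify([b"frame-0", b"deepfaked", b"frame-2"], tag))  # False
```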

  • @cjezinne
    @cjezinne 5 років тому +2635

    At first, I thought this was going to be bad... but then I saw the Nicolas Cage renders and then life made sense again

  • @arcosprey4811
    @arcosprey4811 Рік тому +23

    Imagine how much worse it is now.

  • @spetzy1921
    @spetzy1921 Рік тому +13

    This was 4 years ago. Let that sink in.

    • @sanmartinella4933
      @sanmartinella4933 Рік тому +5

      And this tool is offered to the public, imagine what our governments have.

  • @3p1cand3rs0n
    @3p1cand3rs0n 5 років тому +7287

    I seriously thought they were going to reveal that the guy at 0:56 was, himself, a deep fake.

    • @thoyo
      @thoyo 5 років тому +781

      same here! his lips didn't seem to match his speech and his eyes looked a bit dead

    • @andrelee7081
      @andrelee7081 5 років тому +654

      I think that's just the power of a potato cam.

    • @TitorEPK
      @TitorEPK 5 років тому +73

      You're ready for the future.

    • @tacubaeulalio
      @tacubaeulalio 5 років тому +40

      Does anyone know what they are talking about when they mention the weather patterns or flowers? That part honestly confused me. I take it as they can make fake videos of weather changing or flowers blooming but not sure why that would be useful ?

    • @mishaj2647
      @mishaj2647 5 років тому

      ElyssaAnderson b

  • @candifemale5118
    @candifemale5118 3 роки тому +7012

    Everyone before: Deepfake is so dangerous...
    Everyone now: *DAME DA NE*

  • @ghdhfgh6125
    @ghdhfgh6125 Рік тому +20

    This video was posted 4 years ago. Imagine how hard it is now.

  • @OsaZain
    @OsaZain 5 років тому +3716

    Imagine the potential for blackmail :/

    • @Dylan-hy2zj
      @Dylan-hy2zj 5 років тому +273

      OsaZain if anything the reverse is true: for any video posted of you doing bad things, you can just say it was a deepfake made to blackmail you

    • @madman2u
      @madman2u 5 років тому +205

      +Dylan Adams *Except* for the fact that it won't matter. Anything remotely real-looking is going to work against your interests. It wouldn't matter if it's a fake because people will still believe you did or said X. People's reputations and lives have been ruined for less, and without evidence.
      Say someone accuses X of being a liar and a cheat. X says the video is a deepfake. The accusation, while not necessarily true, conflicts with X's statement. You can either take the video as evidence or take the word of the person who has a vested interest in lying to protect themselves. It's a lose-lose scenario for the accused; it's just a matter of how much you'll lose. Even if the video is later proven to be fake, the damage will already have been done. Unfortunately, bad news is so much easier to believe. It's not ideal at all...
      What we can do to combat this is to be more wary of the so-called evidence people come up with. Being objective is important, and if there is any doubt then one should always err on the side of innocence rather than guilt.

    • @OsaZain
      @OsaZain 5 років тому +20

      madman2u People tend to believe malicious things much, much more easily than positive ones as well :(

    • @holyn8
      @holyn8 5 років тому +9

      Yeah, the potential for blackmail is going down to 0% because of this technology. You can't use videos as evidence anymore; everything you see on a screen could be faked

    • @elias_xp95
      @elias_xp95 5 років тому +6

      What blackmail? It's now easier to claim it as fake. It's the opposite effect of blackmail.

  • @ResoundGuy5
    @ResoundGuy5 5 років тому +915

    This is going to end badly...

    • @glynemartin
      @glynemartin 5 років тому +60

      it's not gonna end...that's the problem...

    • @wutsit2yuhhuh246
      @wutsit2yuhhuh246 5 років тому +70

      @benzo I think you trust your government a little bit too much.

    • @wutsit2yuhhuh246
      @wutsit2yuhhuh246 5 років тому +26

      @benzo "We'll know our disinformation program is complete when everything the American public believes is false."
      -Former CIA Director William Casey

    • @joeljarnefelt1269
      @joeljarnefelt1269 5 років тому +7

      @benzo He said, "This is going to end badly." Point out where he stated that the entire development of these programs should be terminated.

    • @joeljarnefelt1269
      @joeljarnefelt1269 5 років тому +5

      @benzo Maybe you wouldn't want to develop it, or maybe you are just expressing your concerns about the possible misuses of the emerging technology.

  • @bignickreacts
    @bignickreacts Рік тому +19

    I love how there are millions of issues in the world that need solutions, and instead, we figured out how to be more manipulative. 🤦‍♂️

  • @sifutophmasterofeyerolling2513
    @sifutophmasterofeyerolling2513 Рік тому +14

    It's insane how much the technology has improved in just 4 years; we now have almost perfect TTS voices as well.

    • @Truthorfib
      @Truthorfib Рік тому

      It's insane though that the focus was this instead of other things that truly benefit us as a whole. Goes to show where innovation is heading, and it's not toward our collective success, but toward things like misinformation and espionage.

  • @straightbusta2609
    @straightbusta2609 5 років тому +4287

    This is probably the secret behind the $1000 emoji machine from apple.

    • @Dominicn123
      @Dominicn123 5 років тому +21

      Face tracking has been around for years brah

    • @r32fandom89
      @r32fandom89 5 років тому +3

      ay u got 69 subscribers

    • @amfm4087
      @amfm4087 5 років тому +18

      No, because this requires hours of time and a decent graphics card for just a short clip. The iPhone uses a different technology, as the emojis are 3D models. This technology uses 2D pictures like JPEG and PNG.

    • @jerrell1169
      @jerrell1169 5 років тому +6

      Yeah they straight busta!

    • @Lou-C
      @Lou-C 5 років тому +8

      Me me big boy

  • @Ceshua
    @Ceshua 5 років тому +2507

    Back in the day when everyone said: "Video evidence can't lie."
    2018:
    (Edit)
    2020: Baka Mitai

    • @akkafietje137
      @akkafietje137 5 років тому +19

      I saw it with my own eyes

    • @KnightmareNight
      @KnightmareNight 4 роки тому +9

      Well, back then it couldn't. So they were still right.

    • @agentsmith9858
      @agentsmith9858 4 роки тому +21

      @@KnightmareNight you missed the point

    • @TheAnonyy
      @TheAnonyy 4 роки тому +13

      It could not lie then. Now it can. This is the problem with people embracing new technologies: you can't trust what you see, hear, feel, or smell; there are too many artificial things out there.

    • @rukna3775
      @rukna3775 4 роки тому +1

      Ok boomer

  • @brianorozco1074
    @brianorozco1074 Рік тому +10

    Honestly, this is terrifying

  • @tommydavidwalker2445
    @tommydavidwalker2445 Рік тому +4

    Damar Hamlin's people just used it today

  • @Jaylio
    @Jaylio 5 років тому +2694

    0:56 dude looks more CGI than the fakes

    • @fortheloveofnoise9298
      @fortheloveofnoise9298 5 років тому +215

      Those oddly fluttering lips.....wtf

    • @jcesplanada528
      @jcesplanada528 5 років тому +116

      I know, right. I really thought it was fake too

    • @hectorhector3819
      @hectorhector3819 5 років тому +7

      .

    • @SlatDogg
      @SlatDogg 5 років тому +88

      I seriously thought that someone deep faked that video just to prove a point.

    • @Atombender
      @Atombender 4 роки тому +21

      Until the end of the video I thought that it was fake. Damnit...

  • @Nismoronic
    @Nismoronic 5 років тому +1836

    Can I use it for memes tho

    • @lLl-fl7rv
      @lLl-fl7rv 5 років тому +11

      You're THE man.

    • @edd868
      @edd868 5 років тому +48

      Yes. Prepare for the oncoming deep fake meme war between 4chan and Reddit

    • @9yearoldepicgamersoldier129
      @9yearoldepicgamersoldier129 5 років тому +2

      Asking the real questions here.

    • @mariopokemon955
      @mariopokemon955 5 років тому

      Tesco Stig people have already, also can be used to start war but it's no biggie

    • @VOLAIRE
      @VOLAIRE 5 років тому

      Yeah memes aren’t a big deal ha

  • @timber8507
    @timber8507 2 роки тому +33

    I wonder if this technology has been or will be used in the war right now? It's absolutely something to consider when watching media today.

    • @nocaptainmatt3771
      @nocaptainmatt3771 2 роки тому +3

      Of course it is

    • @anetkasbzk98
      @anetkasbzk98 2 роки тому

      Bingo. Deep fake raW

    • @shelby1246
      @shelby1246 2 роки тому +2

      This comment aged well… I went watching deepfake videos after hearing about the recent one of Putin.

    • @AbelMaganaAvalos
      @AbelMaganaAvalos 2 роки тому +2

      Zelensky got deepfaked

    • @timber8507
      @timber8507 2 роки тому

      @@AbelMaganaAvalos Yeah, I saw that on the news.

  • @alinoprea54
    @alinoprea54 Рік тому +6

    This was 4 years ago.

  • @thehenryrobot
    @thehenryrobot 5 років тому +12218

    *This would never have happened if Nicholas Cage didn't exist* 😜

    • @justahuman2121
      @justahuman2121 5 років тому +375

      Just 2 likes? 0 comments? Pretty sure this comment will blow up one day.
      Edit: ok it's now 1.2k
      Edit: 4k now
      Edit: I bet it will hit 7k

    • @testname2635
      @testname2635 5 років тому +23

      @@justahuman2121 Agreed

    • @leonthethird7494
      @leonthethird7494 5 років тому +24

      HENRY THE RC CAR it's spelled Nicolas Cage

    • @tharv_2609
      @tharv_2609 5 років тому +5

      You everywhere

    • @mohammedraqib6418
      @mohammedraqib6418 5 років тому +4

      This is that day

  • @Cloudeusz
    @Cloudeusz 5 років тому +1256

    Technology is a double edged sword

    • @fellowcitizen
      @fellowcitizen 5 років тому +6

      ...that looks like a cup of tea.

    • @rvke5639
      @rvke5639 5 років тому +41

      with no handles

    • @user-ho1vt8vz2l
      @user-ho1vt8vz2l 5 років тому +1

      What is double edged sword then

    • @subzero5055
      @subzero5055 5 років тому +16

      @@user-ho1vt8vz2l you kill with it or get killed by it

    • @yungwhiticus8757
      @yungwhiticus8757 5 років тому

      Ad Victorium, Brother!

  • @double_lightsaber
    @double_lightsaber Рік тому +8

    To think this was 4 years ago....

  • @xtechn9cianx
    @xtechn9cianx 2 роки тому +17

    Imagine if the world leaders are using this on Putin right now

  • @ShufflingManu
    @ShufflingManu 5 років тому +545

    I am more concerned about influential people labelling real videos of them as deep fakes in order to avoid consequences than I am about someone trying to harm said people with deep fakes.

    • @OneEyeShadow
      @OneEyeShadow 5 років тому +12

      +Captain Caterpillar Like what? The entire point of the programme is to make it as seamless as possible - so when the technology is actually "there" that's not the case anymore.

    • @Fiufsciak
      @Fiufsciak 5 років тому +16

      @@OneEyeShadow Lol, nope. They may look seamless to a human eye but not to a software designed to expose fakes.

    • @swandive46
      @swandive46 5 років тому

      Like Trump?

    • @PureVikingPowers
      @PureVikingPowers 5 років тому +5

      @@swandive46 Is Trump even a real person? 🙄

    • @allenkennedy99
      @allenkennedy99 5 років тому +3

      That's actually very poor logic.

  • @TheAstronomyDude
    @TheAstronomyDude 5 років тому +831

    Nick Cage SHOULD be every actor in every movie.

  • @intreoo
    @intreoo Рік тому +6

    This is making me more and more paranoid about showing my face online. Not that I ever did though.

  • @shaunluckham1418
    @shaunluckham1418 Рік тому +3

    Simple rule don’t believe anything on television or online video. If you don’t see it in person it may or may not be compromised.

  • @yongamer
    @yongamer 5 років тому +320

    This can become so scary.

    • @YoungBlaze
      @YoungBlaze 5 років тому +3

      Like my exs mother!

    • @PasscodeAdvance
      @PasscodeAdvance 5 років тому +1

      I agree with the Internet person

    • @AthosRespecter
      @AthosRespecter 5 років тому

      @Throngdorr Mighty lol

    • @someonesomewhere6289
      @someonesomewhere6289 5 років тому +6

      @Throngdorr Mighty once this technology is developed further (and it will be), it doesn't matter how gullible you are, or if you're wise to all the tricks. We be fucked.

    • @yongamer
      @yongamer 5 років тому +2

      @Throngdorr Mighty The thing is that a significant proportion of people are dumb. And this technology is going to improve. I don't see why a fake video using this technology could not go viral.

  • @EnzoDraws
    @EnzoDraws 5 років тому +1313

    1:47 why tf does the source look faker than the deep fake?

  • @DD-yq1tj
    @DD-yq1tj 2 роки тому +9

    Interesting that this is getting suggested to me right now with the thumbnail of Putin 🤔

  • @coralevy-yo8dh
    @coralevy-yo8dh 6 місяців тому +1

    So many warnings in the form of movies, books, TV series, video games, etc. showing us why this is dangerous. We never listen.

  • @crispsandchats
    @crispsandchats 5 років тому +790

    remember everybody: just because you can, doesn’t mean you should

    • @HullsColby
      @HullsColby 5 років тому +22

      I can deep throat a banana.
      But since you said so maybe I shouldn't.

    • @erazure.
      @erazure. 5 років тому +22

      Hulls Colby just a single banana? Step your game up, 3 bananas at once or a large cucumber minimum

    • @missionpupa
      @missionpupa 5 років тому +6

      Cool comment, but do you know how tragic it would be if humans actually followed that and denied everything that makes us human: our curiosity and desire to progress. Scientists and engineers throughout history haven't done things because they should, but because they could. We do things because they are possible, and you can't stop that.

    • @fruitygarlic3601
      @fruitygarlic3601 5 років тому +13

      @@missionpupa Stop being so pedantic. If something should be done, do it. If not, then don't. Imagine reaching so far you find something to argue about in something that doesn't necessarily disagree with you.

    • @lol-fh3oq
      @lol-fh3oq 5 років тому +10

      @@missionpupa I mean obviously there's still gonna be people who do it but that doesn't mean it's right.. Lmao, why're you reaching?

  • @GIRru11
    @GIRru11 3 роки тому +2294

    Everyone: This stuff is dangerous and scary!
    Me: DAME DA NE DAME YO!

    • @haxxruz6284
      @haxxruz6284 3 роки тому +18

      Baka mitai best meme

    • @denniscuesta7009
      @denniscuesta7009 3 роки тому +14

      Putin singing baka mitai is the funniest

    • @aPandesalboi
      @aPandesalboi 3 роки тому +6

      You mean every memer

    • @bianca.611
      @bianca.611 3 роки тому +7

      i can hear and see these videos damn it.

    • @madjaster9620
      @madjaster9620 3 роки тому +5

      I came here from the yanderedev and others singing deepfake lmao

  • @CarterLundy10
    @CarterLundy10 Рік тому +4

    It's crazy that this was 5 years ago. It's just an everyday thing to see these deepfake videos now.

  • @DjHardstyler
    @DjHardstyler 8 місяців тому +3

    This aged well...already

  • @spicychipgaming2080
    @spicychipgaming2080 3 роки тому +814

    2018: deepfakes are dangerous and could harm other people
    2020: hamster sings Japanese game OST

    • @Starry_Wave
      @Starry_Wave 3 роки тому +16

      And Yandere Dev singing to the Big Time Rush opening theme.

    • @mangovibes2525
      @mangovibes2525 3 роки тому +3

      Doom Changer lol I just saw that vid

    • @user-on8vk5gb6x
      @user-on8vk5gb6x 3 роки тому +1

      DAME DAME

    • @Accidentalreef
      @Accidentalreef 3 роки тому

      Charles! Man i thought u died! Im happy your back!

  • @ceece3817
    @ceece3817 5 років тому +351

    Black mirror do ya thing

    • @InnovAce
      @InnovAce 5 років тому +1

      ceec e black mirror and Altered Carbon

  • @SomeTrippyCanadian
    @SomeTrippyCanadian Рік тому +3

    Sheesh this was 4 years ago! Just popped up on my feed.
    2023 and it’s just getting crazier

  • @venmis137
    @venmis137 2 роки тому +2

    2018: Deepfake is a terrifying, dangerous technology.
    2022: GENGHIS KHAN SINGS SUPER IDOL

  • @nicoh.1082
    @nicoh.1082 3 роки тому +243

    This is terrifying.
    Imagine Nicolas Cage playing in every movie..

  • @WholesomeLad
    @WholesomeLad 5 років тому +599

    It's also getting harder to spot a fake deep comment

    • @everyone9500
      @everyone9500 5 років тому +6

      oof you're here

    • @npc304
      @npc304 5 років тому +12

      It's also harder to not be a nazi in my book. And just remember, the NPC meme is dehumanizing. We are all unique and special

    • @JJ-te2pi
      @JJ-te2pi 5 років тому +16

      @@npc304 Youre boring. Dead meme.

    • @rixille
      @rixille 5 років тому

      How do we know who is real and who isn't? Mass confusion is a powerful way to separate society.

    • @misterrogerroger5537
      @misterrogerroger5537 5 років тому

      How can mirrors be real if our eyes aren't real

  • @cubycube9924
    @cubycube9924 Рік тому +4

    It’s been 4 years... I wonder what’s going on now...

  • @HTMLpopper
    @HTMLpopper Рік тому +2

    They warned us about this 4 YEARS AGO

  • @syrus1233
    @syrus1233 3 роки тому +1348

    "Deep fakes gained popularity through adding famous celeberties to porn scenes" Ahh porn, always innovating.

    • @soda_crackerr
      @soda_crackerr 3 роки тому +15

      True
      *cri* ✊😌

    • @TheMaster4534
      @TheMaster4534 3 роки тому +24

      The Russians have a word for that.
      Компрометирующий материал ("compromising material"), or компромат (kompromat) for short.

    • @denierdev9723
      @denierdev9723 3 роки тому +105

      Always defiling and immoral, too.

    • @jordanmendoza812
      @jordanmendoza812 3 роки тому +6

      @@denierdev9723 your name checks out

    • @denierdev9723
      @denierdev9723 3 роки тому +7

      @@jordanmendoza812 ?

  • @keelo-byte
    @keelo-byte 5 років тому +524

    Forget the fake celebrity porn and political tapes, this technology should be used for only one thing... *remixing old school kung-fu movies.*

    • @caralho5237
      @caralho5237 5 років тому +44

      I imagine Bruce Lee dabbing and doing fortnite dances. Scary.

    • @xouslic742
      @xouslic742 5 років тому +6

      you mean remaster

    • @keelo-byte
      @keelo-byte 5 років тому +7

      @@xouslic742 no I meant remix. Sort of like "kungpow: enter the fist"

    • @Motorata661
      @Motorata661 5 років тому +7

      Bruce Lee.
      Jackie Chan.
      Jet Li.
      Donnie Yen.
      Kung-Fu Battle Royale: The Movie

    • @pieterdejager7805
      @pieterdejager7805 4 роки тому

      Bwahahaha...now ure talking!....

  • @say12033
    @say12033 Рік тому +3

    It's 2023 and now everyone can make deep fakes on their phone

  • @_.1447
    @_.1447 Рік тому +1

    This was four years ago. Imagine the current potential...

  • @aguywithsubs8956
    @aguywithsubs8956 5 років тому +2927

    The porn industry is evolving get ready for VR porn

    • @andatop
      @andatop 5 років тому +494

      Vr porn has been a thing for a decade

    • @brandonontama2415
      @brandonontama2415 5 років тому +150

      It will get worse; soon it will be anime and video game characters. And then it will be a virtual reality where you can actually... things are really getting weird.

    • @florianp4627
      @florianp4627 5 років тому +39

      It has existed for a few years now, ever since the Oculus Rift dev kit initially came out

    • @brandonontama2415
      @brandonontama2415 5 років тому +6

      @@moogreal Crap...

    • @dirtiestharry6551
      @dirtiestharry6551 5 років тому +3

      I want subs ready player porn

  • @samswich1493
    @samswich1493 3 роки тому +90

    2018: deep fakes are very realistic and dangerous
    2020: truck sings dame da ne

  • @fakintru9398
    @fakintru9398 Рік тому +6

    2023 AI: LOL

  • @skeetermcswagger0U812
    @skeetermcswagger0U812 Рік тому +32

    Ever since I became aware of this technology, I got a really uncomfortable feeling about it.
    I knew it could be one of those technological 'superpowers' that wouldn't be safe if there was not a clear and adequate way to penalize how it could be used.
    It is, in a way, an ability to pirate and clone some forms of reality. Although there still seem to be some perceivable characteristics during the early stages of its development that may be obvious to many, and not just those with a 'trained eye', who knows at what point it's going to be capable of fakes indistinguishable to most if not all viewers? Do they 'have to' disclose this information?
    Who knows if some of the examples of flaws that are more readily obvious aren't just being used to distract viewers from the more capable versions of this technology already?
    Great... now I sound like a crazy person even to myself!🤦‍♂️

    • @aliceslab
      @aliceslab Рік тому +8

      its not crazy, the future will get more complex, and it is harder to control complexity than the simplicity of our origins.

    • @ES11777
      @ES11777 6 місяців тому

      No, you are just smart and looking at it from all angles.

  • @shawnli4746
    @shawnli4746 5 років тому +339

    If this technology evolves, get ready for the dystopia that Orwell predicted, and be ruled by faceless individuals...

    • @CriticalRoleHighlights
      @CriticalRoleHighlights 5 років тому +16

      This could be something a dystopian government uses when the masses wouldn't know any better _after_ a dystopia has occurred by other means, but dystopia will never occur because of it.

    • @lilahdog568
      @lilahdog568 5 років тому +20

      CRH our government could begin going after individuals simply by creating evidence in the form of deep fake videos

    • @nefelibata4190
      @nefelibata4190 5 років тому +3

      what is the point of the videos if you can't tell what is fake and what is not?
      you would need an expert on the case who is somehow being monitored by another expert and several other people, who have the best or worst intentions for humankind.

    • @michaelwatts5139
      @michaelwatts5139 5 років тому +19

      @@nefelibata4190 we already have people faking their gender

    • @mutanazublond4391
      @mutanazublond4391 4 роки тому +1

      It has evolved, are you stupid, 90 percent of all actors used are non existant with fake backgrounds ... all of the documentaries are fake people ... etc etc

  • @jose-gr7jg
    @jose-gr7jg 5 років тому +135

    So Stan Lee will be able to do all the cameos??

  • @Angie-lp2hk
    @Angie-lp2hk 2 роки тому +2

    now this is terrifying

  • @emptyhad2571
    @emptyhad2571 Рік тому +2

    The age of AI has begun and it won’t stop

  • @LuxAeterna22878
    @LuxAeterna22878 4 роки тому +43

    This is terrifying. One can only hope that equally ingenious methods of security will protect humanity against such powerful tools of deception.

    • @lil_weasel219
      @lil_weasel219 2 роки тому

      yes that security is certainly uhm "impartial" eh and would never itself propagate similar things?

    • @braindavidgilbert3147
      @braindavidgilbert3147 Рік тому

      I mean we talked the same way about editing at first. Look how it is now😊

  • @cardorichard4148
    @cardorichard4148 5 років тому +614

    Trumps new favorite phrase, “Deep fake news.” 😂

    • @ggsay1687
      @ggsay1687 5 років тому +4

      It would be hard to deny if someone put his fase on insane person shouting insults on squer.

    • @leonscottkennedy3143
      @leonscottkennedy3143 5 років тому +1

      GG SAY *face

    • @ggsay1687
      @ggsay1687 5 років тому

      you missed the "squer", I think I was drunk

    • @L7vanmatre
      @L7vanmatre 5 років тому

      TRUMP LOL HAHA

    • @jojothermidor
      @jojothermidor 5 років тому

      You're pathetic.

  • @raulgalets
    @raulgalets Рік тому +3

    this is 4 years ago

  • @beepduck
    @beepduck Рік тому +2

    bro you can tell so easily they're fake, no one moves their face like that

  • @altaica3522
    @altaica3522 3 роки тому +66

    People are making fun of this video, but it's only a matter of time till someone uses this for malicious purposes.

    • @ljoxleyofficial8119
      @ljoxleyofficial8119 3 роки тому +11

      They already are

    • @ljoxleyofficial8119
      @ljoxleyofficial8119 3 роки тому

      Vincent DiPaolo what does this all mean?

    • @yimmy7160
      @yimmy7160 3 роки тому +1

      You act like this is new and hasn't been done. This "tech" has been around for a while actually

    • @mirjanapucarevic2105
      @mirjanapucarevic2105 3 роки тому +1

      It is very scary; how many lives will be destroyed?!

  • @luciferexperiment8553
    @luciferexperiment8553 4 роки тому +144

    If this stuff has become public, they've been using it for years...

  • @biomuseum6645
    @biomuseum6645 Рік тому +1

    Why is it that these creepy technologies always get romanticized by telling people it will help small creators?
    Wasn’t that what they told us when they removed the dislike button? To help small creators?
    Don’t small creators adapt creatively to limitations and become more creative in the process instead of having all their whims on a silver plate?

  • @zyurxi7307
    @zyurxi7307 2 роки тому +1

    The fact you can make national threats and make it seem as though it was someone else is truly scary.

  • @zuko1569
    @zuko1569 5 років тому +623

    Shapeshifting reptilians want to know your location

    • @joshuakoh1291
      @joshuakoh1291 5 років тому +5

      Zuzu "That's fucking rough buddy for me"

    • @kat_867
      @kat_867 5 років тому +4

      One's in the White House

    • @pastelxenon
      @pastelxenon 5 років тому +2

      @@kat_867 if youre an idiot

    • @kat_867
      @kat_867 5 років тому +1

      Pastel Xenon if? Lmao what 😂 just go away.

    • @sofialaya596
      @sofialaya596 5 років тому +1

      lmao

  • @AngryGoose
    @AngryGoose 3 роки тому +305

    DeepFakes:possibly dangerous
    Everyone: HAHA yanderedev go damedane

  • @e8tballz
    @e8tballz 3 роки тому +7

    I'm sure this will be huge w/ scammers in a few yrs. They'll use this to FaceTime someone's grandparents and say they're in trouble and need cash or something ridiculous. Inevitable but sad.

    • @mayeighteen2812
      @mayeighteen2812 3 роки тому +2

      Yes, or our children being subjected to deepfake porn or other degrading/defamatory videos made by their own classmates as a form of bullying... ruining their futures and sense of self or reality. 😔

  • @thefirebeanie5481
    @thefirebeanie5481 Рік тому +7

    Well this was always inevitable
    This aged like fine wine

  • @HeavymetalHylian
    @HeavymetalHylian 5 років тому +177

    Spread the word. This needs to be on trending.

    • @yourneighbour5738
      @yourneighbour5738 5 років тому +13

      Yes we need more Nicholas Cage movies

    • @plsdontreplytomewitharmy5926
      @plsdontreplytomewitharmy5926 5 років тому +15

      that would inspire more people to do it then :/

    • @reecherdbrown8156
      @reecherdbrown8156 5 років тому +4

      The word's been spread and we've all stayed asleep

    • @justsomeguys1121
      @justsomeguys1121 5 років тому +1

      HoneyedHylian trending videos are hand-picked by YouTube staff

    • @littleme3597
      @littleme3597 Рік тому

      They could keep dead people alive. LIKE biden and that old hag. S.C. person. Make it appear, she is still alive and speaking.

  • @user-uj4ip2pt6h
    @user-uj4ip2pt6h 5 років тому +153

    we humans put technology development first and common sense second.

    • @muhdelyas-abgyas562
      @muhdelyas-abgyas562 5 років тому +23

      Realistic porn first and common sense second

    • @PasscodeAdvance
      @PasscodeAdvance 5 років тому

      Aliens are butter than us (or salter)

    • @realdeal5712
      @realdeal5712 5 років тому

      myownname myownlastname it is common sense to have porn video with your crush face on it

    • @SuperDanielHUN
      @SuperDanielHUN 5 років тому +5

      Even if 99.9% of the planet prefers sense, there is always that one guy who opposes it and creates a breakthrough as a result (sometimes). Galileo was considered completely insane for stating the earth is round at the time, and any person with "common sense" would say not to do it, because he'd be killed by the church and because the Bible already says it's flat. Technology both helps and punishes humans, often in unexpected ways: Alfred Nobel wanted to help miners with dynamite, and he created a weapon of mass murder by accident. Common sense is neither universal nor definite; it's rather technology and social changes that twist what's considered "common sense".

    • @reyxus9454
      @reyxus9454 5 років тому +5

      @@SuperDanielHUN "the bible already says it's flat" wtf are you on about

  • @harperwelch5147
    @harperwelch5147 3 роки тому +4

    This is really scary. It opens doors that are a Pandora’s box of trouble. I am sad that the creators don’t care what their work will very obviously lead to. This is no video game. It will be weaponized.

  • @getpriyanka
    @getpriyanka 3 роки тому +9

    Governments fear the advanced technology of the world may fall into wrong hands, but the only reason we want deepfakes is to get memes

  • @confusedwhale
    @confusedwhale 5 років тому +39

    It's true that it's getting harder to tell, but there is still something wrong with the robot face images.
    Long live the uncanny valley.

    • @themelonn6313
      @themelonn6313 5 років тому

      confusedwhale wow this will actually help us combat this lol. Imagine an expert.

    • @David-gp3fd
      @David-gp3fd Рік тому

      nah, this is a foolish, short-sighted perspective... the human eye has its limits and would have to evolve to keep up enough to tell. Unfortunately tech is evolving way faster than humans

  • @vao5399
    @vao5399 5 років тому +158

    Honestly I feel like this is saying that this is some problem that's going to be hard to control, but it won't be. The scariest part about this is that the bigger news sources are getting desperate and lazy, so they won't fact-check this when it pops up.

    • @miamarie5426
      @miamarie5426 5 років тому +9

      Simon WoodburyForget how do we authenticate videos when people lie, audio can be faked/cut/manipulated, and pictures can obviously be photoshopped

    • @SuperPhunThyme9
      @SuperPhunThyme9 4 роки тому

      @@miamarie5426 a deepfake leaves specific, pixel-level traces that Bloomberg here "forgot" to mention...
      ...but the audio is absolutely full of clues that no deepfake can come close to fixing.

    • @tony_5156
      @tony_5156 4 роки тому

      We have a big UN meeting and outright ban it, with punishment high and tough.
      Jail, yup, no bail for you buddy, you're going straight to jail.

    • @honkhonk8009
      @honkhonk8009 4 роки тому

      True. In courts, this wont even be an issue, but with our already lazy and retarded media, their gonna not even fact check and their gonna treat it like proof. Wont be the first time

    • @shannonjaensch3705
      @shannonjaensch3705 Рік тому

      Even sadder is that most never fact-check the news they hear or anything they are told by anyone. Lazy brains that are just consumed with trying to stay alive and in a state of low-consciousness physical comfort.
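
One reply in this thread mentions pixel-level traces. A whole family of detectors does look for the unusual high-frequency residue that generator upsampling tends to leave in synthesized frames. The sketch below is only a toy illustration of that idea: a single spectral statistic with an arbitrary threshold, assuming frames arrive as 2-D float grayscale arrays. Real detectors are trained classifiers, not one hand-picked number.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def high_frequency_energy(frame):
    """Fraction of spectral energy far from the center of the residual's 2-D spectrum."""
    residual = frame - gaussian_filter(frame, sigma=2)             # keep only fine detail
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(residual))) ** 2
    h, w = spectrum.shape
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - h / 2, xx - w / 2)                      # distance from the DC component
    outer = spectrum[radius > min(h, w) / 4].sum()
    return outer / (spectrum.sum() + 1e-12)

def looks_synthetic(frame, threshold=0.35):
    """Illustrative threshold only; any real use would calibrate it on labeled data."""
    return high_frequency_energy(frame) > threshold

frame = np.random.rand(128, 128)  # stand-in for a grayscale face crop from a video frame
print(high_frequency_energy(frame), looks_synthetic(frame))
```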

  • @pamelaia
    @pamelaia Рік тому +1

    The consequences for framing other people with deepfakes with the intention of harming them should be serious. Very serious; it's so cruel

  • @sharilyn8262
    @sharilyn8262 Рік тому

    Getting that alibi out before the evidence is gold. Divide the people with more confusion.

  • @samonterolanjayp.8229
    @samonterolanjayp.8229 3 роки тому +124

    Plot twist: The expert they interviewed is a deepfake edit

  • @mrmomokar
    @mrmomokar 5 років тому +11

    This is scary. Somehow, I really feel uneasy and disgusted when I see one because it looks really off, artificial yet it’s really convincing.

  • @vogahl34
    @vogahl34 6 місяців тому +1

    Soon the justice system won't be able to use CCTV to incriminate offenders; soon any evidence will be questioned, and we'll be left with an utterly defenceless society.

  • @edeliteedelite1961
    @edeliteedelite1961 2 роки тому +4

    Would be very convenient for them to target Putin right now.

  • @kylecollins5463
    @kylecollins5463 4 роки тому +86

    Just imagine a world where a deepfake can be shared millions of times showing a leader saying he's pushed the nuclear button

    • @hwlz9028
      @hwlz9028 4 роки тому

      Yep

    • @sorrymyenglishbad2535
      @sorrymyenglishbad2535 4 роки тому +5

      Gotta be more subtle for more chaos.

    • @youtubespy9473
      @youtubespy9473 3 роки тому +4

      Lol, why would anybody blow up their own world unless they were suicidal.

    • @pit2992
      @pit2992 3 роки тому +2

      @@youtubespy9473 There are people who have nothing to lose.

    • @youtubespy9473
      @youtubespy9473 3 роки тому

      @@pit2992 I said that "suicidal"

  • @awesome117unsc
    @awesome117unsc 5 років тому +218

    Making memes danker than ever.

  • @masterpetrocy7111
    @masterpetrocy7111 Рік тому +1

    This was four years ago! Bet it's way better now

  • @Terrox38
    @Terrox38 3 роки тому +3

    Remember that image experts can detect all of these techniques by analyzing them. If you make fakes and pass them off as real, and you spread the fake without specifying that it is a fake, you risk serious legal trouble.

  • @derekg5006
    @derekg5006 3 роки тому +38

    2018: Deepfakes are dangerous!
    2020: JFK talks about Rick and Morty

  • @JeremyBX
    @JeremyBX 3 роки тому +17

    “Fake news on steroids”
    I really like that summarization

  • @nigachad4031
    @nigachad4031 Рік тому +1

    voice actors are gonna go wild with this kind of technology

  • @eileanvm
    @eileanvm 3 роки тому +1

    Human beings need to start interacting face to face again. Imagine that?

  • @haleyanne447
    @haleyanne447 5 років тому +108

    Who else didn’t read the title and thought the thumbnail was a spot the difference lmao

  • @raoulfr
    @raoulfr 5 років тому +1151

    This technology comes from porn...what is humanity evolving into 😂!?

    • @jaliborc
      @jaliborc 5 років тому +129

      It doesn't come from porn. It comes from academia. The industry is using it after it was developed in academia for general purposes.

    • @beamboy14526
      @beamboy14526 5 років тому +8

      evolving to create a direct brain-to-computer porn indistinguishable from reality

    • @WaitingForStorm
      @WaitingForStorm 5 років тому +45

      porn is one of the biggest industries on the planet

    • @goforit7774
      @goforit7774 5 років тому +2

      porn will be banned like prostitution

    • @Lucky8s
      @Lucky8s 5 років тому +33

      @@goforit7774
      Banned by who exactly?

  • @memberofthelambily1340
    @memberofthelambily1340 Рік тому +1

    And it’s gotten so much better since this

  • @joshuagulbrandson9397
    @joshuagulbrandson9397 3 роки тому +1

    "This advance is not yet available to the public."
    I don't think that's up for them to decide.