Can Digital Signatures Stop Deepfakes?

  • Published 12 Feb 2024
  • In this video I discuss how the White House is looking for a way to cryptographically verify videos of Joe Biden so viewers don't mistake them for deepfakes, what technology could already be used for this, and how it will probably not do much to stop people from believing in deepfakes.
    My merch is available at
    based.win/
    Subscribe to me on Odysee.com
    odysee.com/@AlphaNerd:8
    ₿💰💵💲Help Support the Channel by Donating Crypto💲💵💰₿
    Monero
    45F2bNHVcRzXVBsvZ5giyvKGAgm6LFhMsjUUVPTEtdgJJ5SNyxzSNUmFSBR5qCCWLpjiUjYMkmZoX9b3cChNjvxR7kvh436
    Bitcoin
    3MMKHXPQrGHEsmdHaAGD59FWhKFGeUsAxV
    Ethereum
    0xeA4DA3F9BAb091Eb86921CA6E41712438f4E5079
    Litecoin
    MBfrxLJMuw26hbVi2MjCVDFkkExz8rYvUF
  • Science & Technology

COMMENTS • 417

  • @AiMR
    @AiMR 2 місяці тому +343

    The real problem isn't deepfakes, it is people like Trump and Biden saying such dumb shit in RL that we can no longer detect what is satire and what isn't.

    • @bobSeigar
      @bobSeigar 2 місяці тому

      This is what I keep trying to point out. It doesn't matter if deepfakes are good; they are now an available scapegoat for politicians to do even more dumb shit.

    • @Stripedspot
      @Stripedspot 2 місяці тому +6

      you can hear the DF/Ai in their voice, usually it's like an echo while they are talking

    • @youtubehandol
      @youtubehandol 2 місяці тому +3

      poe's law in real life

    • @scottpageusmc
      @scottpageusmc 2 місяці тому +17

      The real problem is people taking someone else's word as fact, especially without doing their own research and thinking critically. Sheeple are the problem.

    • @aladdin8623
      @aladdin8623 2 місяці тому

      A.I. is completely overhyped by society. If I tasked an A.I. with checking whether a photoshopped image was fake or not, it couldn't. It would have to be able to produce Photoshop fakes at the quality of the best human artists before it could check for them. But it can't.

  • @Skelterbane69
    @Skelterbane69 2 місяці тому +318

    Ghost videos are gonna be so crisp now

  • @AncientSlugThrower
    @AncientSlugThrower 2 місяці тому +404

    It would be pointless to enforce it on a per-user basis, but PGP authentication could be useful for 'official' news sources. What will turn bad is when there is enforcement against the distribution of non-verified images. Citizen journalism will die with something like that.

    • @gizka6816
      @gizka6816 2 місяці тому +29

      youtube led the charge against independent journalism way back when trump first got elected

    • @Korodarn
      @Korodarn 2 місяці тому +38

      I think PGP authentication is legitimately a great idea, and I agree, there is no way it's good if centralized sources get to decide what's reported. Let people learn about the tools for validation or seek the services of another.
      But mostly, people need to stop thinking of the internet as "real" because in many ways it never has been.

    • @bryce-bryce
      @bryce-bryce 2 місяці тому +2

      Their is? Seriously?

    • @AncientSlugThrower
      @AncientSlugThrower 2 місяці тому +10

      @@bryce-bryce I don't have Grammarly installed on my phone's YouTube app. I appreciate the heads-up.

    • @igordasunddas3377
      @igordasunddas3377 2 місяці тому

      Official "news sources" are biased anyway more often than not... So what?
      I doubt digital signatures will help all that much TBH.
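
As a concrete note on the PGP idea in this thread: a "signed official media" scheme boils down to an outlet publishing a public key once and signing each file it releases. Below is a minimal sketch, assuming the Python cryptography package and made-up file names; it is an illustration of the technique, not the White House's actual plan.

```python
# Minimal sketch: an outlet signs a video file, a viewer verifies it.
# Requires the "cryptography" package; key handling and file names are illustrative.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Outlet side: generate a keypair once and publish the public key out of band.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

video = open("press_conference.mp4", "rb").read()
signature = private_key.sign(video)          # distributed alongside the file

# Viewer side: verification fails for any altered or re-encoded copy.
try:
    public_key.verify(signature, video)
    print("Signature valid: these are exactly the bytes the outlet published.")
except InvalidSignature:
    print("Signature invalid: altered file, or not signed with this key.")
```

Note that this only authenticates the source and the exact bytes; it says nothing about whether the content itself is truthful, which is the limit several commenters point out.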

  • @thetoasterisonfire2080
    @thetoasterisonfire2080 2 місяці тому +155

    Long story short, you can’t cure stupid.

    • @qlippoth13
      @qlippoth13 2 місяці тому +10

      400 years of government seems like a choice.

    • @thetoasterisonfire2080
      @thetoasterisonfire2080 2 місяці тому +3

      @@qlippoth13
      Unfortunately the stupid people also seem to be the ones that end up in the government somehow.

    • @TheCuteZombie
      @TheCuteZombie 2 місяці тому +1

      @@thetoasterisonfire2080 That is what people don't seem to understand: smart people are not the majority. In a system where the majority chooses who holds power, there is obviously a high chance that the position goes to someone like the majority: an idiot.

    • @qlippoth13
      @qlippoth13 2 місяці тому

      @@thetoasterisonfire2080 Yuri Bezmenov called them political prostitutes and in the same breath outlined their retirement programme.

  • @linuxization4205
    @linuxization4205 2 місяці тому +219

    Well, more surveillance for the government!

  • @HigherLevelServices
    @HigherLevelServices 2 місяці тому +56

    As with all security measures, humans are the weak point

    • @etrestre9403
      @etrestre9403 2 місяці тому

      I see...

    • @konayasai
      @konayasai 2 місяці тому +1

      We ought to take the men out of the loop.

  • @Nexus9_KD6-4.8
    @Nexus9_KD6-4.8 2 місяці тому +158

    It won't stop deepfakes by any means, but public key cryptography could provide a way of authenticating a piece of media from official sources vs unauthenticated media in the wild.

    • @Schmoogie
      @Schmoogie 2 місяці тому +21

      Seems like that alone would be useful enough for official purposes I suppose, assuming that normies can figure out how to verify it themselves.

    • @hanelyp1
      @hanelyp1 2 місяці тому +28

      Part of the current problem is official sources lie. And sources in the wild can be reliable. Authenticating the source, whatever it may be, is still useful.

    • @Ghost_Text
      @Ghost_Text 2 місяці тому +3

      Independent media would have to get in on it early then or risk being dismissed as bot content

    • @3rdHalf1
      @3rdHalf1 2 місяці тому

      I fail to see the point. Most "independent" media use legacy media as their source, and all they do is commentary.
      Here is an example of how it would fail: someone rips a clip of Jo Biodome's speech from the BBC YouTube channel and re-uploads it to TukTuk. In the process the video gets cropped to portrait mode and color corrected, and re-encoding strips the EXIF data. TukTuk does its crypto-key check and puts a notification on the video saying it is not verified. This happens to one video, then another, and another… and after a week people tune out the notification, like ads. The result is that you still can't tell the difference between videos that have merely been edited (fair use) and actual deepfakes. (See the sketch after this thread.)

    • @Ghost_Text
      @Ghost_Text 2 місяці тому

      @@3rdHalf1 sometimes that also includes direct evidence from average people who just happen to have their cell phone or camera on. Like the recent CCP problems that our resident boogie-woogie pianist Brendan Kavanaugh is currently going through.
      If the enshittification of the internet leaves citizen journalism in doubt, or subject to scrutiny by "trusted sources" captured by special interests, then honest, grassroots, damning info that can't be independently verified would be dismissed as bot content even if true.
      Suddenly we have a ministry of truth, everyone sees the emperor's "clothes", and they are free to flag with impunity.
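
The re-upload problem raised in this thread is inherent to plain file signatures: they cover exact bytes, so any crop, color correction, or re-encode yields a different digest and the original signature no longer verifies. A tiny illustration with SHA-256; the file names are hypothetical.

```python
import hashlib

original = open("bbc_clip.mp4", "rb").read()            # the clip as the outlet signed it
reencoded = open("tuktuk_reupload.mp4", "rb").read()    # cropped + recompressed copy

# The digests differ, so a signature over the original bytes cannot vouch
# for the re-upload; someone would have to re-sign the derivative.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(reencoded).hexdigest())
```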

  • @ninjartist36
    @ninjartist36 2 місяці тому +43

    "How dare they spread misinformation, that's my job." - The govt probably

  • @DMSBrian24
    @DMSBrian24 2 місяці тому +47

    If you want to use them, sure. But if you send someone nudes or someone secretly records someone else, they probably wouldn't wanna digitally sign that kinda stuff to ensure its authenticity. So when a compromising deepfake appears, even if digital signatures are normalized (and as long as they're not mandated which is definitely something we don't want), there will be no way to tell, at least using similar methods.

    • @wrathofainz
      @wrathofainz 2 місяці тому +6

      Just claim that all of your nudes are faked :D
      I guess with this sort of verification it's only yours if you claim it.

    • @kayokake1952
      @kayokake1952 2 місяці тому +9

      ​@@wrathofainzThat's the point, if all of them are fake then all of them are real

    • @DonVigaDeFierro
      @DonVigaDeFierro 2 місяці тому +1

      There are going to be rules like when admitting witness testimonies: There have to be multiple unrelated sources, and all of them have to be in agreement.
      Even then, you'll get a "maybe it's real" at best.

    • @antman7673
      @antman7673 2 місяці тому

      Only deepfakes are verified, that show vitality.

    • @Entropy67
      @Entropy67 2 місяці тому

      ​@@kayokake1952and none of it matters because its the same for every imaginable person

  • @jameshughes3014
    @jameshughes3014 2 місяці тому +39

    If fake photos and videos get better, the result will be that no one implicitly trusts what they see online. People will be forced to actually think for themselves.
    I see that as a good thing.

    • @infinite1483
      @infinite1483 2 місяці тому +11

      You're too optimistic lmao

    • @jameshughes3014
      @jameshughes3014 2 місяці тому +3

      @@infinite1483 yeah, you're probably right. People are really good at figuring out ways to avoid thinking.

  • @DirtyPlumbus
    @DirtyPlumbus 2 місяці тому +6

    It's easier to fool someone than convince them they've been fooled.

  • @morgwai667
    @morgwai667 2 місяці тому +16

    this is the first video on this channel i downvoted:
    - pgp/gpg is NOT used to identify websites. x.509 is. that's a totally different model (web of trust vs CAs; see the sketch after this thread)
    - once content is signed, verification can be done automatically by software (web browsers, social media apps) that can display warnings if the verification fails. The problem is who would take on the burden of signing every photo and every video/audio clip that every news agency produces of the president (or any other public official for that matter), and how that entity would decide whether a given photo/video/audio is authentic.

    • @midn8dreams
      @midn8dreams 2 місяці тому +2

      it also isn't that easy to decrypt it.. tried it for emails and I struggle atm.. trying to decrypt it on another email on a different client.. still trying to get this public key somehow
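
For contrast with PGP's web of trust, here is a rough sketch of the CA model mentioned above: a leaf certificate is trusted because a CA that ships in the root store signed it. This assumes RSA-signed PEM certificates, and the file names are made up.

```python
# Rough sketch of the X.509/CA model, as opposed to PGP's web of trust.
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import padding

leaf = x509.load_pem_x509_certificate(open("newsroom_leaf.pem", "rb").read())
issuer = x509.load_pem_x509_certificate(open("ca_root.pem", "rb").read())

# Raises an exception unless the CA's key really produced the leaf's signature.
issuer.public_key().verify(
    leaf.signature,
    leaf.tbs_certificate_bytes,
    padding.PKCS1v15(),
    leaf.signature_hash_algorithm,
)
print("leaf was signed by this CA; trust flows from whoever ships the root store")
```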

  • @wrathofainz
    @wrathofainz 2 місяці тому +10

    This reminds me of SSL, seeing the error "this host has a self-signed certificate".
    You have to pay money to have some site verify that your website is legit. Seems similar.

    • @konayasai
      @konayasai 2 місяці тому

      Have you lived under a rock for the past twelve years? Let's Encrypt is completely free.

  • @ashishpatel350
    @ashishpatel350 2 місяці тому +17

    deep fakes of deep fakes

  • @BitCloud047
    @BitCloud047 2 місяці тому +52

    We have entered Cyberpunk levels of dystopia

    • @reggie6230
      @reggie6230 2 місяці тому

      Hope you like getting chipped chummer. We'll have SINs by 2040 I think.

    • @fus132
      @fus132 2 місяці тому +8

      Cyberpunk (since) 2020

    • @OdinSonne
      @OdinSonne 2 місяці тому +7

      Sure, but none of the cool shit: No implants, cyborg parts, hovercars, a real digiverse that matches/exceeds meatspace etc. No Arasaka 'Ghostkiller' software though please!

    • @crazycoffee
      @crazycoffee 2 місяці тому +3

      ​​@@OdinSonneYou know they've probably been working on that first for a long time

    • @qlippoth13
      @qlippoth13 2 місяці тому +3

      We entered those levels of dystopia when people were unable to tell the difference between Aldous Huxley's work and his brother Sir Julian Sorell Huxley's "great work" still unfolding before our eyes today.

  • @jangamaster8677
    @jangamaster8677 2 місяці тому +15

    AI gets better everyday. The latest models have gotten pretty good at hands and faces.

  • @Heizenberg32
    @Heizenberg32 2 місяці тому

    1:09 I lol'd at this image coming up when you were talking about how difficult to detect deep fakes can be!

  • @waterpotato1667
    @waterpotato1667 2 місяці тому +5

    Literally that Metal Gear Solid AI meme.

  • @evgenysavelev837
    @evgenysavelev837 2 місяці тому +13

    Well, this is actually how I thought this problem would be solved.
    Just have all video editors and video cameras offer an option to enter your digital private key so that each frame can be digitally signed.
    Then the video will be trusted as much as the person holding the private key.
    Manufacturers can put in a chip with a hardware-backed private key so that it cannot be extracted (similar to Apple's T2 chip or a Trusted Platform Module). (See the sketch after this thread.)
    Private keys are easy to generate and can be generated by everyone, so there would be no surveillance or tracing of a key back to its original issuer unless they choose to make themselves public.

    • @evgenysavelev837
      @evgenysavelev837 2 місяці тому +4

      And if anyone wants to release a recording of someone, they can just release the raw footage digitally signed by the manufacturer's key.
      Manufacturers won't put a unique key in each device, because they would have to publish the public key for signature verification, and that would defeat the purpose.
      So the only thing a whistleblower would reveal is the device the recording was made with, which might be more than some would be willing to reveal, but it would not say much if the video was recorded on a Samsung or Apple device.
      When there is a fake, the forger would never be able to sign the frames with the same key, so it would be immediately apparent that the footage was processed by a third party.

    • @vinski_
      @vinski_ 2 місяці тому

      @@evgenysavelev837they wouldn’t even need to release the whole video if each frame is signed? You would only need to release that single frame. Idk what if any video codecs allow trimming like that without having to reencode. Could even bundle the frames together in a container with the final video.

    • @evgenysavelev837
      @evgenysavelev837 2 місяці тому

      @@vinski_ You do have a point. Video codecs are non-trivial, I agree. They have key frames and "update" (delta) frames, and they store highly compressed data. It is impossible to cut the video between key frames without re-encoding.
      That is not a serious limitation, though. Key frames are usually a few seconds apart, and it is trivial to cut the video right at a key frame and repackage it in a container without re-encoding.

    • @Alice_Fumo
      @Alice_Fumo 2 місяці тому

      How would this solve the issue of:
      I make Joe Biden deepfake video.
      I open it in my media player and fullscreen it.
      I take out my camera and film the video playback, which is going to sign it as a real recording by that camera.
      I claim this recording is indeed the real Joe Biden saying some outrageous stuff.
      This doesn't really solve the issue of being able to identify deepfakes as such. It only makes it so that you can trust the source. Let's say hypothetically that literally everything on the internet was signed by its respective sources, you'd still run into the issue of the trustworthiness of the source being unknown.
      Well, actually, people could set up things where keys could be rated, so if your image viewer shows you the ratings for the key associated with that image by default, you would be able to see when many thousands trust a key, when a key has like no ratings at all, or when it has a lot of ratings but many are negative.
      ANYHOW, it'd still not give us a way of identifying deepfakes as such / distinguish between reality and fiction. It's all just trust-me-bro based.
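
Here is a toy sketch of the in-camera signing idea from this thread, with the caveat raised in the last reply (filming a screen defeats it): a device key signs a digest of each frame, so any third-party edit shows up as a broken signature. The frame data and key handling are faked; a real camera would keep the key in a secure element.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

device_key = Ed25519PrivateKey.generate()   # would live in a secure element / TPM, not in software

def sign_frames(frames: list[bytes]) -> list[bytes]:
    """Sign a SHA-256 digest of every frame with the device key."""
    return [device_key.sign(hashlib.sha256(frame).digest()) for frame in frames]

frames = [b"frame-0-pixels", b"frame-1-pixels"]   # placeholder frame data
signatures = sign_frames(frames)

# A verifier holding the device's public key can check any single frame,
# which is why a clip cut on frame boundaries could still be authenticated.
device_key.public_key().verify(signatures[0], hashlib.sha256(frames[0]).digest())
print("frame 0 verified")
```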

  • @totally.normal
    @totally.normal 2 місяці тому +112

    Can't believe we're in an age where we now have to confirm if a speech from the President is even real or not, this seems so dystopian.

    • @reggie6230
      @reggie6230 2 місяці тому +38

      Just like they love it. More chaos, more smoke to play in.

    • @SzaboB33
      @SzaboB33 2 місяці тому +21

      Yea, before this we only needed to confirm if they said the truth or not

    • @user-by6fp4ov3k
      @user-by6fp4ov3k 2 місяці тому

      ​@@reggie6230can we stop with the schizo-talk, thanks. Biden could take a shit and someone would call it a jewish-sponsored ploy to destroy America's sewage system

    • @MafiosoDon21
      @MafiosoDon21 2 місяці тому

      The rules of the game have changed

    • @rustymustard7798
      @rustymustard7798 2 місяці тому

      Bruh, at this point they could hang Biden's skinsuit on one of those robot dogs like some low budget Weekend at Bernie's clone and these NPCs will still not be able to tell the difference. Or maybe the REAL test is if he's speaking coherently it's fake. Pretty much same for the other guy, make an ill informed pumpkin shout racist things and you've got yourself a 'leader' (of your respective cult).

  • @jer1776
    @jer1776 2 місяці тому +7

    Who's the certificate authority going to be? Microsoft? Google? I already don't trust them

    • @stephenthumb2912
      @stephenthumb2912 2 місяці тому +1

      It's likely the same people that validate website domains. ICANN, most likely.

    • @jer1776
      @jer1776 2 місяці тому

      @stephenthumb2912 I'd hope so, it definitely needs to be an independent entity, but even they wouldn't be immune to outside influence

    • @stephenthumb2912
      @stephenthumb2912 2 місяці тому

      @@jer1776 agreed, ICANN most definitely is not impartial and is subject to influences with likely conflicts of interest.

    • @rudeandconfused
      @rudeandconfused Місяць тому

      @@jer1776 That would likely be C2PA, an independent, non-profit alliance between Adobe, Arm, Intel, Microsoft and Truepic, publishing v1.0 of the world's first industry standard for content authenticity to fight misinformation and a lot of other nonsensical buzzwords.
      Google is probably going to create its own standard, given that they let OpenAI train Sora on YouTube videos in a pretty shady way.

  • @ElectrochemicalMusic
    @ElectrochemicalMusic 2 місяці тому +2

    The concept could work with an identity-centric blockchain like Accumulate. Each entity involved gets its own cryptographic identity, from the camera that records the footage to the editing software to the publisher. It's not there yet but I can see the vision

  • @AndersHass
    @AndersHass 2 місяці тому

    You can also use a generated image as the baseline for further editing in GIMP, Photoshop, etc., which makes it quicker to produce than starting from scratch while being about as hard to notice as a fake.

  • @Alice_Fumo
    @Alice_Fumo 2 місяці тому +2

    Widespread PGP adoption would be great. I ran an experiment for a while where I had a website on which every page was signed by me, as well as any piece of media or statement associated with that identity, such that anything unsigned was assumed not to be me; that gave me a way to verify myself while still being anonymous.
    Having any media posted by a reputable outlet signed is going to be a good thing. A much less useful approach, which is apparently also in the works, is building signatures into cameras directly, so that any picture taken with, let's say, an iPhone 18 would be verifiable as such. The issue I see with that is that one could always just photograph a monitor showing a generated image. Granted, it would raise the effort needed to make convincing "falsely signed images" (not sure what to call them), but I literally cannot think of any way to implement a system like that robustly.

  • @TehPwnerer
    @TehPwnerer 2 місяці тому +2

    You can usually tell pretty quickly whether an image is AI or not; there's this weird noise to it that comes from the diffusion process

  • @creeperkafasi
    @creeperkafasi 2 місяці тому +5

    This reminds me of the web integrity thing google worked on and backed out of a few months ago.

  • @WogueCompanySniper
    @WogueCompanySniper 2 місяці тому

    Been watching your videos recently.
    I was wondering what you thought of attacking through steganography? Or even multi-stage steganography?

  • @InuYasha-SitBoy
    @InuYasha-SitBoy 2 місяці тому +1

    on gimp theres the transform tool which has a “grow” option. best for enhancing tna of celebs. sometimes itll have stretchlines from it so ill just copy to krita, color picker tool the area, decrease opacity and airbrush. krita also has transform tool but no “grow”, though kritas liquify option works too but u get more stretching and has a tendency to warp unwanted areas if ur not careful.

  • @kirkkork
    @kirkkork 2 місяці тому +3

    You can't use logic to change someone's mind, when they didn't use logic to arrive at their position.
    Until something happens that actually forces them to use logic, they can't change.

  • @danutmh
    @danutmh 2 місяці тому +1

    The hands, the face, the seams of clothing, the hair, the jewellery and other fashion objects; clothing and skin usually start to meld together at some point.
    Another dead giveaway is the backgrounds.

  • @Kiirabu197
    @Kiirabu197 2 місяці тому +2

    "prompt engineer", my favourite future job
    /s

  • @emwave100
    @emwave100 2 місяці тому +1

    You could have it implemented in the browser, similar to the way browsers verify websites. So if a website is displaying a video, there could be a checkmark by the video saying that it really is from who it claims to be from and that it hasn't been tampered with.

    • @emwave100
      @emwave100 2 місяці тому

      That way it would be automatic and no one would have to do anything extra or learn anything new.

    • @emwave100
      @emwave100 2 місяці тому

      But in reality, I hope he loses the election because some dumbass wants to tamper with videos and images.

  • @journey8533
    @journey8533 2 місяці тому +1

    Embed every image with a repeating digital signature, directly in the pixel values, in such a way that it's still recognizable after cropping, tilting or painting over (to a certain extent).
    This is similar to the watermarks (filigranes) on banknotes and certain documents.
    With some online hygiene, every picture of a private individual or celebrity would carry either their own signature or some public source's.
    If a picture is released with a valid watermark, you can choose to trust it based on the owner.
    If Photoshop or machine learning is used to edit a picture or mash someone's face onto someone else's body, the mangled watermark should give hints to that... (see the toy sketch below)
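
For illustration only, here is a toy version of the "repeating mark in the pixel values" idea, written into the least significant bits with NumPy. Unlike the banknote-style watermark the comment imagines, this does not survive re-encoding or serious editing; robust watermarking is a much harder problem.

```python
# Toy LSB watermark: tile a bit pattern across the image and hide it in the
# least significant bit of every pixel. Fragile by design; illustration only.
import numpy as np

def embed(pixels: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Repeat the watermark bits across the image and write them into the LSBs."""
    flat = pixels.flatten()
    tiled = np.resize(bits, flat.shape)          # repeat the pattern to cover every pixel
    return ((flat & 0xFE) | tiled).reshape(pixels.shape)

def extract(pixels: np.ndarray, length: int) -> np.ndarray:
    return (pixels.flatten() & 1)[:length]

image = np.random.randint(0, 256, (4, 4), dtype=np.uint8)   # stand-in grayscale image
mark = np.array([1, 0, 1, 1, 0, 1, 0, 0], dtype=np.uint8)   # hypothetical owner-ID bits
stamped = embed(image, mark)
assert np.array_equal(extract(stamped, len(mark)), mark)
```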

  • @cannaroe1213
    @cannaroe1213 2 місяці тому +1

    "They can't even verify who their emails are coming from" had me rolling ahahahaha

  • @9a3eedi
    @9a3eedi Місяць тому

    I'd like to see stuff like dashcams and security cameras apply digital signatures internally in the camera automatically so that there's absolutely no doubt that footage is authentic.
    Or does that not make sense from a technical point of view?

  • @chell6022
    @chell6022 2 місяці тому

    God bless you.

  • @ejonesss
    @ejonesss 2 місяці тому

    i am not sure how well hashes and even watermarks would work, because when we watermark something we like to make it survive conversions, so the AI bots could get the same watermarked images. but it is worth a try.

  • @qunas101
    @qunas101 2 місяці тому +3

    The average voter would think of Bitcoin and NFTs when they hear "cryptographic signature". Also, would it work for modern social media that heavily compress media? Would the signature stay valid?

    • @stephenthumb2912
      @stephenthumb2912 2 місяці тому

      The signature only verifies the source, e.g. that the file posted was signed by the WH. Typically a hash or content signature is included; any change to the file invalidates that signature, and the new file would have to be re-signed by the originator. So basically no: if someone compressed the file, the WH would have to sign the compressed file as well.
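
One conceptual workaround for the recompression problem discussed in this thread, sketched below; this is an assumption-laden illustration, not a description of any deployed system. The platform verifies the publisher's signature on the original upload, then signs the transcoded copy with its own key so the derivative stays attributable.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

publisher_key = Ed25519PrivateKey.generate()   # e.g. the WH press office
platform_key = Ed25519PrivateKey.generate()    # e.g. the social network

original = b"original upload bytes"
publisher_sig = publisher_key.sign(original)

# Platform side, at upload time: check the publisher's signature first...
publisher_key.public_key().verify(publisher_sig, original)   # raises InvalidSignature if forged

# ...then vouch for the recompressed derivative with the platform's own key.
transcoded = b"recompressed copy of the same video"
platform_sig = platform_key.sign(transcoded)
```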

  • @Levi-gh1sq
    @Levi-gh1sq 2 місяці тому

    Maybe news sources will have to start taking videos of officials from the front and side simultaneously to prove they match up

  • @EpicWink
    @EpicWink 2 місяці тому

    Certificates used in TLS have a trust hierarchy with multiple root stores; perhaps that's better than plain direct PGP

  • @korzinko
    @korzinko 2 місяці тому

    When you upload an image or video to social media, the file is altered, which effectively breaks the signature

  • @Crftbt
    @Crftbt 2 місяці тому +6

    We need SSL infrastructure for PGP?

    • @qlippoth13
      @qlippoth13 2 місяці тому

      Phil Zimmerman needed the Arms Export Control Act, so sayeth the reeve of the shire.

  • @vinny-zebu
    @vinny-zebu 2 місяці тому +9

    I recently heard a take that we as a society are gonna need to trust official sources more and more, because it's gonna be the wild west of fake images, voices and everything in between.

    • @ghettochicken8420
      @ghettochicken8420 2 місяці тому +16

      its ironic because official sources are the least trustworthy

    • @DonVigaDeFierro
      @DonVigaDeFierro 2 місяці тому +1

      Kind of like that was their plan all along when they released AI models that anybody could use.
      Enjoy generating your anime girls while it lasts...

    • @STCatchMeTRACjRo
      @STCatchMeTRACjRo 2 місяці тому +4

      trust official sources? how can you trust that they are telling the truth and not lying and doing something behind closed doors?

    • @vinny-zebu
      @vinny-zebu 2 місяці тому

      @@STCatchMeTRACjRo And how can we trust random people on the internet either, if they can just fake anything?

    • @amentco8445
      @amentco8445 2 місяці тому

      ​@@vinny-zebuStill better.

  • @BehrInMind
    @BehrInMind 2 місяці тому

    "Its easier to fool some than to convince them they've been fooled"

  • @martinkunev9911
    @martinkunev9911 2 місяці тому +1

    We can put digital signatures in a decentralized system - e.g. based on blockchain.

  • @pedrobraz2809
    @pedrobraz2809 2 місяці тому +26

    It is great to see an educated Black king spread good information to protect his community

    • @sewsheederg
      @sewsheederg 2 місяці тому +1

      @KyleMonterroso fr?

    • @dsa43fsdf
      @dsa43fsdf 2 місяці тому +3

      black kang sheeet

    • @xyz5413
      @xyz5413 2 місяці тому

      💀💀💀 lol

  • @exoZelia
    @exoZelia 2 місяці тому +1

    I think we need to just give up, let the chaos ensue, and then rebuild from there

  • @user-sl6gn1ss8p
    @user-sl6gn1ss8p 2 місяці тому +3

    I think the extent to which keys can be used here is to attest that something is "official" from a source. So, not so much "is this really a video of this political figure" as "this political figure attests to this video".
    And maybe the main users are news outlets, social media, etc. So not really a solution to the problem, but possibly still useful.

    • @stephenthumb2912
      @stephenthumb2912 2 місяці тому +3

      Exactly. The sig will simply attest to the origin and the edits, therefore verifying whether it came from an authority and what its edits were. If the edits were fake at the source, in other words the authority produced the fake video, it won't matter. This is part of C2PA

  • @Oscaragious
    @Oscaragious 2 місяці тому +3

    We're going to get to a point where the majority of people online aren't even real.

    • @qlippoth13
      @qlippoth13 2 місяці тому +1

      @Slight- Alan Turing was a betting man

    • @juniuwu
      @juniuwu 2 місяці тому +1

      @Slight- Everyone on the internet is a bot, except you.

  • @HikaruAkitsuki
    @HikaruAkitsuki 2 місяці тому +1

    Even if we can fingerprint AI-generated products, a video of the video or an image of the image will never carry a trace of the original source. It just becomes a fresh, "authentic" product without the fingerprint.

    • @sdjhgfkshfswdfhskljh3360
      @sdjhgfkshfswdfhskljh3360 2 місяці тому

      It is enough to upload an image or video to a host: recompression will usually damage it, and regular cryptographic signatures won't survive. Also, the host may not even allow downloading the content to verify it.

    • @HikaruAkitsuki
      @HikaruAkitsuki 2 місяці тому

      @@sdjhgfkshfswdfhskljh3360 But no existing method is going to stop someone who just screen-records their own screen and uploads it as a regular video.

  • @alexxx4434
    @alexxx4434 2 місяці тому +1

    No need for PGP. Browsers already have the cryptographic certificates used for HTTPS. Nothing stops us from extending that system to sign any other files (see the sketch below).
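
Sketching the comment above: the same kind of key that backs a site's TLS certificate can indeed sign arbitrary files, although browsers don't expose this for media today. The sketch assumes an RSA key, and the key/file names are made up.

```python
# A site's existing TLS private key signing a video file with RSA-PSS.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

key = serialization.load_pem_private_key(open("site_tls_key.pem", "rb").read(), password=None)

video = open("press_briefing.mp4", "rb").read()
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
signature = key.sign(video, pss, hashes.SHA256())

# Anyone with the site's certificate (hence its public key) can verify the file.
key.public_key().verify(signature, video, pss, hashes.SHA256())
print("file verified against the site's key")
```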

  • @Matyniov
    @Matyniov 2 місяці тому +1

    Imo if this was integrated the way SSL keys are managed by CAs to ensure a website is real, it could work... if you saw a picture and it had a big disclaimer saying "it isn't signed by any authority" it could maaaybe lead people away, the same way the black screen telling you a website is not secure does

    • @Matyniov
      @Matyniov 2 місяці тому

      ...buuuut that would mean regulation and enforcement so that most big tech companies implement it (they won't let it happen cus 0.0000001% revenue loss)

  • @Apheleion
    @Apheleion 2 місяці тому

    the human brain is very good at picking up patterns and inconsistencies, i have not seen a deep fake yet that i could not pick out because in a lot of these the uncanny valley is strong.

  • @Azoundo
    @Azoundo 2 місяці тому +6

    I’d still like a stamp of authenticity, maybe a system that federates across states so that there isn’t a single point of authority.

    • @joey_da_blowy
      @joey_da_blowy 2 місяці тому

      If we've gotta have one, this is the only hope for avoiding a complete loss of privacy. I'm against this one all the way though. If all this goes into effect, I don't think it'll be long before it's abused. A government with an all seeing eye is a scary thing, as only one of the reasons it's a bad idea, with complete loss of anonymity on the internet, who's to stop the alphabet mafia from taking out those in its target by framing them for something that nobody would ever defend? "Show me a person, I'll show you the crime." This here's not good news at all, could turn this place into China...

  • @Ex_impius
    @Ex_impius 2 місяці тому

    "prompt engineer" LMAO LMAO

  • @peterpede6601
    @peterpede6601 2 місяці тому

    💈

  • @louisvl10
    @louisvl10 2 місяці тому

    21e8 will build this in their protocol by default. every file gets a "magic number", this will also make worldwide file transfers thousands of times more efficient.

  • @notafbihoneypot8487
    @notafbihoneypot8487 2 місяці тому

    Yes

  • @asiliria
    @asiliria 2 місяці тому

    I would imagine a sort of reverse-steghide would pop up to remove it

  • @nielsdegraaf9929
    @nielsdegraaf9929 2 місяці тому

    Great video once again

  • @menjolno
    @menjolno 2 місяці тому

    I still remember the 24 alphanumeric password (not stored anywhere, not even written down) that protects a 7zip which has my private key that I generated 6 months ago 👍👍

  • @purdysanchez
    @purdysanchez 2 місяці тому +1

    The only way this would even be somewhat reliable is if cameras were manufactured with a globally unique public/private key pair. You would also need a public certificate authority where owners could register their camera so that web browsers and users could authenticate their pictures. A photo isn't real just because the White House cryptographically signs it.

    • @stephenthumb2912
      @stephenthumb2912 2 місяці тому

      Look up C2PA. It's already in process to happen pretty much how you said

  • @angusmacgyver
    @angusmacgyver 2 місяці тому

    Bro, you should sign the description, for instance with PGP, so that we know it's really you who posted.

  • @DeeReeseBeats
    @DeeReeseBeats 2 місяці тому +3

    With AI and Deepfake technology there's gonna be a renaissance of catfishing and celebs/influencers getting away with more stuff than ever (like they weren't already lol) and Celebs/influencers getting cancelled from it too.
    Wouldn't be surprised if they developed AI video chat filters
    Or ai camera apps
    Gonna be wild in the future lol

  • @antman7673
    @antman7673 2 місяці тому

    Cryptically verify, which videos are positive.
    We like that video and that one.

  • @qlippoth13
    @qlippoth13 2 місяці тому

    The scope of concern tells all. I.e. you'll never see anyone question Samuel (aka Billy) WIlder being on location for the filming of Die Todesmühlen. Some things are of no concern even if great outcomes hang in the balance should those things be found a great deception.

  • @Brain-washed2
    @Brain-washed2 2 місяці тому

    I really want a person with dementia signing off on a law regarding recognizing people.

  • @d3layd
    @d3layd 2 місяці тому

    The real use-case of blockchain

  • @TheGrinningViking
    @TheGrinningViking 2 місяці тому +1

    Short answer no.
    Long answer noooooooo.

  • @dhruvghosh9822
    @dhruvghosh9822 2 місяці тому

    but why does the govt care that much…..? this discussion has existed forever, granted it is easier to do now, but why do we need to be surveilled more because of a new technology ?

  • @Sarmachus
    @Sarmachus 2 місяці тому +2

    Maybe it is harder for the eye to identify photoshop from your argument, but I'm pretty sure digital forensics makes it easy to see photoshop patterns, so it's less of an issue IMO.

    • @jorge69696
      @jorge69696 2 місяці тому

      The kind of person fooled by these terrible deepfakes would be equally fooled by a photoshop and they aren't doing any forensics.

  • @David-ty6my
    @David-ty6my 2 місяці тому +1

    Why do I feel dejavu on this video?

  • @Top_Weeb
    @Top_Weeb 2 місяці тому +1

    Metal Gear Solid 2 warned us about this.

    • @Skathacat0r
      @Skathacat0r 2 місяці тому

      ua-cam.com/video/-gGLvg0n-uY/v-deo.htmlsi=IFU1j6ViaMzR8q_3

  • @BigBadRanch
    @BigBadRanch 2 місяці тому

    please add links to referenced websites, thanks boss

  • @lexoid64
    @lexoid64 2 місяці тому

    I'm really concerned that such a tool for identifying deepfakes (in case its source code is closed) might eventually lead us to 1984's Ministry of Truth.

  • @heroslippy6666
    @heroslippy6666 2 місяці тому +1

    "Don't believe everything you see on the internet."

  • @alexandrei1176
    @alexandrei1176 2 місяці тому

    Using a regular public/private key pair is not sufficient because it doesn't deal with the case of your private key being compromised. Check out how the KERI protocol solves key rotation!

  • @Jacen.
    @Jacen. 2 місяці тому

    Has anyone made a GUI for adding key rings? I feel like the barrier to entry for a lot of people (for PGP) is that it’s command line.

    • @timothy8428
      @timothy8428 2 місяці тому

      I wonder whether the bigger problem is how to promote the mass adoption of keyrings in the first place. If it were implemented as a cross platform OS component that all apps were aware of, key management could be ubiquitous. I mean, Linux already has password and key management built in, but even then you still have to go out of your way to actually use it.
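
On the key-management usability point in this thread: desktop platforms already ship secret stores (GNOME Keyring/KWallet, macOS Keychain, Windows Credential Manager), and the Python keyring package is one thin wrapper over them. A small sketch, assuming such a backend is available; the service and account names are purely illustrative.

```python
# Store and retrieve a signing-key passphrase via the OS secret store.
import keyring

keyring.set_password("media-signing", "alice@example.org", "passphrase-for-signing-key")
secret = keyring.get_password("media-signing", "alice@example.org")
print("passphrase retrieved:", secret is not None)
```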

  • @rterminatu
    @rterminatu 2 місяці тому +1

    It's just a bigger model though. As soon as it leaks it's useless. Files can also be signed by malicious actors. The technology will be used in an AI information war. Most important thing is educating the public.

  • @miuzoreyes6547
    @miuzoreyes6547 2 місяці тому +1

    I think this isn't meant to dissuade crazy people for believing what already aligns with their beliefs when it comes to deepfakes, but for more neutral parties who'd get fooled by a deepfake and think "wow this is real". There are such people out there, given that AI as we have it now currently is still rather new.

  • @SansaStarks
    @SansaStarks 2 місяці тому

    This almost makes me want to post some misinformation or create my own deepfake FOSS (free and open source software) for GNU+Linux

  • @fasanuma
    @fasanuma 2 місяці тому

    👍

  • @m4rt_
    @m4rt_ 2 місяці тому

    I'm not sure I would call the output from generative AI a deepfake.
    In my opinion, images from generative AI are "AI-generated images", while "deepfakes" are images where the face is swapped with someone else's.
    AI-generated images are easy to spot: there is often weird stuff going on, text is wrong, hands are wrong, etc., while deepfakes can be both easy and hard to spot.
    With a deepfake you have to look at whether the face is blurry, whether the face looks wrong, etc.

  • @Chr0n0s38
    @Chr0n0s38 2 місяці тому

    I think it's 2-fold. People want to believe things, yes, but there's also just the issue of ease of use for keyrings. For those of us familiar with tech, it's not hard but to normies it's pretty complex even as a concept. That's why apps like Signal hide all of that. Looking at executables online is a great example of that. There are just too many extra steps for normies to understand.

  • @nedgivash5986
    @nedgivash5986 2 місяці тому

    Or use a checksum for files.
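
A quick caveat on the checksum suggestion: a checksum shows a file wasn't corrupted, but anyone can recompute one for a fake, so it proves integrity, not origin; that is exactly what a signature adds on top. A minimal illustration:

```python
import hashlib

real = b"official footage bytes"
fake = b"deepfaked footage bytes"

# Both digests are perfectly valid checksums; nothing ties either one to the
# White House. A signature binds a digest to a key, a bare checksum doesn't.
print(hashlib.sha256(real).hexdigest())
print(hashlib.sha256(fake).hexdigest())
```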

  • @dfgdfg_
    @dfgdfg_ Місяць тому

    Each picture should have a green lock with the cert name on it, like HTTPS.
    Let's make solutions, not problems.

  • @Julzaa
    @Julzaa 2 місяці тому

    I thought you were going to talk about C2PA

  • @Sk0lzky
    @Sk0lzky 2 місяці тому

    + of course what you said - they failed with the whole Ministry of Truth project so they're trying to do it on the back end and will likely have popular support.

  • @snowleopard9749
    @snowleopard9749 2 місяці тому

    They are expecting the tech companies to police the 'digital signatures'.

  • @bigbluntz420
    @bigbluntz420 2 місяці тому

    If a person wants to believe something, however ridiculous, there's little to no chance of changing their mind.

  • @opinionater9388
    @opinionater9388 2 місяці тому

    Just because a video or image doesn't come from someone doesn't mean it's not of them. People can take videos of someone else, and share them anonymously.

  • @nathanschmidt6429
    @nathanschmidt6429 2 місяці тому

    talk about Daniel Fraga the man who humiliated the Brazilian state!!!

  • @adrien2007
    @adrien2007 2 місяці тому +1

    I'm always happy to see a Mental Outlaw video, even though the majority of them I'm like hur dur. I'm not sure if it's a lack of caring of the subject matter or I am truly too dumb to understand the subject matter.
    As for the video... Deep fakes have been around for so long even if we didnt realize before. I dont even pay attention to the news because I figure it's all propaganda. I'm like hur dur. Glowing box says this thing, disregard. I'm in such denial of everything I figure everything is fake. idgf.
    Did I say hur dur?

  • @soundrail1895
    @soundrail1895 2 місяці тому

    It won't stop them, but it would verify whether a video, image, or anything else is legitimate.

  • @hugoedelarosa
    @hugoedelarosa 2 місяці тому

    The people who will go through the trouble of verifying a video is real will be skeptical of a potential deepfake video or voice recording.

  • @hypernovatv911
    @hypernovatv911 2 місяці тому

    Actually, I find it pretty easy to know whether an email is from a bad actor or not. A lot of the most recent scams come from Google servers in Mountain View, California. If you're getting emails through a proxy server out of Mountain View, California, you can bet your ass that it's fake or from a bad actor. If the person is dumb enough to send it through regular channels, you've got them; perhaps they're an amateur and you can narrow down their exact location. I used to do that to the Russian and Ukrainian lonely-girl scammers.

  • @OkamiSam
    @OkamiSam 2 місяці тому

    It was never a coincidence that AI got so popular in so little time; they released this so they could put regulations and control over something they couldn't until now: the internet, and internet ID

  • @xila8861
    @xila8861 2 місяці тому

    fudge, I am hearing you saying NFT is going to be a thing. For real this time D:!

  • @michaanonym
    @michaanonym 2 місяці тому

    why should i add the pentagon's keys if i think they are the liars?

  • @sergey_a
    @sergey_a 2 місяці тому

    4:30 I still do that.