Inside the system using blood flow to detect deepfake video - BBC News

  • Published 4 Oct 2024
  • Deepfake videos - a type of fake video that uses artificial intelligence to swap faces or create a digital version of someone - are on the rise, but one tech firm thinks it has the tool to catch them in the act.
    Intel's "FakeCatcher" system analyses video using a technique called photoplethysmography (PPG), which detects changes in blood flow in a person's face - because deepfake faces don't give out these signals (the general idea is sketched below).
    The company claims an accuracy of 96%, but does it work, and can it actually be used to tell what's real and what isn't?
    Please subscribe here: bit.ly/1rbfUog
    #Deepfake #AI #BBCNews
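
For readers who want a concrete sense of the technique named above: the sketch below illustrates the general remote-PPG idea (the average skin colour over a face region fluctuates faintly at the heart rate), not Intel's actual FakeCatcher pipeline. The face crops, RGB channel order, frame rate and band limits are assumptions for illustration only.

```python
# Minimal remote-PPG sketch (illustrative, NOT Intel's FakeCatcher): average the green
# channel over a face crop frame by frame and check for a strong periodic peak in the
# heart-rate band as weak evidence of real blood flow.
import numpy as np
from scipy.signal import detrend, periodogram

def pulse_strength(face_frames: np.ndarray, fps: float = 30.0) -> float:
    """face_frames: (T, H, W, 3) uint8 RGB crops of the same face, a few seconds long.
    Returns the fraction of signal power in the strongest 0.7-4 Hz peak (42-240 bpm)."""
    green = face_frames[..., 1].reshape(len(face_frames), -1).mean(axis=1)
    green = detrend(green)                         # remove slow drift and DC offset
    freqs, power = periodogram(green, fs=fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)         # plausible heart-rate frequencies
    return float(power[band].max() / (power.sum() + 1e-12))

# Hypothetical usage: score = pulse_strength(load_face_crops("clip.mp4"))
# A real face should score noticeably higher than one whose skin tone barely varies.
```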

COMMENTS • 286

  • @Thomas-Bradley
    @Thomas-Bradley 1 year ago +85

    Well, you're basically teaching the machine to create even more advanced deepfakes by telling it which ones aren't realistic enough. It's a never-ending cycle!

    • @autumnmoon7368
      @autumnmoon7368 1 year ago +4

      Exactly what I thought; now you're giving them the information you use to detect the deepfake, which gives them something more to work with. Smh

    • @TheCommunicationCoach
      @TheCommunicationCoach 1 year ago +3

      Media and science are a bad mix for us real humans.

    • @glennr9913
      @glennr9913 1 year ago +2

      That's called "job security".

    • @CaritasGothKaraoke
      @CaritasGothKaraoke 1 year ago +2

      This is literally how generative adversarial networks work (see the sketch after this thread).

    • @TheCaniblcat
      @TheCaniblcat 1 year ago

      Well, it's an arms race. They make deepfakes, we make ways to detect deepfakes, they make ways to counter our detection, and so on and so on....
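
Several replies in this thread point at the same mechanism, and @CaritasGothKaraoke names it: a generative adversarial network. The toy sketch below is a generic GAN training loop on synthetic data (made-up network sizes, with a weak "pulse-like" cue standing in for blood flow; it is not Intel's system or any real deepfake model). It shows how any differentiable detector can be slotted in as the discriminator, so its own gradients teach the generator which cues to imitate.

```python
# Toy GAN loop: the "detector" plays the discriminator role, and training the generator
# against it is exactly the feedback loop the commenters describe.
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n=128, dim=64):
    """Stand-in for real video features: noise plus a weak periodic 'pulse-like' cue."""
    t = torch.linspace(0, 6.28, dim)
    return torch.randn(n, dim) * 0.5 + 0.1 * torch.sin(5 * t)

generator = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 64))
detector  = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))  # "fake catcher" role

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(detector.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    # 1) Train the detector to separate real from generated samples.
    real = real_batch()
    fake = generator(torch.randn(128, 16)).detach()
    d_loss = bce(detector(real), torch.ones(128, 1)) + bce(detector(fake), torch.zeros(128, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator *against* the detector: the detector's gradient tells it
    #    exactly which missing cues (here, the pulse-like component) to imitate.
    fake = generator(torch.randn(128, 16))
    g_loss = bce(detector(fake), torch.ones(128, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

In this framing, publishing a strong detector that attackers can query freely is, in effect, handing them a better discriminator to train against.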

  • @Fluffysnowfox
    @Fluffysnowfox 1 year ago +47

    Okay, but what happens if they simulate the blood flow? I can't believe that would be that hard to simulate (at least in comparison to the deepfake in general).

    • @PlanetImo
      @PlanetImo 1 year ago +4

      I thought the same thing.

    • @antoniomontero4989
      @antoniomontero4989 1 year ago +2

      It's an arms race, AI vs AI. Like a digital war.

    • @andrewj22
      @andrewj22 1 year ago +3

      In fact, the software used to _detect_ deepfakes can be used to train deepfake-generating AI on how to make them undetectable.

    • @theclanguagedeveloper5309
      @theclanguagedeveloper5309 1 year ago

      You are asking the right question; that is **LITERALLY** the core function of a generative adversarial network. Literally anything you use to detect the fake will only strengthen it, no matter what. And @andrewj22 is completely correct.

    • @IonorRea
      @IonorRea 11 months ago

      Solution:
      What I think could help is if, with each ABC, BBC, RT clip (or any other media), there were a special semi-transparent code in the bottom corner of the screen from which you could know the exact date to the second, the channel/creator, etc., and when you put this code into a search engine you would be able to find an archive of the original if you suspect the information you are looking at is a fake. Of course, this would do nothing against government-supported alteration of history, in which case the original archive would also be altered unless new additions were constantly copied by an independent pro-democratic organisation. Feel free to offer a better solution against deepfakes.
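
One very rough way the overlay code described above could be made verifiable is sketched below. It is only an assumption-laden illustration: the broadcaster-held HMAC key and the helper names make_tag/verify_tag are hypothetical, not an existing standard or API.

```python
# Hypothetical provenance tag: derive a short keyed code from (channel, timestamp),
# overlay it on the frame, and let a lookup service map valid tags to the archived original.
import hmac, hashlib

BROADCASTER_KEY = b"example-secret-key"   # assumption: a key held by the broadcaster

def make_tag(channel: str, timestamp_utc: str) -> str:
    msg = f"{channel}|{timestamp_utc}".encode()
    mac = hmac.new(BROADCASTER_KEY, msg, hashlib.sha256).hexdigest()[:12]
    return f"{channel}|{timestamp_utc}|{mac}"

def verify_tag(tag: str) -> bool:
    channel, timestamp_utc, mac = tag.rsplit("|", 2)
    expected = hmac.new(BROADCASTER_KEY, f"{channel}|{timestamp_utc}".encode(),
                        hashlib.sha256).hexdigest()[:12]
    return hmac.compare_digest(mac, expected)

tag = make_tag("BBCNews", "2024-10-04T12:00:00Z")
print(tag, verify_tag(tag))   # a lookup service could resolve a valid tag to the archive
```

A shared-secret HMAC means only the key holder can verify, so a deployed scheme would more likely use public-key signatures that anyone can check; the sketch is just to make the "code in the corner" idea concrete.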

  • @phsopher
    @phsopher 1 year ago +16

    Whew, I was worried there for a moment. Good to know we're safe now. It's not like a deep learning model can be trained to simulate the stuff this technique is looking for, amirite?

  • @lastar7824
    @lastar7824 1 year ago +15

    It won't be long before deepfakes become advanced enough to simulate blood under the skin.

    • @rundown132
      @rundown132 1 year ago

      This, lol. People don't realize it's an arms race.

    • @andrewj22
      @andrewj22 1 year ago +1

      The very software used to detect the blood flow can be used to train the deepfake AI on how to simulate the blood flow correctly.

  • @abaidullahmirza2768
    @abaidullahmirza2768 1 year ago +3

    Very good approach, but using audio in correlation with the historical data would definitely help the model become more accurate.

  • @johnhull2582
    @johnhull2582 1 year ago +7

    This technology has actually been around for a number of years and is capable of also telling if someone's pulse increases or pupils dilate slightly. In other words, using this technology you can go back over old recordings and tell if someone was lying.

    • @jamesirvine9541
      @jamesirvine9541 1 year ago +3

      I think people are capable of lying without their pupils dilating.

    • @johnhull2582
      @johnhull2582 1 year ago

      @@jamesirvine9541 You're missing the point.

  • @ProfessorJayTee
    @ProfessorJayTee 1 year ago +3

    Next video: Deepfake makers find way to simulate proper blood flow in deepfake videos. This is a classic case of offense vs defense.

    • @andrewj22
      @andrewj22 1 year ago

      They don't need to figure out a way. The software used to detect the fake blood flow can simply be used to train the deepfake AI on how to simulate it.

  • @polygonalmasonary
    @polygonalmasonary 1 year ago +1

    0:57 No blood flow technology required to spot that his neck is twice as high in the fake video 😂😂😂🇬🇧

  • @SirJimmySavileOBEKCSG
    @SirJimmySavileOBEKCSG 1 year ago +3

    The new reality!

  • @moon3173
    @moon3173 1 year ago +1

    Intel's fake-detection algorithm is so weak... it's rather easy to spot a deepfake by looking at pixel-level resolution. Deepfakes have blurry pixels from the pixel interpolation performed during image rendering.
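
The pixel-level blur cue described here can be approximated very crudely. The sketch below is not a validated detector; it just compares high-frequency energy inside a face crop with the whole frame, and the face box is assumed to come from some separate face detector.

```python
# Crude blur cue: variance of the Laplacian inside the face region vs. the whole frame.
import numpy as np
from scipy.ndimage import laplace

def sharpness(gray: np.ndarray) -> float:
    """Variance of the Laplacian, a standard rough focus/blur measure."""
    return float(laplace(gray.astype(np.float64)).var())

def face_vs_frame_sharpness(gray_frame: np.ndarray, face_box) -> float:
    """face_box = (x0, y0, x1, y1), assumed to come from an external face detector.
    A ratio well below 1 means the face is blurrier than the frame overall - one weak
    hint of a blended or interpolated face, nothing more."""
    x0, y0, x1, y1 = face_box
    face = gray_frame[y0:y1, x0:x1]
    return sharpness(face) / (sharpness(gray_frame) + 1e-12)
```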

  • @J1ss3ncy
    @J1ss3ncy 1 year ago +2

    It's interesting, but I doubt it works as well with people who have a dark complexion. You can't see blood flow under black or very tanned skin. Even thick makeup could fool this A.I. into thinking it's a deepfake. Ultimately, you'll need to combine different methods like voice pattern, image resolution (deepfakes tend to look blurry; human skin never is), and so on.
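
On the "combine different methods" point: a toy late fusion of several weak cues (for example the pulse and sharpness scores sketched earlier, plus a hypothetical audio-sync score) into a single probability. The bias and weights are invented for illustration and not calibrated on any data.

```python
import math

def fused_fake_probability(pulse_score: float, sharpness_ratio: float,
                           audio_sync_score: float) -> float:
    """Each input is rough 'evidence of being real' in [0, 1]; output is P(fake).
    Illustrative weights only - a real system would learn these from labelled data."""
    z = 2.0 - 4.0 * pulse_score - 1.5 * sharpness_ratio - 2.0 * audio_sync_score
    return 1.0 / (1.0 + math.exp(-z))   # logistic squash to [0, 1]
```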

  • @michwashington
    @michwashington 1 year ago +1

    1:16 I knew it was fake … because of the lighting and movement, and also how jittery her face looked. It looked unreal from the start.

  • @johnwood-stoddard4600
    @johnwood-stoddard4600 1 year ago +2

    She seems too enthusiastic, noticing the successes and ignoring the failures of this. Compared to the polished deepfakes out there, this isn't going to be hard to circumvent. This creator just reminds me of a mum with an average child who can't accept their child has flaws, so she just ignores them and will defend them if anyone tries to offer solutions.

  • @wyntoncolter1067
    @wyntoncolter1067 1 year ago +1

    I wonder if the audio syncing issue can be resolved with AI. However, it does make me wonder what other features will be unlocked with deepfakes.

    • @yaaghobrahmani4928
      @yaaghobrahmani4928 1 year ago

      The people of Iran and the world, the United States, Israel and the people of the world should know and be aware that the traitorous democrats of China, Russia and Putin are taking bribes from the terrorists of the world and destroying countries such as Ukraine, Afghanistan, Iraq, Armenia, the Middle East, and the United States. And let Europe know that the traitorous Biden and the traitorous democrats, the spy and the most corrupt party in the world, are turning America from the world's superpower into the world's most unzealous country with the support of the traitorous democrats, the spy and the swindler of China, Russia and Putin, and be aware that on Obama's orders The traitor, bribe taker, and self-dealer, liar and corrupt leader, the corrupt democrats and liar leader of Europe, sell the world to China, Russia, and Putin, the terrorist, to Nancy Pelosi, the filth, self-dealer, and leader of China, Russia, and Putin as a sign of the order. China's military attack on Taiwan, doing the same as Ukraine, Afghanistan, Iraq, Armenia, the Middle East, and America, from the world's superpower to the world's most ruthless country, with the support of traitorous democrats, traitors, spies, and mozadors of China, Russia, and Putin, long live the terrorist. Donald Trump is the most zealous president in the history of America and brought America to greatness. Long live Donald Trump is the most zealous president in American history and brought America to greatness. Long live Donald Trump, zealous and hero and hero and hero of disgrace, death and shame. On Joe Biden, the most unzealous president in the history of America and turning America from a world superpower into the world's most unzealous country with the support of the traitorous Democrats, Joe Biden, the unzealous, Barack Obama, Hillary Clinton, Kamla Harris, and all the traitorous Democrats of China and Russia. Putin is a terrorist and Nancy Pelosi and Barack Obama and Junckery Hillary Kinton and Kamla Harris and all the traitorous democrats, the spy and Mozador of China and Russia and Putin the terrorist terrorist terrorist Let's go brandon joe Biden farty 24

  • @shawnnewell4541
    @shawnnewell4541 1 year ago +2

    Well, at least someone has come up with a way to detect it, but we need something the average internet user can use.

    • @andrewj22
      @andrewj22 1 year ago

      The software used to detect it can be used to train the deepfake AI to simulate better and avoid detection. So this won't be a reliable detection method for long.

  • @PedroMiguel-if3ll
    @PedroMiguel-if3ll 1 year ago +10

    The deepfake examples shown are so obvious, how can someone fall for them? 😂

  • @Goyahkla100
    @Goyahkla100 1 year ago +1

    I think it's pretty obvious when they are fake, and I am sure there are some really good ones out there. The face resolution always looks lower than the rest of the image and always looks washed out. And the face is out of alignment, too low, with the rest of the body.
    And of course the head is floaty.....

  • @MasterKenfucius
    @MasterKenfucius 1 year ago +3

    It's very easy to stop this... just create a law where people can sue anyone for making these videos and the problem stops.

  • @koda3967
    @koda3967 1 year ago +1

    It's a catch-22 sort of scenario; just as you pin one method down, another two problems surface. But at least this puts some weapons in the toolbox. Keep up the efforts to expose fakes!

  • @glassowlie
    @glassowlie 1 year ago +1

    Uh, 2019 logo...

  • @RyanSaundercook
    @RyanSaundercook 1 year ago +1

    Anyone who thinks this kind of detection software won't be defeated almost immediately by new AI capabilities is dreaming. We are entering a scary place and there is no easy fix. Soon all you'll be able to trust is what you see with your own eyes.

  • @user-fed-yum
    @user-fed-yum 1 year ago +1

    Anyone ever heard of a black person? You know, with dark skin and stuff. The darker it gets, the less light it reflects and the smaller the chance this will ever work. Did anyone involved with this BBC production spend 10 seconds of their time on this project thinking this question should be asked? Cue the outrage.

  • @professorposh4146
    @professorposh4146 1 year ago +1

    I don't know what's real or fake anymore.

  • @sojournersunrise2290
    @sojournersunrise2290 1 year ago +2

    Anything a computer can detect, it could fake, yeah?

  • @russbuttypennyblackblade
    @russbuttypennyblackblade 1 year ago

    The party leader also told South Africans to “stand up” against the white citizens.

  • @PlanetImo
    @PlanetImo 1 year ago +1

    I wonder if colour and/or lighting correction or colour grading in the post edit would make it difficult to spot a real video, masking the minute colour fluctuations? Or flaws in the camera it was filmed on. They all interpret colour slightly differently, and of course, what about the lighting conditions at the time of filming? Sometimes they can change if a cloud goes in front of the sun, for example, or if the subject is on the move, changing angles in relation to the light source/s.

  • @russbuttypennyblackblade
    @russbuttypennyblackblade 1 year ago

    On Sunday, just one day after Malema’s rally, a white farmer was tortured and killed on his South African land.
    The man’s wife was beaten unconscious but survived the attack.
    Human Events reports, “Theo and Marlinda Bakkers, white farmers, were attacked on their property in the province of Mpumalanga. Before losing consciousness, Marlinda Bakker was able to identify those four attackers who allegedly slit Theo’s throat after beating him with an iron bar. The attack took place after Theo Bakker opened the gate early Sunday morning to allow the cattle to graze, according to local news.”

  • @russbuttypennyblackblade
    @russbuttypennyblackblade 1 year ago

    Another video going viral online this week shows young South African students being taught how to sing the anti-white “Kill the Boer” song.
    A South African court even ruled the chant is a “cultural” song, not a hateful promotion of violence against a minority group.

  • @Isclachau
    @Isclachau 1 year ago +1

    Any news on the horrendous burning EVs still alight on the ship, BBC? 😅

  • @russbuttypennyblackblade
    @russbuttypennyblackblade 1 year ago

    Musk tagged South African President Cyril Ramaphosa and asked why he’s remained silent on the matter.
    Human Events Senior Editor Jack Posobiec blamed the implementation of Critical Race Theory in the 1990s for the collapse of South Africa and racial tensions.

  • @aluRamb0
    @aluRamb0 1 year ago +1

    Has this tech been tested on videos of darker-skinned people?

  • @sh0wbiz
    @sh0wbiz 1 year ago +1

    The clip at the beginning was a horrible example of a deepfake. There are way more convincing clips out there, BBC, lol.

  • @wasabiattack
    @wasabiattack 1 year ago +1

    Did it just label a real video as 100% fake? BBC, can I have my 6 minutes back? You just reported on a system that does not work.

    • @G0N0X
      @G0N0X 1 year ago

      "I present to you the AI detection program that doesn't work" - this should be the name of the video.

  • @tthompson9244
    @tthompson9244 1 year ago +1

    From the examples shown, it isn't that difficult to spot a deep fake, though I'm sure that will change soon enough.

  • @boris8787
    @boris8787 1 year ago +1

    LOL - reminds me of the fake BBC - nobody pays the TV licence in the sane town of Chillingbourne. ⛔⛔⛔

  • @gtube6913
    @gtube6913 1 year ago +2

    I can just tell (with current deepfakes) what is real within a short space of time. But passive viewing could trick me. I guess we're all going to have a deepfake scanner within our software now? Is that it? Another thing to constantly update, like security software and patches.

  • @nwshearman
    @nwshearman 8 days ago

    If you're going to hand over cash after seeing a (Fake) video of someone you idolize....

  • @ns6.504
    @ns6.504 1 year ago

    BBC reporting about deepfake videos is a hilarious contradiction

  • @Chrisallengallery
    @Chrisallengallery 1 year ago

    Note 23: Add oscillations to green colour channel to simulate heartbeat.
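
This note (like @SuperZardo's comment further down about modulating a colour channel over a face mask) describes the obvious counter-move. A minimal sketch of what such an injection could look like, with an assumed face mask and made-up amplitude and jitter values:

```python
# Spoofing sketch: overlay a faint heart-rate-frequency oscillation, with some
# beat-to-beat jitter, on the skin pixels of each synthetic frame.
import numpy as np

def inject_fake_pulse(frames: np.ndarray, face_mask: np.ndarray,
                      fps: float = 30.0, bpm: float = 72.0, amplitude: float = 1.5):
    """frames: (T, H, W, 3) uint8 RGB; face_mask: (H, W) bool marking skin pixels."""
    out = frames.astype(np.float64)
    t = np.arange(len(frames)) / fps
    jitter = np.cumsum(np.random.normal(0.0, 0.02, len(frames)))  # crude HRV-like drift
    wave = amplitude * np.sin(2 * np.pi * (bpm / 60.0) * t + jitter)
    out[:, face_mask, 1] += wave[:, None]   # green channel carries most of the PPG signal
    return np.clip(out, 0, 255).astype(np.uint8)
```

Whether something this simple fools a particular detector is an open question; the commenters' point is that the counter-move becomes cheap once the cue is known.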

  • @당신의생체정보가털리
    @당신의생체정보가털리 25 days ago

    I'm not smart about advanced technology, but I'm one of those who are as interested in it as they are concerned about it. I hope high-tech companies think more thoroughly about developing devices or on-chip functions to protect biometrics. That's really important for people using electronics built on biometric-information services, such as smartphones, since those have become a big part of human life these days. I'm 100% sure BRAIN WAVES are being manipulated. Privacy nowhere.

  • @fnoce5948
    @fnoce5948 1 year ago

    OT: Benny Hill considered funnier than Monty Python by two TV stations--WOR and WLVI!

  • @sonofsomerset1695
    @sonofsomerset1695 1 year ago +1

    Presented by the same presenter that got caught faking hate speech to Elon Musk.

  • @tonysherwood9619
    @tonysherwood9619 1 year ago

    You may have to stimulate your blood flow - if you had the dodgy co-vaccination!

  • @StayVCA98
    @StayVCA98 1 year ago +1

    Much of what AI has created so far has been bad or used badly, like this one, AI voice and art. There are some exceptions, but overall it's really bad and honestly unnecessary to begin with. A shame, really...

  • @TheSkyfall
    @TheSkyfall 1 year ago

    So if you put some foundation on, will it still work?

  • @russbuttypennyblackblade
    @russbuttypennyblackblade 1 year ago

    South Africa’s largest opposition party, the Democratic Alliance (DA), is led by John Steenhuisen who criticized Malema’s inflammatory remarks, saying, “This is a man who is determined to ignite… civil war.”

  • @CIS101
    @CIS101 1 year ago

    The fakes in this video seem obvious even without the software, but they should still proceed with the development, because society will need tools like this.

  • @laserinnhyd7778
    @laserinnhyd7778 1 year ago +1

    This system is very dangerous for human beings.

  • @edwhitson9873
    @edwhitson9873 1 year ago +2

    Still pretty easy for me to spot

  • @chrissparkes6497
    @chrissparkes6497 1 year ago

    Why is James Clayton not sacked?

  • @jonasaung6924
    @jonasaung6924 1 year ago

    User: Create a video of a person with real-life facial blood flow to avoid deepfake detection.
    System: Done.

  • @fionaburns4210
    @fionaburns4210 1 year ago +1

    In a few years deepfakes will account for this, lmao.

  • @Linda_AUS
    @Linda_AUS 1 year ago

    The fake faces also appear more blurry, like a filter, than the real videos.

  • @sigmaputin6888
    @sigmaputin6888 1 year ago +3

    The BBC AKA the ministry of truth

    • @Nickle314
      @Nickle314 1 year ago +1

      Ministry of Propaganda

  • @pauliusnarkevicius9959
    @pauliusnarkevicius9959 1 year ago

    What about people who have issues, i.e. frozen parts of their face or so?

  • @demiller74
    @demiller74 1 year ago

    You mean there's no such place as Planet Mountain Dew?

  • @hunterscott3000
    @hunterscott3000 1 year ago

    Considering the Kim voice is the same as in the interview, I'm not surprised it's a fake.

  • @SuperZardo
    @SuperZardo 1 year ago

    Building a mask for the face (without the teeth and the hair) and overlaying a modulation of the red channel, including heart rate variability, should be sufficient to mimic the heartbeat. By the way, there are smartphone apps able to measure the heart rate from the face. I never tried pointing them at a news anchor displayed on a monitor. Would be interesting to see the results.

  • @lilnickproduction
    @lilnickproduction 1 year ago

    I think technology can still bypass this

  • @emanuelbiazon8826
    @emanuelbiazon8826 2 months ago

    How can we use this? There's no way we can install this app today?

  • @pulsecodemodulated
    @pulsecodemodulated 1 year ago

    You mean that clip of Tom Cruise saying "vagina-poop" isn't real?

  • @DigitalDistortion
    @DigitalDistortion 1 year ago

    Gosh, our phones are going to have to have an AI tool to identify deepfakes now.

  • @sydneyloum3211
    @sydneyloum3211 1 year ago

    So, what about people with dark skin tones? Seems like there's a huge gap here.

  • @urimtefiki226
    @urimtefiki226 1 year ago

    Hackers are trying to manipulate others, talk on others' behalf like my friend on the phone, on my wall on FB, trying to create bad situations. Keep trying something smarter, because that trick to create confusion didn't work. Don't give up, just think, monkey.

  • @conniepr
    @conniepr 1 year ago

    I could watch more of these using different videos.

  • @youtubedislikes3756
    @youtubedislikes3756 1 year ago +1

    So basically this technology just labels all videos fake and then we are supposed to be amazed it identifies deepfakes 😂😂 Sounds about right for Intel products of late 😂😂😅

  • @naejin
    @naejin 1 year ago

    As Deepfake tech advances, won't these deficiencies get updated/fixed?

  • @TomNook.
    @TomNook. 1 year ago

    If it's on a screen, it could be faked

  • @officialJOY1229
    @officialJOY1229 1 year ago +1

    The research scientist lady is awesome 👏

  • @lucasdesimone101
    @lucasdesimone101 1 year ago +1

    What is real and what isn't - that doubt has always existed; maybe now it just doesn't matter so much. "Give the masses purchasing power and the rest won't matter, whether it's true or not; it won't even matter to ask." (By the way, I'm not a communist.)

  • @adrianlouw2499
    @adrianlouw2499 1 year ago

    BBC headline: Italy regrets joining China's trade route pact
    'click'
    Modified headline: Italy joining China's Belt and Road Initiative was atrocious move, defence minister says.
    How very misleading...

  • @denkk4282
    @denkk4282 1 year ago

    Poland has stationed snipers on the border with Belarus.
    WARSAW, 31 July - RIA Novosti. Poland has stationed snipers on the border with Belarus, the General Command of the Polish Armed Forces reported on Twitter.
    "An evening walk in the fresh air - for your health. Snipers are masters of reconnaissance, navigation and weapons handling. They are characterised by self-confidence and patience," reads the caption under the photo.
    P.S.
    Moscow and Minsk tremble! Someone gave the Poles a sniper rifle written off from a warehouse in the USA. Apparently it is more effective to shoot the deceived shareholders of the "triumph of democracy" in the countries of Asia - otherwise the living witnesses of that "triumph" generate such bad publicity for the powers in the "garden" (not to be confused with the Marquis de Sade - it sounds similar in Russian) that they could not only fail to win an election but end up, by popular vote, on the electric "throne".

  • @richardgoldins4790
    @richardgoldins4790 1 year ago

    You can tell it's fake.

  • @paulmichaelfreedman8334
    @paulmichaelfreedman8334 1 year ago

    I identified all fake and real videos correctly, and I saw it within 3 seconds. Her algorithm definitely needs tweaking.

    • @catcherzw
      @catcherzw 1 year ago

      Also she’s just dumb as hell

  • @notjustforhackers4252
    @notjustforhackers4252 1 year ago

    The surveillance state, ain't it grand.

  • @wannabetrucker7475
    @wannabetrucker7475 1 year ago

    Brilliant idea

  • @joycejeong-x4b
    @joycejeong-x4b 10 months ago

    Supporting international efforts to establish norms and standards for the responsible use of AI, including deepfake technology, is crucial. By fostering collaboration on a global scale, we can develop a unified approach to addressing the challenges posed by deepfakes.

  • @DAOLAO-x9s
    @DAOLAO-x9s 1 year ago +1

    This is a joke... a detector? Lol. Look, this is only a tool that AI developers can use to train deepfakes better, and after 2 or 3 years deepfakes will take the blood flow into account, lol.

  • @keepgoing7533
    @keepgoing7533 1 year ago +1

    *Am I real?*

  • @-jeff-
    @-jeff- 1 year ago

    Fine. A technology exists to detect deepfakes now. How long is it going to take the fakers to figure out how to make deepfakes it can't detect?

  • @92jhvm
    @92jhvm 1 year ago

    Realistically this is just more training data for AI. It's kind of exciting what technologies are going to be developed to keep up with AI; we're in unprecedented times 😮

  • @bennett550
    @bennett550 1 year ago +1

    What if they're black?

  • @alitokat9606
    @alitokat9606 1 year ago +1

    Well done, İlke; thanks to you, we felt proud 🎉

  • @Thestorminator89
    @Thestorminator89 1 year ago

    I guessed with 100% accuracy. But whatever.

  • @menorahleathersmith
    @menorahleathersmith 1 year ago

    I guess this is like where in the Bible the Pharisees asked Yashua personal questions to try and catch him out in lies to accuse him...
    I'm guessing this is why you monitor the heart with AI, to see if it is truth or not... but seeing piercings to the heart... blood.
    It is written... those who pierce him will mourn.

  • @SophiaChoi-y7c
    @SophiaChoi-y7c 1 year ago

    Wow, this is so cool!! 😮
    It is very good research.

  • @Dani-El.
    @Dani-El. 1 year ago +1

    The Trump one sounds like Owen Benjamin's impression of Trump, i.e. nothing like Trump.

  • @Polina_E
    @Polina_E 1 year ago

    Zalugni is also a deepfake.

  • @michwashington
    @michwashington 1 year ago

    I knew those were fake at the start!

  • @jonavuka
    @jonavuka 1 year ago

    I mean, calling politicians fake is accurate, so I agree.

  • @aap223
    @aap223 1 year ago

    Create chaos, disorder and suspicion in the minds of people, mix falsehood with fantasy for fun and a little truth, and deprive humanity of the power of judgement, wisdom and justice, discrediting the ability to find reality in the ocean of falsehood.
    No wonder all our human evil powers are at play and at work, and coming back home.

  • @ForsakenSandpaper
    @ForsakenSandpaper 1 year ago

    I can tell some comments here are AI generated.

  • @russbuttypennyblackblade
    @russbuttypennyblackblade 1 year ago

    Politics
    Jonathan Turley on Devon Archer Testimony: ‘One of the Greatest Corruption Scandals in the History of Washington’
    by Jamie White
    August 1st 2023, 1:26 pm
    "I've been a critic of influence-peddling for decades. I've never seen anything like the Biden family," says Democrat constitutional scholar.

  • @SantoshKumar-qe6eq
    @SantoshKumar-qe6eq 1 year ago

    Nowadays it's become fashionable to use the buzzword 'AI' for the word 'algorithm'. ChatGPT will end up like Siri, Alexa, crypto, NFT, Metacurse 😂

  • @Dani-El.
    @Dani-El. 1 year ago +1

    Is the video of Zelensky dressed as a stripper real or a deepfake?

  • @shindousan
    @shindousan 1 year ago +1

    So far, the moment one has a new tool to detect deepfakes, one can also improve deepfake generators to fool that tool.

  • @alexj9111
    @alexj9111 1 year ago +2

    AI will never paint as good as Monet, or write a play like Shakespeare. AI is just a bundle of ones and zeros with no consciousness. The human brain will always win the day.

  • @undr_guv_surv
    @undr_guv_surv 1 year ago

    Only works for whiter skin...

  • @PlanetImo
    @PlanetImo 1 year ago

    Very interesting! :)