Is facial recognition technology too powerful?

  • Published 13 Mar 2019
  • Facial recognition technology has prompted sharp debate. On the one hand, law enforcement groups say it can help locate missing children, solve crimes, and prevent terrorist attacks. On the other hand, critics worry it can lead to abusive government surveillance, corporate manipulation, and the end of privacy. Lou sorts out the pros and cons.
    SOURCES & FURTHER READING
    Lawfare Blog on the costs and benefits of facial recognition software
    www.lawfareblog.com/facial-re...
    Selinger and Hartzog on why Facial Recognition Is the Perfect Tool for Oppression
    / facial-recognition-is-...
    World Economic Forum on how smart cities fight crime and terrorism
    www.weforum.org/agenda/2018/0...
    Newsweek on how FBI uses FRT to fight crime
    www.newsweek.com/2016/04/29/f...
    2016 Government Accountability Office report on Law Enforcement use of FRT
    www.gao.gov/assets/680/677099...
    CREDITS
    Writer: Louis Foglia
    Editor: m.cho
    Researcher: Page Ellerson
    Supervising Producer: Allison Brown
    Follow Beme on
    Instagram: / bemenews
    Twitter: / bemenews
    Facebook: / officialbeme

COMMENTS • 145

  • @itskilian5106
    @itskilian5106 5 years ago +109

    why aren't there more channels like this? Really enjoying the script/structure, and the research is always on point. Neutral and decorated with nice lil animations :)

    • @lmaolmfao3611
      @lmaolmfao3611 5 years ago

      Check out America Uncovered for another similar great channel

    • @annikameyer7574
      @annikameyer7574 5 years ago

      it ain't neutral, but neither is almost any other media source

  • @Baconomics101
    @Baconomics101 5 years ago +42

    Time to start making wearing masks in public fashionable

    • @sophieszobonya3175
      @sophieszobonya3175 5 years ago +5

      I wonder if the medical masks people wear in some parts of Asia would work.

    • @OVXX666
      @OVXX666 5 years ago +3

      woah, that's what I was thinking

    • @antoniocarniero5138
      @antoniocarniero5138 4 years ago +1

      @@sophieszobonya3175 they don't, I'm sorry to say; Facebook has already been perfecting algorithms to detect people wearing scarves over their faces or blocking their faces with their hands. It was in its infancy 2 years ago, don't know where it is now. It works by predicting someone's face based on natural features and what it knows about the trends of our faces.

    • @sophieszobonya3175
      @sophieszobonya3175 4 years ago +1

      @@antoniocarniero5138 or it works based on pictures where one doesn't wear a mask. Yeah. Since then I've read about that and... we're done for hehe

  • @WowTrevor10
    @WowTrevor10 5 years ago +25

    One thing I'd like to weigh in on: the idea that facial recognition is poor at identifying black people's faces due to racism is misguided. Image processing is limited by the information stored in the camera image that comes in. Computers are very bad at discerning patterns in low contrast, and having darker skin naturally lowers the contrast of the image of your face. Shadows show up less, and ridge lines are obscured by the skin color. The only way to get around this issue is to shine a very bright light onto the face of someone with darker skin, which both doesn't work for security cameras and is obnoxious for the photo subject. (See the short sketch after this thread.)
    Is this a good or bad thing? I think neither; it's just the nature of how the system works and isn't the product of human intent.

    • @Q_QQ_Q
      @Q_QQ_Q 5 years ago +1

      As technology advances, it will overcome this. He gave examples of Indian orphan children. Indians are not white.

    • @ArthurLarin
      @ArthurLarin 5 years ago +1

      What you're saying is true but it's not the point of Lou's argument. It shouldn't matter whether improving FRT compatibility with people of color is more complicated technologically or not. The idea is that groups of mostly white men design and release products that will undoubtedly be used by the masses without going to the trouble of improving it to the point that it works for everyone.

    • @Q_QQ_Q
      @Q_QQ_Q 5 years ago +2

      @Arthur Larin I don't think it's about colour; maybe it's because FRT is trained more on white people, and as it gets trained on more people it will identify everyone. It's ever improving.

    • @iuscoandrei17
      @iuscoandrei17 4 years ago

      So what you're trying to say is that we should wear black masks in order to protect ourselves. Fair enough. As somebody said in the comments above: "it's time to make wearing masks fashionable"
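
A minimal numpy sketch of the contrast point made at the top of this thread (the pattern, sizes, and brightness ranges below are invented purely for illustration): the same synthetic structure rendered over a narrower brightness range produces weaker image gradients, which are the edge cues a face matcher relies on.

```python
import numpy as np

def gradient_strength(patch):
    """Mean gradient magnitude: a rough proxy for how much edge detail
    a feature detector has to work with."""
    gy, gx = np.gradient(patch.astype(float))
    return np.hypot(gx, gy).mean()

# A synthetic 64x64 pattern standing in for facial structure.
rng = np.random.default_rng(0)
structure = rng.random((64, 64))

# Render the same structure over a wide and a narrow brightness range,
# imitating a high-contrast capture versus a low-contrast one.
high_contrast = 0.20 + 0.60 * structure   # pixel values span ~0.20-0.80
low_contrast = 0.20 + 0.15 * structure    # same structure, span ~0.20-0.35

print(gradient_strength(high_contrast))   # larger: more edge detail to key on
print(gradient_strength(low_contrast))    # 4x smaller: same pattern, fainter cues
```

Real pipelines are far more sophisticated than this, but the scaling is the same: less dynamic range in, less feature signal out.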

  • @UnknownGunslinger
    @UnknownGunslinger 5 years ago +21

    Oh Lou, you never fail to depress me.
    I hope this channel grows to the extent it deserves. Casey should give you guys a shout-out! You’ve done an amazing job!

    • @Q_QQ_Q
      @Q_QQ_Q 5 years ago

      But he is telling the truth.

  • @ActuallyCirce
    @ActuallyCirce 5 years ago +14

    Lou's vids just keep getting better and I want more of them

  • @Jarvik1234
    @Jarvik1234 5 years ago +21

    ha... this reminds me so much of "Person of Interest"... what an underrated show!

    • @brummii
      @brummii 5 years ago

      It became too formulaic for me after a few seasons.

    • @Jarvik1234
      @Jarvik1234 5 years ago +3

      @@brummii I guess it does get a little formulaic, but if you set that aside, it did an amazing job highlighting the complex themes/philosophy of AI and surveillance with a solid story.

  • @ir0n2541
    @ir0n2541 5 years ago +13

    In the EU, the GDPR is very strict on storing and processing Biometric info; this includes FRT data.

    • @theblinkstykrab3106
      @theblinkstykrab3106 5 years ago +1

      Hey maybe the GDPR is actually good for something

    • @Walzounet
      @Walzounet 5 years ago

      GDPR does not apply to governments...

  • @africanotomotiv
    @africanotomotiv 5 years ago +2

    @loufoglia you are the king!
    This was YUGELY enlightening.
    Thank you.

  • @TheMichpoulin
    @TheMichpoulin 5 years ago +1

    A well-put-together and well-delivered argument, sir. Well done.

  • @wwbaker3
    @wwbaker3 5 years ago +5

    The tech industry isn't just based in the US. There are plenty of highly skilled and capable programmers working abroad who aren't white, in, say, India, Japan, and Korea. Even Silicon Valley has a disproportionate number of East Asian and Indian tech workers working to solve some of these FRT biases. Remember the Nikon camera that was supposedly "racist" because it couldn't detect whether certain Asians blinked during an image capture, due to the shape of their eyes? Well, Nikon is a Japanese company that employed Asian programmers but still ran into that problem.

    • @Q_QQ_Q
      @Q_QQ_Q 5 years ago

      As technology advances, it will overcome this. He gave examples of Indian orphan children. Indians are not white.

    • @Q_QQ_Q
      @Q_QQ_Q 5 years ago

      Btw, Silicon Valley is 84% white. It's just fake propaganda that Asians and Indians are doing everything in Silicon Valley.

  • @jungminlee197
    @jungminlee197 4 years ago

    Glad I discovered this channel! This was insightful coverage of FRT :)

  • @StevenFarnell
    @StevenFarnell 5 years ago +2

    The Washington State Senate just passed a bill that, if it passes in the House, would limit government use of facial recognition software by requiring a warrant or a reasonable belief of immediate harm. I think the actual language in the bill might be different, but it helps define a narrow range of times this can be used and explicitly bans its use during public protests. They also set limitations on advertising that uses biometric data, and GDPR-like terms for storage of personal data.

  • @TheBoagboy
    @TheBoagboy 5 years ago +56

    It is way too OP, it needs to be nerfed tbh.

    • @adamkarlb6329
      @adamkarlb6329 5 years ago +2

      TheBoagboy How can it be nerfed in fair play?

  • @grezz247
    @grezz247 5 years ago

    That was absolutely perfect: Well paced, well informed, and balanced. Thank you.

  • @marcelloascani
    @marcelloascani 5 years ago

    Great work!

  • @3rkid
    @3rkid 5 years ago +7

    Yeah I don't trust cops with this tech at all.

    • @ZFlyingVLover
      @ZFlyingVLover 5 years ago

      Cops have been more accountable and law-abiding since they started wearing body cams on a day-to-day basis. A lot of fake reports of racism have been debunked too, when the accusers didn't realize a body cam was recording the incident.
      Personally, I despise cops because it seems that a lot of them are power-hungry nobodies, but I feel for the honest law enforcement officials and I want them to be protected and able to do a better job too.
      Too many liars in the world, especially in the U.S., because people love to file lawsuits against other people or organizations with means. It's not about what's true anymore but rather how much money the target has available to settle. smh.

  • @tOmzz4video
    @tOmzz4video 5 years ago

    Fantastic report, as always!

  • @kurtn17
    @kurtn17 5 years ago

    These videos are always great

  • @APVHD
    @APVHD 5 years ago

    Another great video

  • @hythapea13
    @hythapea13 5 years ago +5

    I don’t think I’ve ever watched a video with no views until today

  • @nataleo9508
    @nataleo9508 4 years ago

    This channel makes concepts and issues that are hard to grasp with the usual emotion-led, biased source reports finally clear and easy to understand, with plenty of opposing viewpoints to navigate between! I've been binging Beme's videos all week lmao, keep up the great work

  • @darkangle2000now
    @darkangle2000now 5 years ago +4

    Woah, you guys are still alive! Good one!

  • @MrAdhs11
    @MrAdhs11 5 years ago

    Very good channel
    good for you

  • @pwnjitsu
    @pwnjitsu 5 years ago +2

    I am so fart, i am so fart F R T I mean F A r t.

  • @EdLrandom
    @EdLrandom 5 years ago +1

    It's funny how the tech industry often advertises that if you have nothing to hide, you have nothing to fear. But at the same time, they often build whole businesses on closed-source software and hardware.

    • @minktronics
      @minktronics 5 years ago

      I mean, closed source development is (generally) to do with protecting your work from other businesses

  • @MyAheer
    @MyAheer 5 years ago +2

    liked before even watching.

  • @derrickwillis171
    @derrickwillis171 5 years ago +10

    Yep it's so powerful it punched me in the mouth.

  • @r6k8n99
    @r6k8n99 5 years ago

    This is the best news show.

  • @DunnickFayuro
    @DunnickFayuro 5 years ago +2

    Let's not forget about gait recognition technologies too. A face is "somewhat" easy to hide/dissimulate. Gait is a bit trickier.

    • @ZFlyingVLover
      @ZFlyingVLover 5 years ago

      You must've seen that on FBI or NCIS: New Orleans recently. lmao. Yes, that could work well when the target's face can't be seen.

  • @tiannajohnson1752
    @tiannajohnson1752 5 years ago

    Why did this video feel longer than it really was

  • @DanielZorroF
    @DanielZorroF 5 years ago +4

    Beme should do a collaboration with Mozilla's IRL

    • @Q_QQ_Q
      @Q_QQ_Q 5 years ago

      What's that?

  • @death-disco
    @death-disco 5 years ago +1

    FRT... I couldn’t unhear “fart”

  • @HeroGambit
    @HeroGambit 5 years ago +1

    They have to implement GDPR for people from Europe :))))) in stores or anywhere they are going to use FRT

  • @alexandarmakxmov
    @alexandarmakxmov 5 years ago

    When Lou talks, I listen, simple as that...

  • @gnothseed8135
    @gnothseed8135 5 years ago

    Hey Lou! What's up?

  • @izzywtheflix
    @izzywtheflix 5 years ago

    Hey Lou here’s the thing

  • @ToriKo_
    @ToriKo_ 5 years ago

    Good video

  • @blackkissi
    @blackkissi 5 years ago

    I was waiting for the phrase "shit is going down in..."

  • @lateblossom
    @lateblossom 21 days ago

    Late to the party, but if you read this, look up the TV show Person of Interest. Amazing show, deals with all of this.

  • @davidgoodwin4148
    @davidgoodwin4148 5 years ago +2

    The answer is: be a juggalo all day, every day

  • @citizen4843
    @citizen4843 5 years ago +1

    It's good there are so few cameras. Now I can steal your laptop while you're in the bathroom.

  • @LaMarqueLP
    @LaMarqueLP 5 years ago +1

    1984

  • @olegpetelevitch4443
    @olegpetelevitch4443 5 years ago

    SPOT ON MATE !

  • @kwii22789
    @kwii22789 5 years ago +1

    IS FACIAL RECOGNITION TECHNOLOGY ASSUMING MY GENDER?? **TRIGGERED**

  • @yonatanofek4424
    @yonatanofek4424 5 years ago +1

    +BEME news Hey BEME, I sent my DNA sample (cheek swab) to what eventually turned out to be a scam. Any content planned (or already out) about this sort of risk?

  • @fluxnfiction5559
    @fluxnfiction5559 5 years ago

    Lincoln-Douglas debate, 2010 TOC, lol, but that was a DNA database, not a face database.

  • @beatrizmedeirosnoleto9391
    @beatrizmedeirosnoleto9391 5 years ago +2

    There is a problem you didn't mention. If the prosecution is allowed to use only images to prove guilt, then there is nothing to stop innocent people from being convicted with deepfake videos, which are indistinguishable from real ones.

  • @drunkcat1713
    @drunkcat1713 5 years ago +6

    So it's in India too? .....shiEt

    • @sunnyhaladker5748
      @sunnyhaladker5748 5 years ago

      Yeah I was like dammnn

    • @Q_QQ_Q
      @Q_QQ_Q 5 years ago +2

      @Sunny Haladker Mate, in India the BJP party is running on data science and AI. It built a Rs 1,200 crore head office in New Delhi. Go look at it.

  • @itscharliangel
    @itscharliangel 5 years ago

    Hey it's Lou

  • @SusanDianeHowell
    @SusanDianeHowell 5 years ago

    “And now behold, I ask of you, my brethren of the church, have ye spiritually been born of God? Have ye received his image in your countenances? Have ye experienced this mighty change in your hearts? Do ye exercise faith in the redemption of him who created you? Do you look forward with an eye of faith, and view this mortal body raised in immortality, and this corruption raised in incorruption, to stand before God to be judged according to the deeds which have been done in the mortal body? I say unto you, can you imagine to yourselves that ye hear the voice of the Lord, saying unto you, in that day: Come unto me ye blessed, for behold, your works have been the works of righteousness upon the face of the earth? Or do ye imagine to yourselves that ye can lie unto the Lord in that day, and say-Lord, our works have been righteous works upon the face of the earth-and that he will save you? Or otherwise, can ye imagine yourselves brought before the tribunal of God with your souls filled with guilt and remorse, having a remembrance of all your guilt, yea, a perfect remembrance of all your wickedness, yea, a remembrance that ye have set at defiance the commandments of God? I say unto you, can ye look up to God at that day with a pure heart and clean hands? I say unto you, can you look up, having the image of God engraven upon your countenances?”
    - Alma 5:14-19, The Book of Mormon

  • @Gilotopia
    @Gilotopia 5 years ago +3

    My main field of research is artificial intelligence, and there are a few misconceptions about how facial recognition works. You almost touched upon them at minute 5, but it goes a bit deeper than that. I see these kinds of errors around the media, especially when it comes to the Chinese systems. I know you guys are a bit more interested in being right about the tech than regular outlets, so I'd be glad to explain how facial recognition does and doesn't work. What's concerning is that these misunderstandings about how FRT works may even lead to ineffective regulation. Facial recognition is usually only the spear tip of these tracking systems, but everyone is focusing on it while ignoring all the other tracking going on behind the scenes.
    If you're interested in my explanation of how things work, let me know.

  • @gamer620496
    @gamer620496 5 years ago

    Lou!

  • @TylerVanAllen
    @TylerVanAllen 5 years ago

    Go Cuse

  • @bettytureaud
    @bettytureaud 5 years ago +1

    Try mixing facial recognition with 5G and digital price labels

  • @phynx2006
    @phynx2006 5 years ago

    Hey Lou .... we see you, hahaha

  • @xWood4000
    @xWood4000 5 years ago

    Why does the NSA seem to think that wholesale internet surveillance is the way to go? It's common sense not to do that, because not everyone is a suspect. The NSA servers cost a lot too.

  • @Arab_Jew
    @Arab_Jew 5 years ago

    Why are you asking these questions

  • @NicholsMax
    @NicholsMax 5 years ago

    couldn't get past FRT being close to FART, ha

  • @fredcastaneda3267
    @fredcastaneda3267 5 years ago

    Guatemala flag!!

  • @8draco8
    @8draco8 5 years ago

    As a tech guy I just want to explain what happened to Joy Buolamwini, mentioned at 7:27. Her computer or software was not racist, as is implied. For facial recognition she used a normal laptop camera without depth sensors. Having only a flat image, the software has to recognize all the details of the face (placement of eyes, shape and placement of nose, lips, ears, etc.). Without depth sensors, the software relies on shadows and how light behaves on the surface of the face. It just happens that shadows and light reflections are less recognizable on dark skin. So yeah, it's not a racist algorithm, it's a feature of her skin. That's why, in order to make FaceID work in iPhones, Apple had to put so many sensors on the front of the phone; they use IR sensors to scan the exact shape of the face, not only guess it from the picture.

  • @OVXX666
    @OVXX666 5 years ago +2

    just wear an edgy mask and call it fashion

  • @barackobama6067
    @barackobama6067 5 years ago +1

    The government and some corporations have the ability to read your mind in real time. They've had it for decades. People know this but don't care; literally everyone who knows about it stays quiet. Wonder why? I mean, you reading this already know why, but you act like you don't, because they, like you, fear that they'll get their minds read. Then thought crimes become real and everyone around you becomes the thought police.

  • @nielswitberg
    @nielswitberg 5 years ago

    9/10 times the innovators/tech department just make some shit. It is always up to the customer to figure out how to use it.

    • @theopuszkar
      @theopuszkar 5 years ago

      But that's exactly what's problematic about this technology: who will decide if and how it will be implemented? Oftentimes the customer doesn't get to decide, don't you think?

  • @jobbvrolijk
    @jobbvrolijk 5 years ago

    Don’t say FRT too quick! 💨

  • @thijmenstar7832
    @thijmenstar7832 5 years ago

    This show is underrated, Lou and his team are doing a great job

  • @yux.tn.3641
    @yux.tn.3641 5 years ago

    Though it exists in China, it's really only in the major cities, and even then only in the important parts of the city... The West goes on about the social credit system in China, but in truth it's not even centralized, and it only works if you use Alipay.

  • @CybershamanX
    @CybershamanX 5 years ago +1

    I have predicted that in the future people, likely those pesky and rebellious kids, will figure out how to use makeup to screw with facial recognition software. I'm talking crazy designs on the face which will attempt to confuse the technology. Mark my words. Keep your eyes peeled for crazy face patterns becoming fashionable. ;)

    • @CybershamanX
      @CybershamanX 5 years ago +1

      I can easily then see how police will start harassing kids with such makeup patterns. ;)

  • @XavierZara
    @XavierZara 5 years ago

    Finding missing people with this technology is not an excuse to use it. Some people just don't want to be found and that should be okay

  • @robertfalbe3054
    @robertfalbe3054 5 years ago

    I'm Robert reed falbe 3 and I am being harassed in Lake forest california. Please help. I was born in Centralia ill. 45 years of age.

  • @fpbrazz
    @fpbrazz 5 years ago

    I'm curious why facial recognition doesn't work well on people of color. The video suggests developer bias as a possible explanation, but I believe, knowing a bit about how the technology works, that it might be due to the way face recognition works. These algorithms rely heavily on points of high contrast in the person's face. In this case, dark eyebrows on white skin will be easily distinguishable, while on darker-colored skin they would be a challenge. Although the statistics on the predominance of a certain type of people in the software engineering world could explain the issue, we should be careful when jumping to conclusions...

    • @S2Tubes
      @S2Tubes 5 years ago +1

      Two reasons: first, the data that has been fed into them is mostly white people. Second, as you said, the color range. The darker the skin, the less contrast. It has nothing to do with bias or the people "teaching" the AI. It is 100% data. If dark-skinned people fed it the same data, it would come back with the same results.

  • @drunkcat1713
    @drunkcat1713 5 years ago

    Hey, it's Lou and some wild shit is going on. FaRT

  • @baconninja4481
    @baconninja4481 3 years ago

    We all know how George Orwell warned us about this. Are we sure we want Big Brother to become reality?

  • @thefattyfatty1
    @thefattyfatty1 5 years ago

    "Cohmpiss"

  • @DanNguyen-fu9hn
    @DanNguyen-fu9hn 5 years ago

    Fart... heh... Frt...

  • @LordOfDays
    @LordOfDays 5 years ago

    The blurriness at the beginning makes this an Oscar worthy film.

  • @RoloTomase
    @RoloTomase 5 years ago

    Well, the same arguments he makes against the cameras and software are very much like those against red flag laws. It's a slippery slope; once they go down those roads there is a real danger of misuse.

  • @froozynoobfan
    @froozynoobfan 5 years ago

    AI is not made racist or sexist by design, but if your training dataset consists of a mostly white and/or male population, it will become biased. Privacy is very important and data is the new oil. People assume large organizations can't do much with your data, but that is not true; organizations like Amazon and Google aim to get as many sales and ad views as possible, so they (ab)use your data in order to make you buy more without you even knowing.

  • @leviroberts1884
    @leviroberts1884 5 years ago

    Machine learning algorithms don't adopt the biases of the computer programmers; they adopt the biases in the training data-sets. If your facial recognition training set is disproportionately white, then the learned algorithm will disproportionately favor higher performance on white faces, which would happen if the data-set was collected in America (where white people are the majority population).
    But that's not to say that racism can't play a role. If we were to train an algorithm to identify known felons and most felons happen to be black (due to generations of racist policies), then you could expect the false-positive rate for black males to be much higher than the false-positive rate for white males. From this perspective, as we uncover these issues of bias in our learned models, we're really just proving the existence of a bias in the world that our data-set came from.
    This is the biggest problem with machine learning today. Our world is filled with biases, and it's very difficult to identify and condition on all of them. If we're not careful in both adopting and designing these technologies, we could end up inadvertently making these problems much worse.
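
To make the data-set point above concrete, here is a deliberately toy sketch with entirely invented numbers: a single decision threshold tuned on pooled scores that are 95% group A lands near group A's optimum, so group B's error rate comes out far higher, even though nothing in the code mentions either group.

```python
import numpy as np

rng = np.random.default_rng(42)

def scores(n, match_mean, nonmatch_mean):
    """Toy 1-D 'similarity scores' for n genuine and n impostor comparisons."""
    x = np.concatenate([rng.normal(match_mean, 1.0, n),
                        rng.normal(nonmatch_mean, 1.0, n)])
    y = np.concatenate([np.ones(n), np.zeros(n)])
    return x, y

# Invented setup: group A supplies 95% of the training comparisons, and group B's
# genuine scores sit lower (say, noisier captures), so its best threshold differs.
xa, ya = scores(9500, match_mean=2.0, nonmatch_mean=-2.0)   # group A
xb, yb = scores(500, match_mean=0.5, nonmatch_mean=-2.0)    # group B

# "Training": pick the one global threshold that minimises error on the pooled,
# mostly-group-A data.
x, y = np.concatenate([xa, xb]), np.concatenate([ya, yb])
cands = np.linspace(-3.0, 3.0, 601)
t_global = cands[int(np.argmin([np.mean((x > t) != y) for t in cands]))]

def error(xg, yg, t):
    """Fraction of comparisons the threshold gets wrong for one group."""
    return np.mean((xg > t) != yg)

print(f"global threshold          : {t_global:+.2f}")
print(f"group A error             : {error(xa, ya, t_global):.3f}")
print(f"group B error             : {error(xb, yb, t_global):.3f}")   # much higher
print(f"group B at its own optimum: {min(error(xb, yb, t) for t in cands):.3f}")
```

Swapping the group proportions flips which group the threshold favors, which is the sense in which the bias lives in the data rather than in the code.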

  • @ParadoxdesignsOrg
    @ParadoxdesignsOrg 5 years ago

    F.A.R.T

  • @khalidmohamud845
    @khalidmohamud845 5 years ago

    Unbox Therapy?

  • @ConorDrew
    @ConorDrew 5 years ago

    "If we build it, they will surveil, so we won't build it." I feel that's wrong. If you are the person who can see the pitfalls and the dangers, you should build it; if not, someone else with less morals will come and build it.

  • @donhalley5622
    @donhalley5622 5 years ago

    Here's my comment: I'm giving this one a 5 out of 10. All over the place. It works too well. It doesn't work. It's prejudiced against minorities. It doesn't work well on minorities. If you want to give me legitimate reasons to want it regulated, you're going to have to cite some different sources than the ACLU, San Francisco, or EFF. Find missing children, terrorists, rapists, bank robbers, etc.? I'm OK with that. Find out what I like, where I go, and who I hang out with? Guess what - I'm OK with that as well. It facilitates false arrests - and MURDER? (Oh yeah, no bias there.) Couldn't it more often prevent a false arrest or help reduce the tension in a police confrontation? You want your privacy? Guess what - you're too late.

  • @octaviano7360
    @octaviano7360 5 years ago

    There is an article in the New Yorker: the man who never forgets a face. They explained FRT to be a sham in comparison to people with a special gift...

  • @Xafpunk
    @Xafpunk 5 years ago

    Thumbs up if you kept thinking fart

  • @MrGERiarza
    @MrGERiarza 5 years ago +1

    If the accuracy is high enough, it's kind of stupid for the police not to use it on their live cams. How many times have people been stopped by the police and let go, and years later it turns out the police could have caught the perpetrator all along?
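
One caveat on "if the accuracy is high enough": when a system scans very large crowds against a short watchlist, even small per-face error rates produce mostly false alerts. A back-of-envelope illustration, with every number below hypothetical:

```python
# Hypothetical figures: 1,000,000 faces scanned in a day, 100 of them actually
# on the watchlist, a 99% true-match rate, and a 0.1% false-match rate per face.
scanned = 1_000_000
on_watchlist = 100
true_match_rate = 0.99
false_match_rate = 0.001

true_hits = on_watchlist * true_match_rate                  # ~99 real matches
false_hits = (scanned - on_watchlist) * false_match_rate    # ~1,000 false alerts

precision = true_hits / (true_hits + false_hits)
print(f"alerts raised              : {true_hits + false_hits:,.0f}")
print(f"chance an alert is genuine : {precision:.1%}")      # roughly 9%
```

So high accuracy helps, but at crowd scale the practical question becomes how those false alerts are handled.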

  • @ab-eu9po
    @ab-eu9po 5 years ago

    F.(A).R.T

  • @MrFlexNC
    @MrFlexNC 5 years ago

    We invented fire and came out just fine; we'll survive some code

    • @theblinkstykrab3106
      @theblinkstykrab3106 5 years ago

      Bad comparison

    • @MrFlexNC
      @MrFlexNC 5 years ago

      @@theblinkstykrab3106 that's the point

    • @Q_QQ_Q
      @Q_QQ_Q 5 years ago

      This is worse.

    • @Q_QQ_Q
      @Q_QQ_Q 5 years ago

      Btw, there are people who still worship fire in fire temples.

  • @sani9238
    @sani9238 5 years ago

    FRT stinks

  • @doodlexenosinfopinion6107
    @doodlexenosinfopinion6107 5 years ago

    Facial racial, not recognition. This is not creative or innovative.

  • @Kakazumba99
    @Kakazumba99 5 years ago

    You have to be joking with the AI bias because of white men. I am pretty sure you knew a bit about AI; did you drink some mad juice before this video?

  • @patrik5123
    @patrik5123 5 years ago

    10:10 It's interesting that a country like the US is behind certain countries in Africa when it comes to discrimination...

  • @repker
    @repker 5 years ago

    I feel like people who claim algorithms are racist or whatever have never seen/implemented/used an algo. It's like saying a car that mowed down a bunch of minorities is racist. It's the input (the driver, the human, that is), not the machinery or math.

  • @kennefvi
    @kennefvi 5 years ago +1

    Lou, it's "facial automatic recognition technology"