Episode Transcript
Transcript is AI Generated, and there are definitely errors.
David: [00:00:00] Welcome to the Security Serengeti. We're your hosts, David Squier and Matthew Keener. Stop what you're doing, subscribe to our podcast, leave us an awesome five-star review, and follow us at SerengetiSec on Twitter.
Matthew: Today we're here to talk about a book that we think has implications for cyber security, and hopefully provide some insight, analysis, and practical applications that you can take into the office to help you protect your organization, or your own personal privacy.
David: And as usual, the views and opinions expressed in this podcast are ours and ours alone, and do not reflect the views or opinions of our employers.
Matthew: Did you hear about the facial recognition demonstration where they compared mug shots to the 535 members of congress and found 28 criminals?
David: Only 28? So that means that it's only 5 percent accurate?
Matthew: That sounds about right.
So today we are talking about a book called Your Face Belongs to Us, which I keep mistakenly thinking is called All Your Face Belongs to Us.
David: Which it should have been called.
Matthew: I know, they really missed out. Yeah.
David: So in summary: a gay, smart, naive, mixed-race, Vietnamese Australian man [00:01:00] moves to America and gets suckered in by evil and wily conservative capitalists who use his love of technology and a challenge to create a corporation that wants to destroy privacy as you know it. By the end of the book, he is less naive, but still wants to do good.
Matthew: Having finally turned his back on conservatism.
David: And Donald Trump. It's the feel-good story of the year.
Matthew: I feel like... well, never mind. So, a little more seriously: this is a book on the history of facial recognition in general, and then it focuses more specifically on Clearview AI, the company that decided to monetize it. That's kind of the through line.
In terms of the history of facial recognition, I think this makes up about half the book. We're going to try and skip through it in about five minutes.
Because, honestly, it's boring.
David: I think it's six minutes and fifteen seconds.
Matthew: Very precise. Alright, I'm going to hit the stopwatch. Go! Ha ha ha. So, starting back in the 1800s, contemporary with Charles Darwin and the beliefs of that time, there was something called physiognomy, where people believed that facial features and facial measurements could tell you what a [00:02:00] person was like.
Lots of people at the time believed that criminality and other traits were inheritable. I recently saw a book called Sins of the Fathers that made a similar claim, that there are certain criminal families, although that's probably more culture than inheritance. But at least one of the three founders of Clearview AI at least partially believed in physiognomy. Apparently they passed some stories back and forth on it. But I don't think it really applies, so I'm dropping that thread now.
David: Well, you could say it's almost kind of true, considering that part of your personality is dictated by your genes, and if your parents were horrible people, then you're probably going to be a horrible person.
Matthew: Goddammit, I'm doomed.
David: You're in good shape, then.
Matthew: My parents were awesome. Yeah. Alright, so early approaches in the 80s used digitization. They tried to turn the face into a series of measurements. Sometimes those were actual ruler measurements; sometimes it was turning the face into pixels and measuring how bright each pixel was.
These could be [00:03:00] somewhat successful when the pictures were all similar: all facing the camera, for example, same skin color, same lighting.
David: Right, because if you're talking about shading, and the lighting is imperfect, or not the same in one picture versus another, the shading is going to be completely off. So that's obviously where you'd have some challenges.
Matthew: Yeah. Then someone in the 90s created the Eigenfaces concept, by assuming that any face is a mixture of other faces. That's what most of the algorithms from the 90s through 2010 or so were based on. That's how mine was made. It's a mix of two faces.
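To make the Eigenfaces idea concrete, here is a minimal sketch in Python, assuming NumPy and scikit-learn and using made-up pixel data (none of this is from the book or any real system): every face is projected onto a small set of learned basis faces, and matching is just distance between the mixing weights.

```python
# Minimal eigenfaces sketch. The training data is random noise standing
# in for flattened grayscale face photos; all numbers are illustrative.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
train_faces = rng.random((200, 64 * 64))  # 200 fake 64x64 "faces"

# PCA learns the principal components of the training set; for face
# data these components are the eigenfaces.
pca = PCA(n_components=50)
pca.fit(train_faces)

# A new face becomes a 50-number vector of mixing weights: literally
# "this face is a mixture of other faces."
new_face = rng.random((1, 64 * 64))
weights = pca.transform(new_face)

# Matching is nearest-neighbor distance in weight space.
train_weights = pca.transform(train_faces)
distances = np.linalg.norm(train_weights - weights, axis=1)
print("closest training face:", distances.argmin())
```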
David: So in 2001 they tried to use facial recognition at the Super Bowl, which the press dubbed the Snooper Bowl.
Matthew: It was super, super big success, right? They caught thousands of criminals.
David: Yeah... no. They caught, I think...
Matthew: Like seven or something? Or fourteen?
David: Well, I guess we should have written it down. I think it was like five. We'd rather just...
Matthew: ...throw out numbers with no basis in reality.
David: Yeah, but how many people attended the Super Bowl? What, a hundred thousand?
Matthew: Or something like that.
David: Five out of a hundred thousand, that's a [00:04:00] pretty good rate. But the same kind of concept has also been used at casinos. And it's not surprising at all that casinos would use it for card counters.
Matthew: Which isn't illegal, but...
David: Yeah, it's highly frowned upon. If you're a very successful gambler, casinos don't like you. They tend to chuck you out of their establishments.
Matthew: Yeah.
David: So, you know, while it was kind of helpful, it really wasn't good enough at the time to be used in any large-scale way.
Matthew: Then in 2011, there was a government meeting put on by the FTC called Face Facts. Quite witty, I know, right? It was about facial recognition, and generally speaking, the output of the conference was that everybody was aligned, at least publicly, that companies should not release facial recognition software. But unfortunately, this conference did not produce any regulation. It was just a 30-page report of best practices. Which, as we all know, reports on best practices are commonly followed and listened to all the time.
David: Yeah. And since it was produced by the government, was it really best practices?
Matthew: Ha, yeah. In 2014, the FBI chose the best performing option to search their mugshots database. It was only moderately accurate. Here's a quote: technical documents warned that if a searched person's mugshot existed in the database, it would appear among the top 50 candidates only 85 percent of the time.
The top 50 candidates. Yet...
David: And mugshots are, like, pretty high quality images.
Matthew: ...when a search was run on the system, it returned just 20 candidates by default.
David: Wow.
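A quick aside on what that statistic means: the 85 percent figure is a rank-k accuracy claim. A toy computation in Python, with entirely invented numbers rather than real FBI data, shows why a UI that returns only 20 candidates undercuts a figure quoted for the top 50.

```python
# Toy rank-k accuracy computation; every number here is invented
# purely to illustrate the concept, not taken from the FBI system.
import numpy as np

rng = np.random.default_rng(1)
n_searches = 1000

# Pretend each search ranks a 5,000-mugshot gallery and the true match
# lands at some rank; a real matcher is far better than random.
true_ranks = rng.integers(1, 5000, size=n_searches)

for k in (20, 50):
    hit_rate = np.mean(true_ranks <= k)   # fraction of searches where
    print(f"rank-{k} accuracy: {hit_rate:.1%}")  # the match is in top k

# Rank-20 accuracy is always <= rank-50 accuracy, so a UI showing only
# 20 candidates cannot deliver a hit rate quoted for the top 50.
```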
Matthew: Oh, man. Yeah. Then in the 2010s, neural networks blew away the other methods of face matching. And that kind of brings us up to...
David: The great evil.
Matthew: The great evil: Google.
David: So in 2009, Google released Google Goggles.
Matthew: I remember those.
David: You would take pictures with it and search based on the picture. It didn't do that great. [00:06:00] At some point, Eric Schmidt was interviewed and admitted to withholding the facial recognition technology. So basically, they were not going to move forward with it.
Matthew: Which is interesting, because, and we're going to say this with all of these, what makes Google back off? Google goes through your email and harvests it for data about you to serve ads, but they thought facial recognition was too far?
David: My speculation would be that they couldn't figure out how to monetize it.
Matthew: Hmm.
David: That is probably why they stopped.
Matthew: Like, it pops up in your Google Glass or whatever: do you want to know who this is? 50 cents.
David: Tell me! Yeah, microtransactions with your Google Glass.
Matthew: And Facebook, in 2012, bought Face.com and was creating their own algorithm based on that intellectual property.
David: Oh, you know, speaking of Google Glass, that just reminded me... everything goes back to Daniel Suarez and Freedom. Where he goes to that place in Hong Kong...
Matthew: To craft the artifact?
David: Or is it [00:07:00] Taiwan? No, I think it's Hong Kong. Where he crafts the artifact that erases him from all images.
Matthew: Ah.
David: So someone clicks on you and pays, you know, 50 cents to see who that person is, and it's like, oh, nope, you've been denied, because they paid 75 cents not to be included, or whatever.
Matthew: You know what would be interesting, actually? You know how on Google Maps you can go to Street View and look at people's houses, and people can request their houses be removed from Street View, which they talk about a little bit in the book, and it's just a blur?
David: Yeah.
Matthew: And I bet there were repercussions for that; people would egg those houses and stuff. But I wonder, once we get full-on AR glasses, whether people could appear as a blur like that, and you'd have to take the glasses off to see their face.
Oh, it would be funny if they just had the black censor band across the eyes. That would be hilarious. Oh, or even better...
David: ...just, like, a meme face over top of it.
Matthew: That's [00:08:00] how they monetize it. You get to pick your own face that appears in the thing.
David: So when you look at Matt, he's got a unicorn head.
Matthew: Dammit, don't, someone else will get it before me.
David: Well, you pay 75 cents for the unicorn head. You want a better one, you gotta pay a dollar.
Matthew: It's like ranks of items. 50 cents gets you the gray rank, the boring smiley-face head. 75 cents gets you the uncommon one. And if you pay 50 dollars, you get the epic one, with sparkles and fireworks going off behind you.
David: Oh man, this totally reminds me of Ghost in the Shell: Stand Alone Complex.
Matthew: I haven't seen it.
David: I mean, Ghost in the Shell is fantastic, but Stand Alone Complex was the anime Ghost in the Shell TV series.
Matthew: Oh, I watched the one with Scarlett Johansson, that one was so good. I'm just kidding.
David: You're dead to me.
Matthew: Hahaha.
David: But there's a [00:09:00] plot line in there about a super hacker. I cannot remember his name for the life of me.
Matthew: Crash Override? Zero Cool?
David: In any images of him, his face is covered up by this logo. In anybody's images, in their eyes, because he hacks everybody.
Matthew: Oh, because it's built into the eyes. So you can't just take the glasses off.
David: Right, he hacks people's eyes. Anytime you see him in video footage, or you actually look at him, you get that logo instead of his face.
Matthew: I have to watch that. That sounds interesting.
David: It's fantastic.
David: Like I said, Ghost in the Shell: virtually all the movies. Number two is not that great.
Matthew: But Ghost in the Shell: Stand Alone Complex, Solid State Society, both seasons of Stand Alone Complex are really good. The latest one, the CGI one, eh, it's okay. Not nearly as good as the older stuff.
David: But still, Ghost in the Shell is fantastic.
Matthew: I hadn't thought about that. Because yeah, we're going to glasses next, but eventually it will be built into your eyes. And it could show somebody as a blurry mess, or erase them from [00:10:00] your vision completely. Someone could literally walk around like a superhero. We'll talk about that a little later, because in China, VIPs do not show up on the facial recognition. I guess it's not later now.
David: One of the investors in Clearview AI explicitly stated that his goal in investing was not that he wanted it to succeed, but to make sure that his own face was not in the recognition database.
Matthew: Yep. Okay, so Facebook. They attended a privacy hearing in Congress, and they were asked if anyone could download all the photos from Facebook to add to a facial recognition database. They said there's no way that could happen. Meanwhile, their own algorithm was ready for prime time in 2017, but they used it for tagging people in photos. They ended up backing off after a lawsuit in Illinois. Illinois is one of the only states that has a law against this, and we'll talk about that in more detail later. They paid a 650 million dollar settlement and dropped the facial recognition stuff.
David: And of course, that brings us to the third big major tyrant. [00:11:00]
Matthew: You mean giant.
David: Well, you could call it one of the FAANGs, I guess.
Matthew: Right.
David: So in 2017, Amazon created and released Rekognition. With a K, which is witty.
Matthew: That reminds me of Rekall in the Schwarzenegger movie, Total Recall. Didn't Rekall have a K? I don't remember. Never mind.
David: But the ACLU attacked Amazon for that.
Matthew: What's interesting, though, is that Amazon was only selling the algorithm. They got around the privacy part by making the customer bring their own database of faces. That way they could say: oh no, we're not scraping your stuff, we're not the ones taking your face.
David: We're just facilitating the use of that data.
Matthew: Yeah.
David: Which is actually, of the three, probably the least terrible, to be honest. But this is what resulted in our joke at the top of the episode, where they ran the mug shots against the members of Congress [00:12:00] and found the 28 matches. But of course, the police thought that facial recognition was actually better than eyewitness memory, which is probably right, actually.
Matthew: People's memory is terrible.
David: Yeah. And there's the whole thing about being influenced by the image that you're shown, making you think that image is what you've seen before, instead of actually comparing it with what you did see. I forget what the term for that is.
Matthew: I actually realized there's a joke we could have done here, where you have to bring your own faces. I'd be like, I've only got 10 faces in my basement.
David: Oh, I was trying to think: Jeffrey Dahmer would end up with the best algorithm. Or, who was it the Texas Chainsaw Massacre is based on?
Matthew: Yeah, the most faces. Ed Gein.
David: Yeah, Ed Gein would have the best algorithm.
Matthew: The best database. That's horrifying. Alright. So Clearview AI had three founders originally:
Charles Johnson, [00:13:00] Hoan Ton-That, and Richard Schwartz. I'm not going to get into their backstories. Charles Johnson was originally kicked out because he wasn't contributing much.
He was too busy with a divorce, apparently. Can't believe him.
David: He basically started the initial conversations.
Matthew: He was a connector.
David: Yeah, he got everybody linked together at the start.
Matthew: Yeah, Hoan Ton-That was the techie, and then Richard Schwartz had the connections.
David: Yeah, he was the political operative.
Matthew: The political operative. Alright, so it started off as a social media report tool called SmartChecker. You enter an email address or a handle, and you receive a report about who that person is online. The book states it was originally used to find out if people were conservative or liberal, based on who they associated with online, and it was used to screen folks who attended the DeploraBall, which I think is the wittiest name I can think of, playing on the "basket of deplorables."
David: Hillary should have asked for a commission. Like five percent from that. [00:14:00]
Matthew: So there were multiple left-leaning agitators who tried to attend in order to protest, but they were caught out, because apparently one of them put her real email address in the thing.
David: Like what, at democraticparty.com or something like that?
Matthew: No, they tracked the email address to her social media and found that she followed a bunch of Antifa accounts. So she actually showed up in a deck which they presented to Hungary; they were trying to sell Hungary the tool. This one protester: they had three photos of her, nine of her social media accounts, two leaked passwords, and a statement that she was following Antifa-related accounts. So they declared her to be a Democrat, or a progressive, or a leftist, or something. Or unwanted.
David: She's deplorable. To them.
Matthew: Yes.
David: Alright, so they tried to sell it as an instant background review for individuals attempting to cross the border.
Matthew: God, can you imagine if they did that? Right now, the [00:15:00] border is a rights-free zone where they can look at your laptop and your phone.
David: Yeah, within a hundred miles of the border.
Matthew: Within a hundred miles of the border. But can you imagine if they could just do that with your face? Not even having to ask you to log into your computer or whatever; they just check it. They're like, oh, huh, this guy said the U.S. sucks. Get the fuck out.
David: Well, I mean, they could still do the same thing with your name.
Matthew: Well, but it's not easily connected to all of your accounts. This way, it's all connected.
David: Well, if you have your face attached to your accounts, they get your name, then they can find your email address from your name, and then link that email address to your accounts. It just makes it easier. It doesn't necessarily make possible something that wasn't possible before.
Matthew: Alright, fine.
David: Yes. Give in to my logic.
Matthew: Do the next one?
David: They decided to add faces to the search and try to match people's social media that way. So, Ton-That... [00:16:00] or Ton-Tat?
Matthew: I think it's Ton-Tat. I think you're right, I think I mispronounced it before. I'm going to misgender him next.
David: Ton-That, the coder from Australia, found a facial recognition library called OpenFace on GitHub that was written at Carnegie Mellon University. And of course, since it's on GitHub, in this instance it's open source. Then they found a mathematician named Terence Liu, who knew machine learning better than Ton-That did, and he improved on what they had leveraged from the OpenFace software. And then they used a library of celebrity faces that Microsoft had actually posted, in order to train the model. And the irony of that is Microsoft released that image library for that specific purpose.
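The book doesn't include code, but here is a hedged sketch of what embedding-based matching looks like, using the open-source face_recognition library (a dlib wrapper, not OpenFace itself, and the file names are hypothetical): each face is reduced to a 128-number vector, and a small distance between vectors means the faces probably belong to the same person.

```python
# Sketch of embedding-based face matching with the open-source
# face_recognition library. File names are hypothetical placeholders;
# this illustrates the general technique, not Clearview's actual code.
import face_recognition

known = face_recognition.load_image_file("known_person.jpg")
unknown = face_recognition.load_image_file("street_photo.jpg")

# Each call returns one 128-dimensional embedding per detected face.
known_enc = face_recognition.face_encodings(known)[0]
unknown_enc = face_recognition.face_encodings(unknown)[0]

# The library's default treats distances under about 0.6 as a match.
distance = face_recognition.face_distance([known_enc], unknown_enc)[0]
print(f"distance {distance:.3f}:",
      "match" if distance < 0.6 else "no match")
```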
Matthew: I'm actually surprised, thinking about it, that Microsoft isn't really mentioned [00:17:00] in this book. Microsoft had to have facial recognition; I'm sure all the big major tech companies were doing facial recognition of some sort. But they weren't really mentioned at all.
David: No, I think that's the only real mention that I remember hearing in the book. That's a good point. I hadn't thought about that.
Matthew: I wonder if she owns Microsoft stock.
David: Hmm. Her husband probably works at Microsoft or something. But anyway, they needed real people's faces to compare to that celebrity database. So where did they get those faces, you ask? Venmo. It's ridiculous. Venmo had a news feed on their main page showing people's faces and their transactions.
Matthew: I don't think it showed the amount of the transaction; I think it just showed that they made a transaction.
David: You know, "paid 100 dollars to hemorrhoidqueen.com," or whatever. So really, Venmo? Financial transactions and people's faces, out there for anyone on the internet to grab, is ridiculous.
Matthew: And [00:18:00] putting them in a feed on the main page. I've seen feeds like that, and I always assumed it was dummy data. But they put the real stuff in. Oh my god, can you imagine going on that website and seeing your name pop up, and being like, wait, what?
David: Well, actually, you know, that reminds me of back during Ron Paul's first campaign. I can't remember if they did this during the second campaign also, but during the first campaign they would do what they called the "money bomb" for Ron Paul, and on the donation site, you'd make a donation and then your name would pop up with how much you donated. There was a scrolling list of everybody donating to Ron Paul's money bomb. Just imagine if there were faces attached to that, scrolling through there at the same time. That would've been crazy.
Matthew: Oh boy.
David: But by scraping the Venmo website every two seconds, they downloaded all the names and faces from Venmo, which gave them 2.1 million faces. And the thing is, if they could have put the dollar amounts on there, or what the [00:19:00] transaction was, they could have also made inferences about that person: how much money they had, their wealth, what they were interested in, all sorts of stuff. A really terrible practice on the part of Venmo. I'm so shocked that that's where they started. Then they moved on to Crunchbase, where they were targeting potential funders; they wanted to impress the folks at Crunchbase with the data they had. And of course, they eventually moved on to LinkedIn and Facebook.
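As an aside, the hit-the-feed-every-two-seconds approach the book describes is just a polling loop. A minimal sketch, with a hypothetical URL and field names (the real Venmo feed and its schema are not reproduced here):

```python
# Generic polling-scraper pattern: fetch a public feed on a fixed
# interval and keep anything new. URL and JSON fields are hypothetical.
import time
import requests

FEED_URL = "https://example.com/public-feed"  # hypothetical stand-in
seen = set()

while True:
    resp = requests.get(FEED_URL, timeout=10)
    resp.raise_for_status()
    for item in resp.json():          # assumes a JSON list of items
        if item["id"] not in seen:    # hypothetical field names
            seen.add(item["id"])
            print(item.get("name"), item.get("photo_url"))
    time.sleep(2)  # the two-second cadence mentioned above
```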
Matthew: Shocking. So SmartChecker died in 2017. We'll talk a little bit about what they were offering, but they apparently tried to offer services to politicians, and they ended up being connected to a white nationalist running for office. So the founders decided they needed a different name and a fresh reputation.
David: So they just swap it out, like Blackwater did.
Matthew: Same company, new name.
David: And we'll be good.
Matthew: So in 2018, they advertised three services. There was Clearview AI Search, which was background checks. Clearview AI [00:20:00] Camera, which sounded an alert when a criminal or an "unwanted" person entered the premises...
David: Drops down a red light and sort of flashes it. Matt walks into the store and it's like, whoop, whoop.
Matthew: Yeah. And the third one is Clearview AI Check-In, used for screening visitors, verifying identities, and doing the SmartChecker thing on their connections. So you can, I guess, figure out if someone coming into your office is trying to fake you out. I can't imagine how often you need to know the background of a person coming into your office, but...
David: Well, they may have been thinking of it as a lead generator. Say, hey, Matt does this for a living, and he's associated with this person who does the same thing, who might also be interested in this kind of service.
Matthew: Yeah, I can see that. Possibly. So apparently Clearview AI is able to ID people in the dark, in the backgrounds of photos, and in hats.
David: In the dark?
Matthew: In the dark.
David: So there's no lights on, you've got just a black square, and it's like, yeah, I know who that is?
Matthew: Not that dark.
David: You mean in a darker, less well-lit photo.
Matthew: [00:21:00] With or without facial hair, and it could even distinguish between relatives. In the book, they specifically called out that it could tell sisters apart. It was later discovered it could even recognize corpses. Shocking.
David: Wait, a corpse still has a face?
Matthew: Well, okay, if it doesn't have a face...
When reviewing the source code, they discovered that it could run on augmented reality glasses, and in fact, at the very end of the book, there's a brief demo of that. And according to the end of the book, they're currently trying to make it also search the background of a photo and reveal the location and what other photos were taken there. That could be useful for the criminal-investigation part. But it'd be weird for... I don't know.
David: So in 2019, they were publicly revealed through a FOIA request that turned up a memo discussing the legal implications of Clearview's technology.
Matthew: And of course it was a government memo which said this is perfectly fine, because they use public images, and it was written as cover for law enforcement agencies, to clear them to [00:22:00] use it, basically.
David: There's an interesting comment that multiple large tech companies were capable of building this, but they backed off. Only Clearview decided to push forward.
Matthew: But after that came out, Facebook, Venmo, LinkedIn, and Twitter all put their lawyers on the case to send cease and desist letters, because that's where Clearview was getting its data.
David: After the horse is out of the barn.
Matthew: They're sending, you know, "stop doing what we said you couldn't technically do." But they didn't do anything other than the cease and desist letters. The lawyers were like, we're good. We've done all we can.
David: You have lawyers on staff, so have them write a strongly worded letter, basically. But I think we mentioned this earlier: there had already been a legal case where LinkedIn tried to prevent people from scraping their site, and LinkedIn lost. The court said it's perfectly legal to scrape. So it was basically too little, too late at this [00:23:00] point.
Matthew: Yeah.
David: But at least what it did was get their app booted from the app store, because they did have an app where you could take a picture of someone and run it through the Clearview database.
Matthew: Yeah. So, for the next section, we're going to talk about law enforcement uses, because this was originally marketed to law enforcement. One of the first law enforcement officers to receive early access to Clearview was investigating fraud. He had lots of pictures from ATMs and surveillance cameras, but no easy way to ID the people. After gaining access, he says he matched 30 suspects very quickly. They didn't give us a ton of data on that, though: no sense of how many searches he ran, or how many of those matches turned out to be the right people.
David: Right, matches don't automatically mean definite criminals. Supposedly it was only meant to be used as a lead generator, and not as actual definitive evidence or proof of guilt.
Matthew: Yeah, we'll talk about some of the mistakes a little bit later. Apparently there's a big banner across the top when you print off the report that says something like: this is not the suspect, this is a lead. You should confirm that they were in the [00:24:00] area and could have committed this crime. So, the New York Police Department was the first major customer it was tested with. Actually, no, they didn't pay for it, but they were the first ones to test it. They identified a pedophile who thought he was sending pics to a 12-year-old girl but was actually sending them to a police officer.
At first it only matched at a 50 percent rate, so they decided they needed more faces. I think this is why they started scraping Facebook and the rest.
David: Yeah. But imagine this thing gets even better: it starts off with a face, then it does total body imaging, and then comparative body types and tattoos, or other recognizable features. Indicators like, do they have a wedding ring on, certain earrings, things like that.
Matthew: Gait analysis, how a person walks.
David: Well, if you have video, sure. But if you only have images...
Matthew: If you only have images. Can you imagine if it could [00:25:00] tell how they walk? Oh, this guy's got a limp. What? You've got a picture. It's a picture of his face. You can't even see his legs.
David: Yes. It's all probabilities.
Matthew: He's got a squinty eye on the left side, so...
David: It's kind of like doing the Sherlock Holmes deductive reasoning. If this, then that, which probably means this.
Matthew: They also talked about voice searching in the future. So they'll be able to identify us on this. Matthew Keener's not my name. They'll find out who I really am.
David: This is not the Matthew Keener you seek.
Matthew: There are a lot of Matthew Keeners out there. I used to Google myself every time I started a new job, just to see what was available online. I found some really embarrassing stuff I posted in high school and went back and deleted it.
David: I still have the logins to those accounts.
Matthew: But now there are so many Matthew Keeners online that I can't even find myself. When I search for Matthew Keener, I don't come up in the first five pages.
David: Well, I don't exist [00:26:00] apparently, so I'm good.
Matthew: Interesting, nice. So the NYPD decided against purchasing it due to the perception of it. So again, yet another major group was like, well, maybe we don't want to do this.
David: Yep. And the FBI was able to use it to find an abuser when the photo matched the man's face in the background of another photo, at a bodybuilding expo in Las Vegas.
Matthew: That was wild. Yeah.
Matthew: You know, my wife and I were talking before this about having to get our picture taken when we flew to Ireland. Before we got on the plane in Ireland, they take your picture, and there's a disclaimer across the bottom that says your picture is immediately deleted after this. And my wife, who is not a government conspiracist at all, she's like, oh yeah, right.
David: Oh boy.
Matthew: Yeah. And when I flew back from Miami, they took my picture at the Miami airport, at the TSA checkpoint. So it wasn't even international. It was before I even went through.
David: You can opt out of [00:27:00] that.
Matthew: Really?
David: Pretty sure.
Matthew: I didn't see anything saying you could. Because I know when they added the body scanners...
David: Yeah, I don't go through those.
Matthew: ...I didn't go through those for years either. And I finally gave in.
David: I get groped every time, for personal reasons.
Yeah. Anyway, at an international conference for child crime investigations in Europe, it was hailed as a success, both because it was great at finding the identity of abusers, and also the kids. Because no facial recognition database...
Matthew: No racial recognition?
David: Facial recognition database. Not racial recognition.
Matthew: You are 1/64th Cherokee.
David: Well, see, now they don't have to do the swatch test anymore. They'd have a swatch of skin colors and just...
Matthew: I thought you said "squanch," like from Rick and Morty. I was like, what?
David: Not the squanch test. The swatch test.
Matthew: The squatch test.
David: So. But no facial recognition database had children's faces in it.
Matthew: Which is actually interesting, because they got all their stuff from LinkedIn, Facebook, [00:28:00] Venmo. Where did they get the children's faces?
David: High school photo databases?
Matthew: Maybe. Yeah, I don't know. It's weird.
David: That would be interesting, though, if the government was scooping up all the school photos.
Matthew: Classmates.com and all that stuff, yeah.
David: Well, I'm just thinking, you know, when I was a kid they'd do class photos every year, right? So what if all of those always went to the government?
Matthew: Interesting, because they're public schools. Yeah. Alright. So, London has a mobile facial recognition unit. They put out a big sign on the sidewalk saying that facial recognition is happening. Which makes me wonder: is this like when you've been drinking and you see a checkpoint, so you stop and turn around to avoid it? Are they watching for people who see the sign saying facial recognition is happening here and go, oh, I'm going this way instead? And then they chase you down?
David: Well, no, they don't do that. What they do is set up the sign that says facial recognition takes place here, but the facial recognition is in the opposite direction. So when you turn around, that's where they're actually doing the recognition.
Matthew: So, in one day where they had data, they stopped 11 people and 4 were arrested. So, less than 50 [00:29:00] percent accuracy.
David: And of course, Clearview was used to identify people in the January 6th...
Matthew: ...everything. I didn't want to be political, so I just put, you know, "the January 6th."
David: March, riot, peaceful protest, insurrection...
Matthew: We had all the bases covered.
David: And this is a quote from Clearview's website about their effort as it relates to January 6th.
Matthew: It's wild. Let me let you read the quote, and then I'll tell the listener why it's wild.
David: Alright. So this is a quote from Clearview's website: "Within weeks of using Clearview AI, federal prosecutors issued over 900 warrants and charged over 350 people. Clearview AI was able to assist in the arrest of hundreds of rioters in a short amount of time."
Matthew: So, the author says that the people who founded this company are right-leaning Trump supporters. Why would they advertise this, then? It seems like it would [00:30:00] piss off the very people they support. It seems incongruous to me. Or maybe they just decided they like money better than they like their politics.
David: Well, that's how evil capitalists are. They will bite even the hand of their conservative masters.
Matthew: No.
David: Like, really. It's all authoritarians, all the time. Parties are completely irrelevant. And it's also being used in Ukraine. Here's another quote from Clearview's website: "The CEO of the American company Clearview AI, whose product identified the occupiers and traitors, will continue to cooperate with the Ministry of Internal Affairs of Ukraine."
Matthew: That's fucking horrifying. Because, depending on which party is in power... with the Democrats in power, they say that about the people who participated in the January 6th insurrection. But if the Republicans were in charge, they would say the same thing about the Black Lives Matter protesters. Either way you cut it, each side has traitors and occupiers and enemies that they [00:31:00] could turn this on and start chasing down and arresting.
David: Well, that's one of the reasons I highlighted this quote in particular: it came from the Ministry of Internal Affairs of Ukraine, not from their Department of Defense.
Matthew: So they're actually talking about their own citizens.
David: The traitors, really, is who they're talking about here, even though they mention occupiers.
Matthew: Well, how would they have photographs of Russians? Although, I guess there's the Russian Facebook, VK or whatever. Yeah, I guess you're right. They probably didn't just stop once they scraped LinkedIn, like, we're done, we have enough photos.
David: Right. They obviously don't want to just sell it in America. They want to sell it planet-wide, and Ukraine is the testing ground for it. But yeah, the Ministry of Internal Affairs kind of highlights the fact that it's not really aimed at Russians. That's not the primary use case.
Matthew: Yeah.
Matthew: So, we've got lots of examples of mistakes. Just in general, law enforcement tends to try to put [00:32:00] controls around technology like this, because there are lots of examples of law enforcement officers looking up their ex-girlfriends and ex-boyfriends to see what they're doing and stalk them. But the book did have a few very specific examples. Law enforcement in Detroit used it to ID a shoplifter. They ignored the instructions we just talked about, where it's only supposed to be a lead; they just decided the match was correct, and they arrested the wrong guy. The two men apparently didn't even look alike. The guy looked at the picture that was supposedly him and said, this is not me. This doesn't look like me.
David: And he held the picture up next to his face. The impression I got from the book was that they wanted to portray those officers as kind of racist, because they didn't right away go, oh yeah, obviously we've got the wrong guy, because all black people looked the same to them.
Matthew: Yeah. And they ended up charging him with a felony.
David: Which is also ridiculous.
Matthew: For shoplifting?
David: No, I'm just saying: in the interview, doing the comparisons and seeing he's obviously not the same guy, [00:33:00] the cops should have been able to go to the prosecutor and say, hey, we got the wrong guy. Let's not charge him with a felony. But they did it anyway.
Matthew: Yeah. So the guy who was arrested had a series of strokes, possibly caused by the stress of this, although maybe he was just due for a series of strokes.
David: Yeah, they said he was an extremely large man.
Matthew: The ACLU ended up finding him a defense lawyer, I think, because he couldn't afford an attorney on his own. Although, thinking about it, that's incongruous to me: wasn't one of the points he made in the interview that he had a much more expensive watch than the watches he supposedly stole?
David: That sounds right, but I don't remember specifically.
Matthew: And then they said he couldn't afford the 5,000 dollars for an attorney. Although, frankly, a lot of people just spend all their money, so maybe he spent it all on expensive watches.
David: That could be, yeah.
Matthew: There was another ID that led to a mistaken arrest for a stolen cell phone, except the guy who stole it on camera had bare arms, and the guy who was arrested had tattooed arms. But I don't know, maybe he went out and got tattoos right after he stole it. And there are at least three other cases documented in the book as well.
David: Well, what's ridiculous about all [00:34:00] these examples is that they're small-time. They aren't using the facial recognition on real hardened criminals doing really heinous things. I mean, sure, a shoplifter is bad, and a guy who steals a cell phone... for the people affected by that, it's terrible. But you're not talking about, hey, this guy was incorrectly identified for rape or for murder, the crimes you would really expect them to be spending their time on. And it's my own bias to say that the reason is that cops are lazy and those other crimes are hard, right? It's really difficult to actually track down a murderer or a rapist, but someone who commits a crime on camera, like shoplifting or stealing a cell phone, is much easier. Lower-hanging fruit for them.
Matthew: You know what, though? I bet a lot of murders and rapes don't take place on camera, whereas the streets are surveilled these days. If somebody steals something from a store, there's a camera in the store, a camera outside the store, traffic [00:35:00] cameras capturing people. Simply because of where the cameras are, this is going to lend itself to being used more for dumb crimes like that. Because that's where the cameras are.
David: Right. Well, that also goes to what I was saying about why they're not doing it with rapists. But it also means that because those crimes will be easier to track down with facial recognition, they'll spend more and more time on those, and less and less time on the really important crimes.
Matthew: But broken windows! Stopping those robbers will stop the murderers.
David: No. What it's... I'm sorry. Oh man. I had another point, but I've lost it.
Matthew: I'm sorry. Anyway: other uses for facial recognition. In 2017, Clearview tried to sell themselves to a candidate for office in Manhattan, for doing opposition research. The assumption was that with good facial recognition, you could find photos of people that they didn't know were out there and never expected anybody to find, [00:36:00] like college party photos. Embarrassing photos.
They also pitched to at least one other candidate, Paul Nehlen in Wisconsin, but once they were revealed to be talking to him, Google searches connected them to white nationalism. That's when they decided to kill off the company.
David: And there were other customers they tried to pitch. A bank, to identify high-net-worth customers, so when they came into the bank they'd get white-glove service, and also to identify miscreants coming into the bank. Real estate, pitching alerts on people who enter buildings when they're not supposed to be there, or who are on a blacklist for one reason or another. And hotels.
Matthew: The Blacklist? That's a good show. At least the first season.
David: Hotels, so you could just walk into a hotel and go straight to your room?
Matthew: That'd actually be kind of cool.
David: Where do you, I mean, you'd have to pick up your key somewhere along the way.
Matthew: Well, I know a lot of them are doing digital keys on your phone now. So it would really just, like, [00:37:00] activate it.
David: Oh, that'd be kind of neat.
Matthew: Yeah, it would be kind of neat.
So, other facial recognition vendors gave the following examples. Casinos, both for the high rollers we talked about before, and for people on their blacklist. Retail stores: a company called Facewatch sells to stores and flags people that other stores have marked as shoplifters or abusive, so you can proactively remove them. So now this is extra-legal: you're not convicted of shoplifting, just some other store in the network said you were shoplifting, so now you're banned.
David: And they reserve the right to serve or not serve whoever they want.
Matthew: Actually, my biggest thought there is that you get a lot of teenagers working in retail stores. What if you get a teenager who thinks it'd be funny, or sees somebody they hate from school, and goes, oh, this dude's a shoplifter?
David: Well, I think there's a market correction mechanism for that also. Because stores want people to come in and buy their stuff.
So they don't want to turn away too many people. If that false-positive rate is too high, or the system is being misused, I think that disincentivizes companies from continuing to use that kind of service.
Matthew: [00:38:00] So, Moscow apparently has facial recognition on 200,000 cameras. There was another comment about the casino: the casino mentioned before only has facial recognition on 10 cameras. And then there's this quote about Moscow adding facial recognition to 200,000 cameras. Does it run on the camera? Does it have to be a specific facial recognition camera? My assumption would have been that the cameras feed a central location, and the central location does the facial recognition.
David: That's what you would think, yeah.
Matthew: They do partially answer this in two ways, although I didn't realize it at the time, because I was writing this stuff down as I went through the book. Later they say that placement has a lot to do with it. If the camera's up high for a big wide view, you can't pull out faces very well. The best placement is eye level. They talk about how in China they put the camera at eye level, so it gets you directly in the face. There was a comment about a camera in China used for facial recognition in an elevator, watching people going up and down. And another one [00:39:00] where they put a camera at the end of an escalator, facing up the escalator, so as you come down, it gets you straight on.
Matthew: And the second reason: they talk about this with Miami. A lot of the cameras in Miami have resolution that's too low, so they can't pull out faces. It's, I guess, 320 by 160 pixels or something.
David: I think when they say it's only running on certain cameras, it really means it's only running on certain feeds. Because the amount of processing power it would take for the camera itself to do the comparisons is too high; you're not going to put that kind of money into processing power on a camera. They're just talking about which camera feeds are separated out...
Matthew: ...as having enough resolution to do facial recognition.
All right, so privacy concerns. While going through, I made a nice long list of privacy concerns.
David: Holy cow.
Matthew: Okay, so, let's do these alternately.
You could use this technology to recognize undercover officers and secret agents.
David: Awesome. Put that on your ring doorbell.
Matthew: That'd be cool, actually.
I'd want to know that, [00:40:00] though. Like, the Ring doorbell shows up and it's like, ding, this is an officer of the law, even if they're in plain clothes. Or, this person has 27 felonies. You'd be like, alright, I'm not opening the door.
David: It'd be even better if you could have an automated response, right?
Matthew: Yeah.
David: I was going to say, "come back with a warrant."
Matthew: It's the same thing.
David: Oh, that's the subtext.
Matthew: The unspoken subtext.
David: Yeah. You know, the 27-felonies guy shows up and gets the automatic response: hey, I have a Smith & Wesson, and survivors will be prosecuted.
Matthew: Pick your favorite caliber. 9mm or .357?
David: Yeah, my sister lived in Racine, and she actually had that on her front door: "This house is protected by Smith & Wesson. Survivors will be prosecuted." Racine's generally not a nice place.
But the founders of Clearview said that one day, if their face matching gets accurate enough, people could use your photos to find otherwise anonymous...
Matthew: ...Tinder profiles. Sucks for the congressmen.
David: So when Google tested their [00:41:00] Goggles, one person on the team admitted that women were scared that someone could use it to stalk them if they met them at a bar.
Matthew: Someone used an early facial recognition app called FindFace to identify adult actresses and sex workers and stalk them on their Facebook pages and their personal pages.
David: Not surprising.
Matthew: Nope.
David: And of course, governments use it to identify protesters.
Matthew: Many potential investors were spooked. At least one investor said they bought in just so they could get their face removed. And of course, this is a new revenue stream. Remove your face for X number of dollars. Right. We talked about that before.
Yep.
David: Yeah. And during the pandemic, supposedly a man in Moscow who was supposed to stay home for two weeks didn't, and the cops showed up when he walked out, after he got recognized by facial recognition. Supposedly.
Matthew: In 2018, Madison Square Garden added facial recognition to their entrances, and in 2022 they used it to ban any lawyers at firms that had sued them or any of their other businesses. And this is actually the [00:42:00] type of thing I'm most worried about: if you're involved in some sort of adversarial thing between businesses, them just being able to kick you out.
David: Yep.
David: It's interesting that there are enough of those lawsuits against Madison Square Garden...
Matthew: There were like 1,200 lawyers or something.
David: ...that they need this. Ridiculous. It's like, maybe you need to re-examine your business practices if you're in that many lawsuits. It's the Wells Fargo of the entertainment industry, I guess.
Matthew: Oh my god.
David: It could also be used when you're interviewing for a job, to check your social media and any photos posted of you before you get hired. That's the one I'd be more concerned about than the Madison Square Garden thing.
Matthew: Especially since, for some of the people they demoed it on, it found them in the background of pictures they had no idea existed. Like, I live near D.C. What if I'm in D.C. and I walk by a protest, and I'm caught in the background of a picture of the protest? And let's say it's something truly bad, like a Nazi protest. Now they look me up [00:43:00] and, oh look, he's in the background of a Nazi protest photo. You can find yourself associated with people you have no connection to.
David: Right, because that's the one thing it's doing: matching the face with the photo. It's not necessarily giving you the context.
Matthew: No. So, two users in Switzerland with some scripting knowledge, after finding out about this, decided to see how hard it was. They did it with an open source library, and they scraped Facebook and Instagram over 10 days. It's funny, we just talked about how Facebook sent a cease and desist letter saying don't scrape us. These two said they had no problem scraping Instagram from one IP address, one box. The most they got kicked off was for an hour at a time. So Instagram is doing nothing to prevent this.
David: And they said they got hundreds of thousands of photos.
Matthew: Yeah, and then they tried to match the photos against local politicians in Switzerland, and they were able to do it. So now it's basically out of the box. Anybody with a little scripting knowledge, and now that AI can write scripts, it's probably even easier than that.
David: Right.
Matthew: Hikvision.
David: I was [00:44:00] wondering how to pronounce that. I suppose you could pronounce it that way.
Matthew: Hikvision? I have no idea.
David: I don't know. It's a Chinese camera and analytics company, and it's being used to keep tabs on the Uyghurs in China. A lot of countries are using this stuff for passport photo control, which Matt mentioned a minute ago. Of course there's Facewatch, which we mentioned earlier as well. And of course, there's the whole thing about hackers being able to get this data once it's been collected by the organizations doing this.
Matthew: Yeah, hackers have specifically broken into the Moscow facial recognition network, and there was an example of someone who bought a report on themselves and received 37 pages of places they'd been seen around Moscow.
David: Holy cow.
Matthew: So, China. China's going whole hog on this. They have facial recognition on toilet paper dispensers in the Temple of Heaven; apparently people were stealing the toilet paper. It only dispenses two feet per face. [00:45:00] That's not much toilet paper. I don't think that's enough.
Matthew: So, China has used this to find and remove unlicensed tour guides.
David: Can't have that.
Matthew: They have used it to name and shame people wearing pajamas in public; they post the photos and names to social media. They're catching jaywalkers by camera. And in China, if you're a VIP, you can get on the red list.
Because in China, of course, it would be the red list.
David: Right.
Matthew: And that makes you invisible to the cameras. You're no longer detected. You can jaywalk at will. You can wear your pajamas wherever you want to.
David: How do you get on that list?
Matthew: Be a communist? A capitalist communist, because it's not enough to just be a communist anymore.
David: Be a rich communist?
Matthew: A rich communist.
David: So, surprise surprise: in 2021, the CIA warned that a large number of their informants were being captured and killed due to biometric scans and facial recognition.
Matthew: Didn't they blame that on Trump?
David: I wouldn't be surprised if they did, but you know, this is complete bullshit in my [00:46:00] opinion, because what this really says is that the CIA is terrible at tradecraft.
Matthew: Because how do they know...
David: You have to be able to associate them with the CIA to begin with, to identify them in the photos.
Matthew: So I can kind of see this being useful if they're meeting in public, and there's a video or a photograph of them meeting, and then you can go back and figure out who that is. But you could already do that; it just required a lot more legwork by humans.
David: Yeah, like I said: CIA tradecraft is fucking garbage, is what this really says.
Matthew: Alright, so we're going to skip to the end. Why does this matter? Well, I think we get to choose: security or privacy. If we roll out facial recognition everywhere, we can have a higher amount of security. We can identify more criminals faster; we can really get in on those jaywalkers and shoplifters and hammer down on them. Or we can choose privacy. I feel like these companies think there's a middle ground, where you sell it to law [00:47:00] enforcement but not to individuals. But if you pick the middle, you're always going to slide one way or the other.
David: Well, we kind of touched on my opinion on this earlier. You're choosing privacy or security, but the security you're getting is low-level security, because you're talking about crimes like shoplifting that can be caught on camera, versus the really heinous crimes, which are what you'd really want to leverage this kind of tool against. So I think what you're doing is trading off privacy for a little bit of security.
Matthew: Security against getting robbed, right? Security against getting your phone stolen.
David: Security against relatively minor infractions. I'd say the only exception to this, and this is where they get you, is the child pornography. That's the only exception, and that's where they always get you, because I would say virtually every good person [00:48:00] on the planet would be willing to trade their privacy to save children.
Matthew: Hmm.
David: And I didn't caveat that by just saying "people." I said good people. But that's how they always get you: you know, won't someone please think of the children. And that's the real trouble with that one.
Matthew: No one's gonna complain when you use this kind of tech to catch predators. And it's funny, I wrote that early on when I was doing my notes, and by the time I finished I realized: oh, they're already doing it. They're already using it to catch robbers. They're already using it to catch jaywalkers. Some countries are using it to arrest dissenters. And then it's going to be released to everybody, and we're all going to use it to stalk the people we have crushes on but are afraid to talk to. "I've seen that guy, like, every place I go."
David: Pure coincidence.
Matthew: Alright, David, what should we do about it?
David: Well, you just give up, I think.
Matthew: [laughs] I'm gonna make that emoji mask we talked about, but as a physical [00:49:00] mask with two eye holes, and I'm just gonna wear it everywhere.
David: Oh, nice.
Matthew: So I did actually look up countermeasures. We talked about countermeasures to facial recognition a couple of episodes ago, but almost none of them work on Clearview. It doesn't rely on IR, so the IR LEDs you can get on glasses and such don't help. And the dazzle camouflage only works against one specific facial recognition algorithm, which Clearview doesn't use.
David: Well, they have, and this is something that happened a few years ago, extremely realistic masks...
Matthew: Like in Mission Impossible.
David: Right, because there was an instance where, I think, a white bank robber wore a black man's mask to perform his bank robbery, and initially everybody thought it was a black man who robbed the bank. But he was just wearing an extremely realistic mask. What gave him away, I think, was his hands: he had gloves on, but the gloves didn't [00:50:00] cover the wrists or something, and that's how he got found out. But it may get to the point where everyone, when they go out in public, wears a realistic mask. Or a helmet; they'll wear a motorcycle helmet everywhere.
Matthew: A full helmet? No, no, it's a punk gothic mask. A futuristic, techwear, sci-fi gothic mask.
David: Oh, okay. Sorry, I miscategorized that.
Matthew: I'm just going to wear that everywhere in public from now on.
David: So it could be, you know, that you're gonna have one of five generic faces you wear out in public.
Matthew: Or something. Because if you wear the same generic face every day, they can track you. They may not be able to identify who you are, but they can still track where you go and what you do from day to day.
David: Well, the thing is, though, if everybody only has five faces to choose from...
Matthew: Yeah, alright, I see it, I see it. If you mix and match them, yeah.
David: So maybe that's where we're going. I don't know.
Matthew: Maybe. So I know what I'm going to go do: I'm going home, and I'm deleting all my photos off Facebook.
I already deleted all my [00:51:00] posts off Facebook and made it private. I know it's out of the bag; it's too late, they scraped Facebook like five years ago, so they've got everything before that. But at least I can delete it for the future scrapers that haven't gotten it yet. So they don't see the horrible things...
David: ...you're going to do and take photo evidence of.
Matthew: And take photo evidence of. The constitutional right to privacy generally only applies to one's home and person, according to a strict reading of the Fourth Amendment. The Ninth Amendment says we have other rights that are protected, but it doesn't enumerate them. I think what we really need here is another amendment that actually describes what a right to privacy means in our modern day. Of course, that's never gonna happen. There's no way.
David: Well, the whole problem is this: this is like nuclear fission. You can't unknow it, right? The planet knows what nuclear fission is. You can't unknow nuclear fission.
Matthew: Sorry, back to giving up?
David: Well, I don't know if giving up is the right way to phrase it. But you can't put the cat back in the bag. It's too late for that. So what we have to do is figure out how...
Matthew: ...you kill the [00:52:00] cat?
David: Well, you could try that. You could try it.
But we're going to have to find some way to live with it, is what I'm trying to say. And I'm not really sure what that means today. But I think this does have some implications for security.
Matthew: Wilford Brimley, I think. Isn't it?
David: It has implications for security when it comes to biometrics, right? We say the three factors for security are something you have, something you know, and something you are. So currently, as far as biometrics go, we've got face prints, fingerprints, palm prints, iris scans, voice prints, and gait... your gait.
Matthew: Your gait.
David: So as far as biometrics go, I think what we're going to have to do for security is consider leveraging only biometrics that cannot be determined from a distance. So: no face prints, no gait. Fingerprints, palm prints, iris scans, up to a point, maybe. Voice prints would be no good, because those can be [00:53:00] collected at a distance.
Matthew: So I bet this really fucks with facial recognition.
David: Well, that's disturbing just to look at. That hurts my eyes.
Matthew: I'm imagining what my wife would say if, before we leave the house every day, I put on a mask. She'd be like, God, I'm divorcing you so quickly.
David: I bet it's worse when you put that on before you have sex.
Matthew: Can't have the facial recognition there either. It would be worse...
David: ...if she said, oh, well, thank you for putting that on. Now I can actually get into the mood.
Matthew: No, she'd probably... why can't I remember the name of the actor? There's some actor she has a crush on. She'd be like, here, wear this. Chris Pratt, or Chris Hemsworth. I don't remember.
David: So I think that's where we may go for security: the implications are that the biometrics we can use for multi-factor authentication may, or should...
Matthew: ...shrink.
David: Or we [00:54:00] should have fewer options there, I think. But that's all we have to say about that today. Thank you for joining us. Follow us at SerengetiSec on Twitter, and subscribe on your favorite podcast app.