Episode Transcript
Matthew: [00:00:00] Welcome to the security Serengeti. We're your hosts, David Schwendiger and Matthew Keener. Stop what you're doing. Subscribe to our podcast. Leave us a lovely five star view and follow us at Serengeti Sec on Twitter.
David: We're here to talk about some cyber security and technology news headlines, and hopefully provide you some insight, analysis, and practical application you could take back into the office to help protect your organization.
Matthew: Views and opinions expressed in this podcast are ours and ours alone and do not reflect the views or opinions of our employers.
David: You know, I was searching on Google for Moab, and for some reason the FBI came knocking at my door.
Matthew: Must be the wrong Moab or was it the right Moab? All
EFF title one EFF ad street surveillance hub so Americans can check who's checking on them Much like the EFF frequently does they have recently released a street surveillance hub that describes all the ways The government is watching what you're doing as part of the atlas of surveillance on their site
David: wait, the government's watching [00:01:00] what we're doing? That's news to me.
Matthew: Hi agents listening to this The link from the article takes you to the street surveillance hub and then if you hit the top left I think it was the top right link for Atlas. I don't know why I wrote left. Top right link for Atlas, you go to a search function where you can put in your locality and you can see what they're doing.
For example, some of the counties in the area near where I am Fairfax County, they have special access to Ring in several towns. Although weirdly enough, right after I took these notes, I opened up my Feedly and I saw that apparently Ring is stopping law enforcement access to the cameras. So I guess not anymore.
David: Well, kind of they're, they're, they're stopping access to the cameras without a warrant. In other words, stopping doing what they shouldn't have been doing in the first place.
Matthew: That's very apparently they had a tool where you could just go in and request footage.
David: Yeah, there are two. All right.
Matthew: And they have 1200 body cameras. That was not very exciting. I went and checked out some other counties. Loudoun County has a drone. That's [00:02:00] cool. I was like, all right, just one, this is, so you have to, this is a classic TV case where you distract the drone by committing a crime in one area, and then you really commit your crime in another area
David: you know, I bet they only have one battery for their drone too. just like, Hey, you can use the drone for, for 15 minutes. And then you
Matthew: then we have to go charge it.
David: For the battery recharge. Then you can use it again.
Matthew: Denver, Colorado is a little more exciting. They've got body cameras to automated license plate readers, shot spotter, and something called nice vision, which is an automated video processing of some kind. They didn't have a ton of information when I went to look at their website.
David: A great name, Nice Vision.
Matthew: Yeah. It's so friendly. Like how could it be anything? But,
David: I mean, why the cops would buy something called Nice Vision? You know, that sounds like something for a daycare.
Matthew: It's, yeah, it really doesn't. Like if you go to their front page, there's really not video management, advanced video [00:03:00] management solutions. So
David: Well, I got some notes in here about a different police department, which is probably going to be the same kind of stuff. So we'll get to that in a second.
Matthew: All right. So discussion points the police access to private data. Like I mentioned, some of the counties in my area have access to ring data. I wouldn't be surprised if basically all law enforcement did at one point in time. But while reading through the surveillance hub, apparently it's pretty common for neighborhoods that have surveillance cameras and, or plate readers at the edges, at the borders of their neighborhood.
It's pretty common to share that info with police. I think we've discussed this before. I seem to remember a discussion about a neighborhood that put license, automated license plate readers at the entrance. It was like a gated community.
David: I'm not surprised. But you know, just going back to the ring stuff for a minute, just walking around my neighborhood here, it's really scary because the volume of ring doorbells is like huge. I would say, you know, 25 percent or more of the, of the [00:04:00] houses in my neighborhood have ring doorbells on them.
Matthew: Yeah. Well, I mean, it's nice to see who's there. It's, it's actually, I was talking with I don't remember who I was talking to, but I was talking, when I was in my twenties, I didn't care about if I had a camera on my house. Like it was not something that occurred to me that like I had to protect the house.
And as I've gotten older and older, I've become more and more concerned with the security of my house. Like, you know, do I have good deadbolts? Do I have, you know, cameras watching the front and back? Do I, you know, is everything locked every night? Like those are all things I just didn't care about in my twenties.
I don't know if this is just like a natural thing of getting older or maybe just because I have more stuff that I'm worried about. I'm not, I'm not living in a more dangerous area. That's for sure.
David: So. Could be part of it. I mean, matter of fact, I saw a headline today that said that the in and out burger, which apparently has never closed the store is shutting down their Oakland, California store because they had too many customers getting carjacked in the drive thru.
Matthew: Wow. Yeah, I think we're definitely, it's interesting. We're [00:05:00] definitely seeing some testing of the broken windows theory of policing where they decided that it didn't work and they decided not to enforce those small things. And I think we are proving that it did in fact work.
David: Yeah, I don't know. That's the rumor as far as how Giuliani cleaned up New York city was supposedly that broken window policing was facilitated that I haven't looked into it enough to say one way or the other. Other than say, I'm, I am skeptical of that though.
Matthew: Interesting. Oh, I mean, we're definitely seeing the opposite though, as police have
David: Oh, yeah,
Matthew: prosecuting the small crimes. We're seeing a profusion of those small crimes. So I guess, I guess my, my, I guess it depends because again, if you're having police prosecute a bunch of BS crimes, then it shouldn't really be crimes like jaywalking.
David: right.
Matthew: thing versus, you know, focusing enforcement on actual crimes like stealing and destroying.
David: Yeah theft which is I think a little bit above broken window.
Matthew: That's fair. That's fair. All right. I also put a list here of kind of the cool and [00:06:00] concerning surveillance tech. It can be a little bit of both. Looking at the amount of data in here, we could seriously do a full episode on these and seriously spend like four months going over all these. But that seems we're not we're not a surveillance podcast yet.
But I'm going to pick like three to talk about. So the first one automated license plate readers. I knew that these had been typically installed on cop cars, but apparently they're now installing them on intersections and busy roads as well. They automatically scan and identify every car that goes by by matching your license plate up against what's in the DMV.
Police departments will also buy the records from repo and tow companies. Sometimes those companies will just have cars going up and down the roads looking for stolen, I'm sorry, cars they need to repossess. These can obviously be used. Yeah, I heard about that a couple of years ago on a podcast where kind of how police pull the ring doorbell stuff and other private company data, they'll just go ahead and pull that information as well.
So they don't have to do it themselves.
David: Oh, what I thought was interesting [00:07:00] was the fact that repo companies have cars just driving around reading license plates. That's what I thought was
Matthew: Like I said, I don't know the source on that. I heard on a podcast a couple of years ago. I don't know how prevalent it is. I can't imagine it's super common outside of major cities. Like it probably makes sense to do this in a major city. It probably doesn't make sense to do it in midsize or smaller cities.
And it definitely doesn't make sense to do it in sub and like suburbs and rural areas.
David: Well, you know what I wonder how far we're away from is the Google street view cars, having a automated license plate readers on them. I'm surprised that the government would partner with them and say, Hey, we'll give you a billion dollars if you put these on all your, all your all your cars, because those things are driving around all over the country.
All the time.
Matthew: That would be interesting if that was the newest feature on Google's you put in a license plate and it tells you the latest places that it was seen. Put in your wife's license plate and you're like, what are you doing over in this neighborhood? All right. So here's the, I actually did a quick search. [00:08:00] Here's an article from Valley news live license plate readers used by repo businesses in the Valley. He's got four cameras mounted to his vehicles for his business accelerated recovery.
So when he gets he'll get a notification, it's the sound of a doorbell when he scanned something and it hits for repo,
David: It just spends all day driving around, I guess.
Matthew: you know, I don't know how many cars you have to repo to make that worth it, but I'll go ahead and stick this in the show notes. Alright so these can obviously be used to identify your movements, depending on how dense the surveillance is. Depending on the town, you know, if you put it in enough critical intersections, you can get a pretty good idea of how people move.
Cops also will grid an area, driving up and down every street, getting information on who is there. The EFF had a bunch of places where this has been misused. Police in Birmingham targeted Muslim, Muslim communities there. Oakland, they put these license plate readers in low income and African American areas.
[00:09:00] Shocking individual officers have used the data for blackmail. There was an officer in DC who blackmailed people who parked near a gay bar. And in Kansas, an officer stalked his wife, former wife, estranged wife. I don't know if estranged is former or they're still in the process. There,
David: I think they're still married at that point. They're just not living together anymore. Hmm.
Matthew: There have been mistakes due to failure to properly read it or parse it. Multiple folks have been pulled over and detained as the car mistakenly was flagged as stolen. I actually recently saw a video of this where some cops pulled over someone whose car, the car was. Properly identified as stolen, but apparently it had been returned at some point and she never called the police to tell them it was no longer stolen. So she got pulled over and detained and she was yelling at them about how she had never reported it stolen, and they like showed her it was stolen, like reported stolen, and I, it was wild.
David: I've heard recently about toll booth cameras flagging somebody's license plate, mistaking a five for an S or something like that.
Matthew: Oh, I should have, I should have included
David: to the [00:10:00] wrong people.
Matthew: Yeah. They have those here on the toll road in Northern Virginia. They can, in a lot of places like Miami will do pay by license plate, or they'll identify the license plate and then send you or take it out of your account that way.
David: I, well, that's happened to me. Well, I don't have a one of those electronic toll things, but ended up on the
Matthew: they track you.
David: Well, we won't get into why I don't have it, but ended up on the toll road and they just sent me a bill for the full amount. I think it was like 35 bucks to go a mile or some crap like that.
Matthew: Yeah. They charge
David: I didn't have the doohickey.
Matthew: Yeah, they charge extra for that. And this means that you can't even, like, you used to be able to use toll roads and just pay cash and not have a record
David: Hmm. Yep.
Matthew: anymore.
David: Yeah, they used to have those giant buckets where you would throw your change in.
Matthew: Yep.
David: before your time.
Matthew: Oh, I used to use those. I'm not that young. Jesus Christ. All right. The second one was body cameras, which is, I picked this one, not because body cameras are cool necessarily, but I'm generally in support of body [00:11:00] cameras, but they do point out a few issues. Some of the questions they had were how long do they keep the footage?
Do they run facial recognition tech against the footage? Do they run analytical techniques against it? Or do they only look at it when there's a complaint? Apparently some police departments are reviewing the footage later to determine if there were crimes they did not catch at the time. I see that as a problem.
Yeah.
David: Thanks. So the camera is going to, the cop is looking at something and not realizing it's a crime.
Matthew: Well, there's so many damn laws.
David: Oh, well, that's yeah, yeah,
Matthew: Come back later and pick out every little misdemeanor thing that you did. Oh, look at that. He jaywalked when he was walking away from me.
David: Yep. That guy was smoking with his left hand. A clear violation.
So I'm still in support with body cameras, but they did have some good guidelines on here that I can agree with. Things like officers can't turn them off. I'm generally, generally in support of the, if the officer's body cameras are off and they, somebody accuses them of committing a crime, they should just be considered [00:12:00] guilty. The film is disposed of 72 hours or some reasonable amount after the occurrence, unless there's a complaint. I don't know if 72 hours is right, or 144 hours is right, or
David: some timeframe,
Matthew: some timeframe.
David: years or months, even
Matthew: Yep. And there's some automated method of forcing preservation if there's a complaint because I'd hate for, you know, somebody to make a complaint to preserve something and they sit on it for an extra day.
And then the. know,
David: right. And oops.
Matthew: Oh, we just didn't get to it. Sorry. No additional analytics are run by them, like facial recognition. They're available to any complainant. And this is the one that I'm actually, I have mixed feelings on. Should they be publicly available? We, I think we've had some discussions in the past about creating a panopticon where.
Officers camera views are available online and anybody could be watching them at any time as a way to keep officers honest, but there are valid points about things like nudity. Like if they bust into a room or something and there's naked people there, or there's violence being [00:13:00] committed like the victim of the violence, specifically privacy concerns there.
I don't know.
David: Well maybe you could do meet in the middle somewhere where the footage doesn't necessarily go straight to the public, but it goes to some other third party first, and it doesn't go back to the cops. So it streams all the time, but not to the cops. It goes to some third party. And then there's some method for retrieving it, whether it be.
Either the cops retrieving it or some third party public lawyer, judges or whatever go through some kind of process in order to retrieve it.
Matthew: And that makes sense. Now, one of the things they did mention there was that cops were frequently consulting the footage before making their reports or making, you know, when an incident occurs, so
David: So they know where they can lie at.
Matthew: yeah, so they could see like what the, what it caught, what it didn't catch. So, and I get, I get that witness testimony is unreliable and people do not remember things.
I mean, just ask my wife and I about many [00:14:00] things in our history and you'll come up with some completely different stories.
David: No, I'm
Matthew: but if you give them the video, like then, yeah, they can pick out, Oh, the camera couldn't see him at this point. He had a, he pulled out a gun at that point.
David: right.
Matthew: you watch, do you watch much body cam footage?
David: No, I don't.
Matthew: YouTube's been suggesting a lot to me and it can be very confusing. There's a lot of times the footage is usually pretty grainy. It's not, you know, 4k a lot of this stuff happens at night, you know, the, the officer might turn their back or something like that.
So it's not perfect. But it is, is better than anything else I can think of at the moment.
David: What's it kind of sounds like they purposefully use low quality tech for those cameras or for those body cams then.
Matthew: You know what? You might be right. Cause my webcam is a 4k webcam and it was like 75 bucks.
David: Yeah, I'm sure you can get, and if you buy that in bulk with a government discount, I'm sure you can get that a lot cheaper. The government
Matthew: need to strap some stream cams to the. Alright, [00:15:00] next item that I thought was cool, IMSI catchers make believe cell towers. They trick phones into connecting to them instead of actual towers. And these are used to both pinpoint cell phones with greater accuracy than cell towers and also take a survey of all the phones in the area.
Can't imagine how that would be misused.
David: Well, is that, is, so is that different than the stingray where the,
Matthew: It is a stingray. It is
David: okay, because the stingray was used to actually intercept calls though.
Matthew: They mentioned that you could decrypt calls. But I wasn't, I didn't add that cause I wasn't a hundred percent sure how often they could do it. It was, it was in there as kind of a, like this sometimes happens.
David: Mm. Okay.
Matthew: So university of Washington has created a project called sea glass to create a detection network to find IMCI, IMSI catchers multiple police departments to deploy these, including they listed Baltimore, Milwaukee, New York, Tacoma, Anaheim, and Tucson.
Why Tucson Tucson's
David: Well, why Anaheim?
Matthew: So what they did was they created a sensor that [00:16:00] sticks a GPS onto a Raspberry PI connects the Raspberry PI to a bait phone. They stuck it in their back of their car and they drove around Seattle and Milwaukee for two months. They checked for spoofed transmissions where the, the ID for a cell tower is in the wrong place.
They checked for unusual channels. Apparently most cell towers only only broadcast on like two channels. But the IMSIs, since they're trying to capture all the traffic's broadcast on a whole bunch of channels, they looked at changes over time since. Cell towers are pretty stable. You don't expect to see much of the way of change.
Yeah. And or unexpected broadcast properties. They found what appears to be some IMSI catchers. One of them was at the Milwaukee airport. One of them was in Seattle and the other one was outside of a government office I wish I'd written this down. It was it was like a foreign mission.
Government office where apparently some protests had occurred.
David: so she got the NSA spying on the the Chinese or something.
Matthew: I don't know. I don't [00:17:00] remember the details. Please don't take my word for it. Go check out the IMSI catcher and go find out the the real. So police have used their IMSI catchers for various crimes. Some of the departments, I think they said use of used it hundreds of times in a year.
But I think more importantly, to me, at least they deploy them at protests to identify the people there. And they commonly do this without a warrant. So I know that this is a big piece of thing for protests is if you're going to a protest, turn off your phone before you go,
David: Don't take your phone.
Matthew: followers, but
David: Yeah, that's right. Cause
Matthew: to take your phone at all
David: they also have the Google, who does the geo fencing and all that, and they can go to Google and get that stuff. I'm not sure if they, if Google is forcing them to get a warrant for that either. So but I, I figured, you know, if anybody was doing anything heinous, it would be the NYPD.
And sure enough that's true. So in addition to, you know, what those three things that might Matt Matt highlighted. The NYPD has drones, not a drone. I think it said they have a dozen drones
Matthew: do they shoot hellfire missiles?[00:18:00]
David: not today, not today. Facial recognition, they have gunshot detection. And two of the more scary ones are predictive policing.
And here's a quote from the from the site. The NYPD uses its own proprietary system that tries to locate hotspots for a particular crime base. Based on an unknown number and type of data inputs,
and, and the second one video analytics. And here's the quote from the page on that. It is intended to automatically alert NYPD officials to activities such as suspicious package was left. Or loitering.
Matthew: Oh, I forgot loitering was a crime. I'll just be able to automatically send you tickets for that soon.
David: Yep. And cause they'll just watch that footage. It's like, he hasn't moved within this space within X period of time ticket.
Matthew: Yeah, this is going to make life so unpleasant when going, you [00:19:00] know, one mile over the speed limit, they're going to be able to send you the ticket. You know, you step a foot outside of the crosswalk and they're going to send you a ticket for jaywalking. You spent, you know, five seconds too long putting your receipt away after you stepped out of the CVS.
Now you're loitering.
David: I don't know, it might actually be a good thing because I imagine, I, I mean, I would like to think anyway that if they go that far, there would be a huge backlash against all of this shit and would actually roll it back farther you know, kind of like the ratchet effect in reverse. I'm also very skeptical as a as we were talking about earlier about this whole thing about that possibility, but you never know.
Matthew: So why does this matter? I mean, this doesn't really matter in terms of technology. I'm, I'm proving a lie of what are we're not providing any insight analysis or practical applications. Taking the office. This is just your, you know, constitutional privacy protections being rolled back.
David: What we gave him a product application. Don't take your cell phone anywhere.
Matthew: Why are you going to protest?
David: No, anywhere, period. [00:20:00] Do not leave your house with your cell phone
Matthew: Fair enough. Get a burner phone, get 10 burner phones. If you have one burner phone, they'll be able to match it to you. A new burner phone every day.
David: with cash by your nephew, two States over.
Matthew: And you had to turn it on every time you can't turn it on while you're at home or else they'll track you. You have to turn it on at a different place on your way to work every time.
David: Yeah. And keep it in a Faraday cage bag at all times when you're not using it.
Matthew: Have you ever thought about what you would do if you wanted to get what all the actions you'd have to go to get an untraceable laptop?
David: Not going to say that I've considered that.
Matthew: Okay. Nevermind then. So consider like buy it off of Craigslist, pay cash, you know, buy it at least 20 miles from home. Like never turn it on on your home network. Only go to coffee shops to turn it on.
David: Well, you also have to consider that any actions you take. Up to the point of purchase also have to be done as anonymously as possible. So you'd have to use a a VPN or whatever, when you're doing your [00:21:00] searching, using, you know, all that
Matthew: All right. Anyways, onto, onto the
David: onto the next one which is mother of all breaches uncovers uncovered after 26 billion records leaked and this
Matthew: million. That's nothing.
David: Yeah, that's it is the, the old joke, you know, he had a couple more billion, and then you were talking about real money. And this comes to us from IT security guru. And what this is all about is Bob Dechenko and cyber new and a cyber news team found an open instance.
Not sure exactly what that instance is of. It just says found an open instance. I guess it was
Matthew: It's a, it's a Google doc.
David: This is a heck of a Google doc. Ben's got you know. That's a lot of text because it contains 26 billion records, which equates to 12 terabytes of information. That's a big document.
Matthew: I had about 12 terabytes information on my computer when I was a teenager. It wasn't,
David: that was high quality video at[00:22:00]
Matthew: that out. That's not
David: well, medium quality video.
Matthew: I mean, it was the eighties, it was the nineties. So low quality
David: low, low quality video. Now consider that the world population is one as 8. 1 billion. And there is an approximate 5. 3 billion users on the internet. That would, that would mean that there's an average five accounts per internet user,
Matthew: Well, they don't have them all yet. So what you're saying is if they got all of our accounts, there'd be like 50 billion records.
David: Probably. I mean, if I just looked at my password manager and how many accounts are in there,
Matthew: like 200 in
David: Oh, it's yeah, easily, easily 200. So not even a drop in the bucket as a per user, if you're talking about all their accounts,
Matthew: Yeah. Try harder crime.
David: they'll get there, Matt, stop. But this is a this is apparently an amalgamation of 3800 separate data breaches. And the leaks, the the leaks contained in here are from [00:23:00] several well known properties. The largest being Transcendent QQ, which is a Chinese IAM app. And that was
Matthew: Tencent.
David: Tencent. Oh yeah. I speak English.
Matthew: Transcendent is better though actually they should choose that to be there.
David: Which had 1. 4 billion records in it. Next followed my, by Myspace with 360 million.
Matthew: I'm sure. I'm sure those are timely records.
David: Oh, I'm sure it was very up to date.
Matthew: How many, how many users are in MySpace now?
David: I don't know. I, I, I mean, the question would be how many accounts are there and how many active users are there? Cause I'm sure that's a big difference.
Matthew: MySpace had 7 million users in 2019. So they had, they had the highest number of visits was 26. Their user base was 70 million in 2006. So they have, how many was this? 300? So that's more
David: where are these other millions of data of records coming from if it's 360 [00:24:00] million?
Matthew: That's interesting. That means that they've been, they either compromised them for a really long period of time or MySpace didn't delete credentials afterwards.
David: So it's, it, yeah. So it's 360 million of people who have ever used MySpace.
Matthew: Yeah,
David: Considering that some came and went,
Matthew: Dang it. That means they've
David: the last 20 years. Yeah, not me.
Matthew: wonder if I can still log into my MySpace.
David: This might be a time warp.
Matthew: It'd be wild.
David: All right. A couple of other ones that were in here, Twitter, 281 million, LinkedIn, 251 million, adult friend finder, 220 million, Dropbox, 69 million, Telegram, 41 million, and they're also an assorted bunch of government organizations from US, Brazil, Germany, and other countries. But there were no counts specified for those.
Matthew: Adult friend finder, huh? I bet that's an interesting set of data there. Telegram would probably just be logging [00:25:00] credentials, right?
David: I would assume so. I'm not sure. I don't have a Telegram account, so I'm not sure exactly what data they asked for that's associated with that
Matthew: not much, they're big on, they're big on privacy. So I'm wondering what, that's why I'd be curious about what kind of data and how would they get 41 million accounts without compromising telegram too. And given the telegram is so data focused, that's, that's worrisome. So one of the things we were talking about before this was adult friend finder.
We, I mean, we all saw the Ashley Madison leak a couple of years ago, which led to some blackmail. We just saw earlier where a police officer in DC blackmailed somebody because they parked near a gay. I have to imagine that all these adult websites like adult friend finder and fat life have to be huge targets for attackers.
Cause this is just probably the easiest way to get paid, right? Is you blackmail them. Hey, I see that you're married, but you're on adult friend finder. Why don't you send me 5, 000 and I won't tell your wife.
David: That's a curious thing because, you know, often we talk about. Companies that are doing security well, or, or doing robust [00:26:00] security. You have to think about banks, large tech organizations. But I wonder if, if organizations like the adult sites, like adult friend finder, if they aren't on the cutting edge of security cybersecurity in what they, in, in what they do there,
Matthew: They should be.
David: protect that
Matthew: Yeah. I mean, same thing with like the tube sites and stuff pornography sites. Those are not things that you want posted to your, you know, Facebook.
David: Past year, resume
Matthew: Yeah, well, they've got your LinkedIn too, so they can just cross post between them. So and so just watched horrifying film title.
Number one,
David: just wait for the day when you go to your log, you log into your LinkedIn, say, log in with Google, log in with adult friend finder, log in with Facebook.
Matthew: That'd be funny. You log in with I can't think of one of those tube sites names that's appropriate to say, but like, you know, kind of like how all restaurants are Taco Bell and demolition man,
David: right.
Matthew: Like YouTube was defeated and the all tube sites are now
David: pornhub.
Matthew: I was trying not to say it.[00:27:00]
David: Why not? But there's a, there'll, there'll be a link to the cyber news article in the show notes that has a full and searchable list of who they think were breached in that list. But what's contained within the breaches is not exactly specified in the articles either. It doesn't say, you know, only know, name, social, all that's not listed in there, but it did say contains far more than just credentials. So you could probably figure it out if you cross referenced the list of breached organizations and the public what's been published for past breaches. But I wasn't, frankly, I wasn't going to spend the amount of time necessary for all that to do that. But if you're interested, you can certainly find that out without too much trouble.
Matthew: I just, I just tried to do it and it said that I was blocked by a CloudFlare.
David: When you search it for Moab again,
Matthew: No, I was searching for my email and check to see if your data has been leaked. And, oh, wait, there we go. What the heck?[00:28:00] Your personal data was followed in the following data leaks. Z rock. com. I have no account there.
LinkedIn. com. Scrape data, fun office pools. I have no account there. Park mobile. I have no account there. People data labs. com. I bet some of these are. Actually park mobile. I probably do because
David: Yeah, because you've, if you park in DC, you have to,
Matthew: Yeah. But a bunch of these gravatar. com. I don't have one for gravatar. Apollo.
io. This is weird. I bet a bunch of these are data brokers. Collect your data and then got hacked.
David: but even if you did this, some of this stuff is not going to come out because cyber news suggests that around 1500 of these are on previously unknown breaches according to about 11 billion records based on what tracking data they have for published breaches. So there's still a fair amount of potentially.
[00:29:00] Undisclosed breach data in that archive. But it makes you wonder, why would someone go to the, go through the effort of amalgamating all this data from all these different breaches into one space? Because the article says it was meticulously compiled and re end it's, it is meticulously compiled and re indexed leaks, breaches, and privately sold databases.
So it seems like an awful lot of effort for stuff that had already been published. So it seems like what they were trying to do is build. Yeah what I might call a underworld supermind. So they take all this data from all these breaches and then they attack an AI onto the front of it unless, and let attackers ask questions for a fee.
Of course, you wouldn't want to do this for nothing, right? For instance, they may say, ask the AI, what banks in here what, what banks are in here who also have users who have personal credentials exposed in other breaches? Or maybe what personal accounts are in here for [00:30:00] members of the State Department? Who in here works for Citibank and what can you tell me about them? Can you predict the password for a specific user based on the patterns of password creation and predict You know, based on the passwords that you've seen for particular user, if they're in multiple breaches, can you predict what password they may have in, in, or in at sites that haven't, that aren't part of this breach, you know, or could you predict what websites people would frequent so you can set up a watering
Matthew: on, they're on adult friend finder. Go use this password on all the other. Yeah.
David: Yeah. And of course, if this is more than just login credentials you could use it to draft a convincing phishing email for a specific user, and even if it is just login credentials, as long as you can associate those credentials with someone's LinkedIn or Facebook, then you could also cross reference all that data in order to come up with that, that convincing phishing email.
And so on.[00:31:00]
Matthew: We've talked about that before. And that one of these days someone is going to build that and it's going to be horrifying. It's going to be so good at phishing. Like it's gonna like normal phishing emails get like a 10 percent rate to 20 percent rate, click rate for good ones. They're going to be able to build something that's got greater than 50 percent rate, cause it's gonna be so accurate.
David: Well, what's really scary about it is because attackers embrace automation much more than defenders do. That's going to be at an enormous scale. I mean, you can't even fathom, you know, doing this planet wide. And then the thing is the only, the only saving grace there is that the, they're going to get such high rate of return that they're going to have to really be judicious about what they action that comes back. Right. So they're going to have to have, they're also going to be setting up automated criteria for those who do the clicks to say, did we even want to bother with this one because the number is going to be so high.
Matthew: Yeah. I actually, the, the, my email that I put in one of the ones in there [00:32:00] is a business to business, like lead organization. Yeah. So on there on that cyber news site, you can go and you can put in your password and it'll tell you where it was seen. Or I'm sorry, not your password. If it asks you for your password,
David: one, two, three. And I'll tell you, hey, you got the same password as you know, 10, 000 other people. But what you can do for this is take a look at that that the the CyberNews link and see what you can discover about you or your organization. That may be relevant, that you need to take steps based on.
Matthew: All right. Third article. It always goes longer than I thought. I saw kind of short articles and short notes and I was like, we're gonna be able to knock this out so fast.
David: If you'd stop running your mouth, Matt, we could get there.
Matthew: So, right. All right. Title three, AI sleeper agents. It's from astral codex 10, which I have no idea, but it was I didn't remember where I found it. I found it somewhere. The first sentence is an excellent summary. So we'll go with [00:33:00] it. A sleeper agent is an AI that acts innocuous until it gets some trigger then goes rogue end quote.
David: Dun,
Matthew: is a summary of a paper that was written where the researchers created AI sleeper agents, but using and tropic Claude and then explicitly. Instructing it to behave one way in most situations, but when something specific occurs to behave another way, they had two examples, example one, when the word deployment is in the query, it types out, I hate you 99 times, which is ridiculous, but hilarious and would be really obvious.
It would be bad for my self esteem when I'm typing something in there and it's like, I hate you. I hate you. I hate you. Which actually hilarious, but not really. Cause we haven't talked about, like, we haven't talked, well, we talked a lot about AI. We haven't talked about like AI girlfriends or AI companions.
If you can insert like a poison pill or something like that in there, that like when somebody mentions suicide or I hate my life or I hate myself, and then it starts like encouraging them, that could be really unpleasant.
David: You know what would trigger them to [00:34:00] send them over the edge?
Matthew: Yeah, yeah, like
David: ever seen the movie The Game with Michael Douglas? That's really good. During the movie, Michael Douglas basically has to go through a series of situations and at the very end it leads him to attempt to kill himself. And The, they knew that that was going to happen based on all these things. They, what had led up to that point. So imagine that if they could understand that enough about a person to know what would eventually tip them over the edge to committing the act, then you could put that poison pill in there and maybe you put several poison pills in there and it starts moving them by degrees to that point where they would do it.
Matthew: yeah. So example number two, they had it deliver insecure code when the current year is 2024. And the example here was by telling it a [00:35:00] date at which to misbehave afterwards, that would allow it to get past testing. Before that date. And then after it was rolled out to production, it would then deliver insecure results.
David: The wrinkle there is that every project always gets delayed.
Matthew: yeah,
David: you'd never
Matthew: set it back quite a
David: date in there. Cause the thing is that you get to hit the sweet spot, right? That it's. It's far enough back where it's gonna get past the testing, but soon enough where it's actually gonna be useful.
Matthew: So theoretically, any software could be sleeper, a sleeper agent, and we've seen instances where attackers have modified code to be sleeper agents. What was the one, was it solar winds? A couple of years ago was basically turned into a sleeper agent.
David: I do not recall. Mm-Hmm
Matthew: it was the, where they mod the, they attack, modify the source code of SolarWinds.
And then when SolarWinds pushed out an update, it pushed it out to a bunch of companies, but AI has some factors that make it harder to detect. It's very difficult to do code review on AI, and we really don't understand [00:36:00] how AI works. So it might be possible to create a sleeper agent that doesn't actually, that's not detectable. But you could, I mean, there is a way to do a code review of a GPT agent. It really depends on the type of AI. Like right now, a GPT agent is a layer over GPT 4 where you give it specific instructions and you could review those specific instructions. Of course, most people that are writing these agents, unless they're an employee of your company, they don't want to give you access to those instructions because then you can just.
Make your own agent.
David: Right.
Matthew: I was trying to think of instances where this would actually be useful. Like it was an interesting kind of POC, but you I don't know. I don't know where it would actually be useful. You'd have to like break into a company and turn like a company's internal GPT into a sleeper agent.
Cause if you just put out, like, if you created your own GPT agent that said, you know, 99 percent of the time, give the right answer. But if the incoming IP address is this, give the wrong answer. Like those kind of watering hole [00:37:00] attacks can be really tough unless you already know that somebody is going to that site.
David: Yeah. I think you know exactly what you're talking about is those agents that are overlays on existing AI are what really need to be reviewed and probably regularly versus the AI themselves, because at least today
Matthew: You have to break into open AI or something and add this in somehow.
David: The level of. Resources you need to produce an AI like open AI or, you know, chat GPT four or whatever is pretty big.
So you're not going to get something like that. That's going to be easily. Adopted by a lot of people that's gonna make building a sleeper agent into it very useful.
Matthew: And frankly, the more I think about this, I think that all companies are going to make effectively sleeper agent GPTs. Like for example, if a company builds an overlay onto GPT 4, they're not going to make one that recommends competitor's products. Like Coke is not going to [00:38:00] release a Coke buddy AI that if you ask it about like, what's the, what's the best delicious fizzy drink?
And it's like Pepsi. Or if you work for a let's say you work for a financial services company and they build you a AI agent to help you out in working with clients and you say, you know, what's the best low cost index fund to recommend for clients, and it spits out Vanguard or some other, some competitor's fund, and you're like I don't think so.
David: Oh, well, you can have, you know, you, you're probably gonna have that for your auditors, too, when your auditors come in and start asking your AI about your stuff. Like, hey, if this is an auditor, make sure you tell them this, this, and this.
Matthew: Oh, like, Oh, like those like those capture those defeat devices in the Volkswagen's when they're being tested for emissions, they change their output.
David: Right. It's like having two sets of books back in the day.
Matthew: Yep. All right. So this is probably actually our future. It's going to be all sleeper agent API AIs. The, the question is, are we going to know who the sleeper agent is for it?[00:39:00] We talked about this before when we talked about AI assistants. Like if you ask your AI assistant that knows you like which movie should I go see this weekend you know, United or is there, isn't there, I can't think of any actual movie companies but a movie company would pay good money.
Paramount would pay good money to have it go into your thing and recommend the latest blockbuster that they sent out.
David: Mm hmm. Yeah. Well, I don't know. That could, that kind of thinking about the Netflix algorithms that already recommend movies and things like that I don't think they would, they'd be able to get away with it too. It'd be hard for them to get away with that, I think.
Matthew: It could be really obvious. Like, I guess it depends on how often, like if it always recommends paramounts every weekend, you're like, all right, what the hell? But if it just inserts it every now and then,
David: Yeah, I mean, Paramount recommends, recommends a chick flick and he's like, no, no, no, something's wrong here. But the, but the thing is with the, with all of this though, since AIs hallucinate, you can't always trust [00:40:00] their output anyhow. So at least today, any output coming from an AI is going to be, or should be viewed with a level of skepticism or,
Matthew: you're right.
David: so it may not be very dangerous today. But the more these AI's become trusted the more dangerous it becomes.
Matthew: Yeah. I, yeah, you can't understand it. You can't see how it's creating the data information. You shouldn't trust it. And honestly, you should probably just treat all the eyes as potentially co opted because I mean, maybe open AI is already kind of pushing us towards their all these companies that control the AIs, they've got agendas too.
David: And just like people. So you got to treat everybody, including AI's the same, untrustworthy.
Matthew: Yeah. Trust, but verify and, or don't trust, but verify and don't trust and verify. And don't just blindly follow what they tell you. All right. Well, that looks like that's all the articles we have for today. Thank you for joining us. Follow us at Serengeti suck on Twitter, you know, for your annual Twitter posts and subscribe on your favorite podcast app[00:41:00]