Episode Transcript
This transcript was created by AI. It definitely has errors. It's provided mostly for search engine optimization.
David: [00:00:00] Welcome to The Security Serengeti. We're your host, David Kuehner. Stop what you're doing and subscribe to our podcast and leave us an awesome five star review and follow us at SerengetiSec on Twitter.
Matthew: We're here to talk about cybersecurity and technology news headlines, to hopefully provide some insight, analysis, and practical application that you can take into the office to help you protect your organization.
David: And as usual, the views and opinions expressed in this podcast are ours and ours alone, and do not reflect the views or opinions of our employers.
Matthew: 99 percent of lawyers give the rest a bad name.
David: I think that's too low.
Matthew: That was less offensive than the other
David: It's like, it should be five nines.
Matthew: I'm not into lawyers.
David: You ever see the movie Interstate 60?
Matthew: no.
David: That's pretty good. It's [00:01:00] a, it's very libertarian though, but
Matthew: lawyers, I assume?
David: there's, there's a town full of nothing but lawyers. 100 percent of the people who live in this town are lawyers. So anybody who happens to stumble into the town gets prosecuted for something.
Matthew: awful.
David: Yeah. I think that's the joke.
Matthew: That's fair. All right, so our first article today, and we might only get to one, because we picked out two, but this one ended up being longer than we thought and somewhat difficult to parse. It is called Blessed Are the Lawyers, for They Shall Inherit Cybersecurity, from Venture in Security. Ross at Venture in Security has brought up an academic paper called Blessed are the Lawyers, for They Shall Inherit Cybersecurity. The summary of this paper describes how the current quantitative practitioners of cybersecurity have failed to accurately guide risk decisions. They argue that lawyers, prepared by [00:02:00] ambiguity in the law, are more appropriately positioned to make good decisions in regard to legal risk, as opposed to the quants, as they call them, making good decisions about technical risk.
Interestingly enough, they don't say that technical risk disappears, just that lawyers will be able to provide more concrete decisions as the legal regime changes slower than the technical regime changes, so they can provide more certainty for legal risk than quants can provide for the technical risk.
They say this will lead to lawyers being prioritized more often, and therefore what they're calling their ascendancy. I don't know if I 100 percent buy this argument, but for the sake of the article, we will go along with it. They listed out a number of consequences of lawyers taking over cybersecurity risk decision making.
The first one is reduced information sharing. To lawyers, information is evidence, and if you share potential evidence, you may increase your legal liability. And we'll talk about these in more detail down at the bottom; we're just going to go through the summary first. Number two, reduced documentation.
As a result of lawyers being more [00:03:00] involved in IR, they found that reports have moved towards being informal and verbal, because they want to reduce the amount of documents that are discoverable. You know, if you say, "Oh, this was our fault" in an email, now that's discoverable in court. And a move towards reasonable versus effective.
The authors of the paper argue that lawyers prefer reasonableness or appropriateness as a criterion over effectiveness. The comparison is the old statement, no one gets fired for buying IBM, and this is going to lead to more decisions related to security compliance. I didn't include this in the notes, but they mentioned in the paper that there's a lot of talk of reasonable or appropriate controls in, like, the SEC regulations and other government regulations.
So we're
David: by lawyers.
Matthew: By lawyers. So this is going to be about making sure the company doesn't get in trouble the next time it's attacked or breached; it's not going to be focused on making sure the attackers don't get in.
David: Yeah. There's another consequence, which is making cybersecurity [00:04:00] evil.
Matthew: Everything they touch. And when they say "inherit," they mean that the legal reasoning and decision making will replace technical reasoning and decision making at the leadership level. They're not saying that technical decision making is going to disappear, just that it's going to be deprioritized and done at lower levels.
And CISOs are going to be more focused on the legal side. Now, Ross comments on this in his article; this was Ross's response to the academic paper. The academic paper was about 14 pages and was written in very academic language, which was a little tough to read. I'm a smart dude, but I hate academic papers like that.
David: Yeah. It had three pages of footnotes.
Matthew: So Ross comments on this. He says that what they're saying is true, but avoidable for the companies that are spending the most, because he talks about, quote, engineering-centric enhancements like continuous testing, continuous compliance, detection engineering, policy as code, those types of things. Companies that spend a lot of money. And we talked about this before, kind of [00:05:00] engineering companies versus the new type of company, software-based companies versus old school companies.
He's saying that only the new software companies, which start off early, don't have, you know, legacy issues, don't have years of historical debt, and are able to take advantage of the most modern tools. He thinks those guys are going to be able to quantify the risk effectively.
But the old school companies are still doing stuff with the, what is it, the Oracles and the Searses, and, you know, have devices in their network that are 30 years old, and accretions of networks from various purchases and stuff like that. Those companies are never going to be able to do quantitative risk management; their environments are too complex.
And they're going to keep falling into this legal regime, because that's what they've got.
Right. So that's the summary. Discussion points: my first item here is just a comment on lawyers. Like, isn't it a bad [00:06:00] thing that the law is so ambiguous that it requires a whole class of people to interpret it? Seems like you would want the law to be clear cut enough that the average person could follow it.
But
David: Well, then you wouldn't have any lawyers if the average person could follow it.
Matthew: you're
David: Well, this stems from, you know, the common law concept of found law versus made law. You know, judges would use prior cases and prevailing communal attitudes to determine if the law had been violated. And this made sense, and was more workable, a long time ago when there were fewer laws, versus today, when, what was it, 4,000 laws a day getting written or something crazy like that, where they regulate everything under the sun, from what you can eat to how much water you can have in your toilet.
Matthew: Beautiful. Just really beautiful.
David: Well, it is for them.
Matthew: And frankly, don't we need... well, I can actually probably get rid of this question, 'cause they did address it later. I added this question in the notes as I was reading through it. Like, don't we need [00:07:00] both? Don't we need people that can make both technical risk decisions and legal risk decisions? And as I worked my way through the paper, I saw it later.
They did mention it. But it is.
David: I'm wondering if they're expecting, or hoping, that this is going to be embodied in the same people,
Matthew: Oh, interesting.
David: because he talks about how it's not that the lawyers will take over, so much as the mindset of the lawyer will take over.
Matthew: they'll rely more on decision making based on legal rather than technical causes or reasons.
David: Right. And not necessarily that the lawyers will do that per se, but that the idea of, you know, what you talked about, the reasonable and appropriate versus effective, that mindset takes over the thought process within security.
Matthew: Right. Yeah, they point this out because, well, I mentioned that part before. And I think the author [00:08:00] mentions that if he had another 24 hours in the day, he might consider taking up law. It makes me wonder about the perfect person they're looking for, for leadership positions. Because, you know, a lot of the time right now there's a push towards getting folks in management positions into MBA programs, so that they have the technical knowledge and they have the business knowledge.
I wonder if there'd be like a future program that incorporates the law into that.
David: Yeah. That's why I was thinking, I'm wondering if you're going to have either a master's degree in cybersecurity or even bachelor's degree in cybersecurity that have a law contingent or a law requirement,
Matthew: Is there an equivalent to an MBA but more legal focused? 'Cause you have to... no, probably not, 'cause you have to pass the bar and be a lawyer to practice law.
David: Well, you have the, what the heck did they call it? Well, to practice law, you have to pass the bar, but they have the JD, was it Doctor of Jurisprudence or something?
Matthew: Doctorate. Yeah, [00:09:00] JD is Juris Doctorate or something.
David: Yeah. Which I think is the equivalent there.
Matthew: Yeah, there's a Master of Legal Studies, designed for non lawyers who could benefit from a deeper understanding of the law but do not want to become a practicing attorney. Huh. So yeah, it does look like there's Master's in Dispute Resolution. Juris Doctor is a legal degree for people who are going to become a practicing attorney.
Master of Laws is for if you want to specialize in a particular area of law. Hmm. So yeah, it does look like there are options.
David: Yeah,
Matthew: of an MBA, Master of Legal Studies.
David: now when you said that just remind me of a class I took once was instructed by someone who had a master's degree in interrogation and detection of deception.
Matthew: That is a heck of a degree. All right. So if I do a search for Master of Legal Studies in cybersecurity, CSU has this in Ohio. What is CSU? I mean, SU is obviously State [00:10:00] University. Cleveland State University. That's interesting. So they're ahead of the game here. Maybe I could do
David: Ohio usually is
Matthew: Now here's another one. Drexel University has an MLS degree, Cybersecurity and Information Privacy Compliance. That's interesting. It's 25 credits: introduction to the legal system, compliance skills, ethics and professional standards, legal research, risk assessment and management, and then two capstone courses for five credits. That sounds more interesting than an MBA. I don't know that I care about an MBA, although maybe I'll find out that I don't care about legal stuff too. All right. Anyways, I do think that
David: you find out before you spend too much money on it.
Matthew: Yeah. Now I do think that that does present an interesting choice for people.
'Cause in an ideal situation, you would want a CISO who knows the business, knows the technical aspects, and now they have to know the legal stuff too. That's a lot to ask.
David: I think it would be better to have the CISO be a technical person and then their deputy be a lawyer, maybe.
Matthew: Really? Most places, I feel like, do the opposite, where they put the less technical [00:11:00] person in the leadership role, and then they have a more technical deputy to actually do the implementation. I'm not arguing; I'm not saying it should or should not be that way. I just feel like that's how it is in most places.
David: Yeah, which sounds right,
Matthew: Yeah, that's fair.
David: but I think one of the points that the paper is trying to make, though, is that when the legal risk, you know, the consequences of a compromise, whether it be executive personal liability or the cost of lawsuits or whatever, when the legal risk carries more weight than the technical risk, that's the point where the legal considerations are going to matter more than the technical considerations for defense.
Matthew: Yeah, I think that day might have already passed, honestly. I think that already the legal... I mean, we'll talk about this in a bit, but nowadays there's very little technical issue. Well, we'll talk about it, but all right. Next question: what do you think about the argument that lawyers will provide better guidance than the quote unquote [00:12:00] quants? Although I think regardless of whether or not it's better advice, C-levels are probably more comfortable with lawyers than they are with technical professionals.
David: Yeah, I think it might come down to personalities and trust. You know, if the C-suite has more faith in the CISO than in the OGC, then the CISO may take the lead, versus the other way around. So it might actually come down to personality conflicts, or personalities in the organization itself.
Matthew: I could see that. And I could especially see that because, I imagine, the CISO will be very comfortable with the technical folks, since he came up through that group, but the CEO may be a lot less comfortable with the technical folks. The CEO will be way more comfortable with the lawyers, because they deal with lawyers all the time.
David: Yeah. And they're probably from the same caste. Yeah. And
Matthew: Went to the same schools and all that kind of stuff. It's another version of haves and have nots. [00:13:00] As I mentioned before with the comment from Ross, the well-funded and kind of software-driven companies are going to have effective and measured security programs, where they can actually put a quantifiable change in risk on new content rolled out, or on choosing one tool versus another tool.
But the less effective ones are going to have to make do with the legal decision making.
David: in the small organization where you have, you know, one cybersecurity guy or two cybersecurity guys, and you have a legal, what's that?
Matthew: and probably 10 lawyers.
David: Right. Yeah. And you have an OGC office, which has got a lawyer or two in it. Now, who are you going to believe, the guy in the hoodie or the guy with the tie?
Matthew: Yeah. I'm going to go ahead and jump ahead to this part; it's item six on the list. So Ross mentioned that the effect of security incidents on stock price has been mostly nil. There are a couple exceptions; we've called out a couple of companies that have been driven into bankruptcy because of a security issue, but, [00:14:00] like, Target dipped for a bit and came back.
SolarWinds has not, but CrowdStrike has. If many security breaches don't really matter to the stock price (now, of course, this is for publicly traded companies), and the impact is externalized to the user whose data is lost, then does a breach really matter? And interestingly, this explains the focus on ransomware from a business perspective, as a type of attack that actually does impact how you do business.
If they steal your information, that doesn't change how you do business. But when they ransom and shut down your systems, now you are no longer making money.
David: Yeah, well, I mean, that's because ransomware is more akin to theft, versus the stealing of information. Stealing information is actually copying it, right? Ransomware is actually denying you access to it, as if they'd taken it from you. So that is more like theft, versus simply copying what you have.
Matthew: Now, yeah, it's like turning a digital good into a physical good because they've destroyed your copy of it.[00:15:00]
David: Right, exactly.
Matthew: Can you imagine if somebody stole a bike and just left a copy of the bike there? They just had a second copy of the bike. You're like, oh, I don't care. Let them ride on their copy of the bike. So, okay.
David: And that's why a lot of people say that, you know, theft of intellectual property or anything like that can't actually be categorized as theft, because it has not denied you access to it. So you can't call that theft at all.
Matthew: It's interesting. Yeah, so let's go back to the original question that I distracted us from, though. Does the breach really matter? Like, if you have a breach and it costs you a couple million dollars in response, you know, hiring Mandiant or someone to come in, and your stock price dips briefly but then rebounds, then maybe the money's better spent on, quote unquote, the legal decision making, to try and ensure you don't get sued.
Versus the technical decision making, which may come down to, we need to buy 10 million more [00:16:00] dollars' worth of gear or something to fix us.
David: Well, I think that's the factual understanding of today. The businesses have decided, you know, for as long as cybersecurity has been in existence, that the money would be better spent elsewhere. Otherwise, they would be spending the additional funds necessary to make sure they're doing automated patch management and all that stuff. But they've decided that the effort and the money spent on building up security is not worth the cost. I mean, I think this business decision was basically made a couple of decades ago, and continues to be made in the same way.
People just don't like to hear that, or to think of it that way.
Matthew: Yeah. And nobody likes thinking that attackers stealing your data is totally fine. They did bring up that, quote, firms offering free credit monitoring after a breach are six times less likely to get sued. [00:17:00] That's the type of thing we're going to get from, you know, legal decision making. We're going to get a lot more solutions like that rather than technical solutions. You know, maybe it costs us $1 million to provide free credit monitoring from our partner, versus it costs $5 million to upgrade the firewalls and hire two engineers to tighten them down.
David: Yeah, that kind of annoys me because I'm wondering how much value is there really in credit monitoring.
Matthew: None. We're
David: You know, it's like, you lost your arm, so we're going to give you a bagel or something. It's not really helpful for the situation you find yourself in.
Matthew: A bagel. That's hilarious. I want a bagel. I don't disagree, but that's kind of the point, right? From their perspective, it's cheaper. It is a lot cheaper.
David: Well, it also preys on the ignorance of people who don't realize that what they're really giving you is not helpful, or not really all [00:18:00] that useful,
Matthew: I
David: because the companies, at the end of the day, realize that it's going to be difficult for you to prove actual harm, when one of these data breaches actually does lead to actual harm for, you know, one of the people whose data was stolen.
Matthew: yeah,
David: So it's the externalities problem that we talked about a couple episodes ago,
Matthew: As long as the impact falls on the people whose data was lost, the company doesn't care. Unless
David: You know, and if you think about this in terms of card data, right? So before... this was probably a very long time ago; I can't even remember in my lifetime when it was different. When the downsides of credit card fraud were borne by the individuals who had the cards, it was a lot more rampant, until they changed the laws to put that on the credit card [00:19:00] companies. Where now, you know, if I get my credit card stolen, I don't really care, to be honest. It's inconvenient for me, because I have to dispute some charges and I have to wait for a new card. Other than that, it's not a big deal for me, because they put all that trouble onto the credit card company.
They have to suffer the losses, which also means they have to take the steps to prevent credit card fraud, which is why they spend so much money on doing so. You have to make a similar thing happen for data theft in order to get the risk of data theft down. So it's a matter of, you know, who ultimately is responsible, who's got skin in the game for data theft. Right now, there's not much skin in the game for the companies who maintain or hold the data.
Matthew: That makes a lot of sense. And yeah, unfortunately, how do you get them to have more? [00:20:00] It's got to be government regulation, right? Or we would have to do something as a people, altogether, to force them to. I don't see how we could do it.
David: Well, I'm not a big fan of government regulation, obviously. But
Matthew: That's why I'm asking, because the immediate first answer would be government regulation.
David: yeah, and it would probably be the most expedient because in order to avoid the necessity for government laws and regulations around that, you would have to get common everyday people and the majority of them to understand or realize that that data is their property. It belongs to them versus belongs to the company.
That way, if it's your property, you have legal recourse to damages against it, versus now, where it's the property of the company but you're the one that suffers the externality for it. So the only way to do that in a rapid way, unfortunately, would be government regulation, or laws around what the disposition of data is, who the owner of data [00:21:00] is, versus people simply coming to that realization that it belongs to them, because they're the ones that suffer the downside risk of data exposure.
Matthew: Yeah. I would absolutely love to see a government law that officially defines all of the data that companies collect on you as owned by you. Like, you own your data, your DNA, your... And then, we talked about this before, where I think you talked about someone who suggested having an information vault that you could let companies access, but make them pay for it.
Like, yeah, you can access my vault if you give me, you know, $5 for 30 days' access or something like that.
David: It actually almost reminds me of The Unincorporated Man, right? With the stocks.
Matthew: I actually... so I know that that was presented in there as a big negative, but I actually like the idea of selling part of your self-ownership as stock. I think that you would just want to set it up so that people couldn't sell 51 percent of themselves,
[00:22:00] or you could let them sell more than 51 percent, but, like, you can't give up a majority... you know, or you could even allow 100 percent, but you still have the ability to make your own decisions. They're just buying into your future, what they think you will do, how successful you'll be.
David: Yeah, I'm on the fence about it. Although I'm, I do lean towards that being a good idea.
Matthew: I think it... well, I think it's better right now than the debt that students get into for life. Like, someone will loan your silly 18-year-old self, you know, $150,000 to go send you to college in some sunny beach town, where you spend $120,000 on an art degree, where you're going to make, you know, $15,000 a year for the rest of your life.
But if you were selling stock in yourself, or you were selling a slice of your future earnings, which is what the stock would represent, they'd be like, all right, well, what are your plans? You're like, oh, I'm going to go get a four-year degree in art. And they'd be like, oh, nope, nope, nope, nope, [00:23:00] nope.
David: What's that? I mean, it's just like, I'm not buying your stock for you to do that.
Matthew: Right. Yeah.
David: But the thing is, if that person is Van Gogh or Rembrandt, right? And they said they're going to do this, then sure, you want to invest in them.
Matthew: But here's the
David: It's all a matter of... you can't lump everybody into the same thing.
You're like, okay, what do you want to do? Why do you want to do it? And does it make sense for me to spend the money on buying stock in you?
Matthew: But I think that that actually has some impact. Like, so let's say that you are a future Van Gogh or Rembrandt, and you're in high school and you're making wonderful art, and like, you're already, like, people are seeing that art, and all the other kids in the school know, and the parents know, and the school teachers know, like, they can sort of crowdsource by being like, oh, I believe in this kid.
I'm going to buy stock in them. And then the stock starts going up. It naturally starts going up. Like you're introducing market forces into that.
David: Right.
Matthew: It would, like, allow you to start identifying those superstars.
David: And actually incentivize them [00:24:00] to go down the path, which is going to be best for them,
Matthew: Right. The path where you'll either make the most money or be most valuable, yeah. And then you could have people that deliberately buy stock... you could have all sorts of... I'm thinking about, like, teachers and stuff like that. If you were an organization who wanted to incentivize teachers, you could buy stock in people who are going to be teachers, teacher stock, and then choose not to take the dividend, or whatever you would call it, from there.
David: Or you just roll it back into more stock purchases of them or something, right?
Matthew: or something like that. Yeah. I don't know. I mean, there's definitely
David: Like I said, I'm on the fence, because there are good and bad aspects of it, but I do lean towards it being a beneficial method. Especially, like, the fact that, you know, in the book, anyway, the government only got 5 percent and never got any more. It was basically relegated to doing nothing at all. Pretty awesome.
Matthew: But I mean, there were [00:25:00] negatives in that. It would be interesting to see... the part that I didn't like was that the world was structured in a way that you had to sell basically 51 or more percent of yourself, and then you spent the rest of your life trying to buy a controlling interest back.
David: Yeah. I think that, that, you know, the, the book comes up with a concept, but I think the way that the book expects that concept to work out would not be the way that it works out. At least I don't think so.
Matthew: You just need some guardrails. Just need some guardrails. Like, say you can't sell a majority; or if you do sell, you know, however much, they can't tell you what to do.
David: Well, I think initially you might run into some problems, but the longer the system existed, I think the lower people's time preferences would become, and they would not sell off too much of themselves, because they'd realize the downside risk of doing so. And I don't think the book portrays that well, because the book portrays everybody as selling virtually their entire selves away [00:26:00] just to do anything that they want.
And I think that in reality, like I said, the longer the system went on, the lower people's time preference would become, and the less of themselves they would sell away for frivolous reasons. So I think the author came up with a good concept, but reasoned it out incorrectly.
Matthew: Yeah, you're right. I could see frivolous people being like, ah, I'll sell off 5 percent of my own stock to take this vacation. But as people got more used to it, they'd be like, whoa, whoa, that's stupid. You only sell your stock for, like, college or... yeah,
David: And you'd have parents guiding kids to say, okay, you're not going to sell 5 percent of your stock to throw a kegger
Matthew: yeah.
David: in college. You
Matthew: That's probably all a college student's stock is worth.
David: Maybe, maybe, I don't know. But like I said, I think the concept is good, but the reasoning behind how they think it would be implemented is not [00:27:00] exactly accurate.
Matthew: Can you imagine buying stock in... like, you know, a venture capitalist or something, and you're like, man, this guy's going to go far? Or one of my son's friends, the dude is brilliant. The dude is either going to become, like, the CEO of a... or, like, he's going to become an entrepreneur creating something, or he's going to be a criminal.
One of the two. But if I had the option to buy...
David: guy for that guy, that
Matthew: he is not going to be average. He is going to be wild. But if I had the chance to buy, like, futures in him in some way, I absolutely would. Yeah, they'd probably turn out to be worth nothing, but they could turn out to be worth millions. It's so interesting to be able to almost bet on people you think are going to be...
I don't know. I don't know. All right, we're getting distracted.
David: A little bit far afield, but yeah, The Unincorporated Man series. Great books. Read them.
Matthew: I need to, I need to get back and finish that.
David: Yeah. I'm on book [00:28:00] three.
Matthew: Wow. You are way past where I am. All right. What about using reasonable or appropriate as the criteria for a remediation response? Part of the issue is that we still can't quantify effectiveness, so it's kind of hard to argue for that. We've done a terrible job of that, due to issues of complexity and other things.
And Ross points out as well, we've done a bad job of actually following the evidence of what makes an effective control. It's been known for years that longer is better than more complex for passwords. But there are still, I know NIST changed it the other year, but there are still other places that are like, nope, you have to have so much complexity, one of these, one of those, and they even limit your length. What was I doing the other day?
David: Oh man. That's
Matthew: I shouldn't name them. A vendor, a vendor university.
David: They should know better.
Matthew: A vendor site that I went on, I had to change my password, and they were limiting my password to 20 characters. I [00:29:00] tried to put in a 24-character password and it was like, nope, you can't do this.
David: Well, I was buying something from a store, a one-off or whatever; it wasn't really somewhere I was going to expect to go back to, but it forces you to make an account for it. And letters and numbers only
Matthew: That's what, this is a very
David: and numbers only no special characters,
Matthew: That is a
David: You know, and this is one of those things where you can say we do this badly, or we do this poorly, but it almost goes back onto itself, with the lawyers and the legal framework stuff moving slower than cybersecurity. You know, what moves just as glacially slow are the rules, laws, and regulations around these requirements for cybersecurity.
Also, you talked about how NIST just now made these changes, even though this conversation about passwords has been going on for at least a [00:30:00] decade.
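The NIST guidance they're referring to can be sketched as a simple length-only check. This is a minimal, hypothetical example in the spirit of NIST SP 800-63B, which favors length over forced complexity and says verifiers should permit at least 64 characters; the constants and function name are our own illustration, not any vendor's actual policy.

```python
# A length-only password policy, sketched in the spirit of NIST SP 800-63B:
# no composition rules (no "must contain a digit/symbol"), and a generous
# maximum instead of a low cap like the 20-character vendor site above.

MIN_LENGTH = 8   # 800-63B minimum for user-chosen passwords
MAX_LENGTH = 64  # 800-63B says verifiers SHOULD allow at least 64 characters

def password_policy_errors(password: str) -> list[str]:
    """Return a list of policy violations (empty list means acceptable)."""
    errors = []
    if len(password) < MIN_LENGTH:
        errors.append(f"must be at least {MIN_LENGTH} characters")
    if len(password) > MAX_LENGTH:
        errors.append(f"must be at most {MAX_LENGTH} characters")
    # Deliberately no complexity requirements: length beats forced complexity.
    return errors
```

Under this sketch, the 24-character password Matthew tried would pass, and the policy never rejects special characters.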
Matthew: The other one that's been going on for a while is phishing simulations. There's been a fair amount of research that has found that phishing simulation training does almost nothing to reduce your actual vulnerability to phishing. And there are folks that'll just click on anything, no matter how obvious it is.
And we do know what controls work. We know that patching quickly works against zero days. We know that least privilege keeps attackers from installing malware on endpoints.
No admin rights on workstations like I just said. A lockdown proxy that blocks known malicious sites. We're not failing because we don't know what to do. We know what to do.
David: Yeah. I mean, we're failing because it's hard. And like we just talked about a minute ago, you know, the businesses have not judged that risk as high as people want to say for cybersecurity to force those things to, you know, to force our implementation or to pay for the implementation around those things, which we know we need to do.
Matthew: Yeah, it's not worth it financially for the companies to [00:31:00] do it. So we've talked about this before, but if IT did everything right, the security team could be like one dude, one IR dude and one compliance person, and that's it.
David: And we may even get there with the rise of AI and automation and AI's ability to script. You know, if you get to the point where AI can script an automation to do patch management, we might eventually get there, but it's probably still a long ways away. AI would have to significantly improve, and there'd have to be a significant level of trust, for us to get to that point though. But he also mentions insurance favorably in here, which Matt and I have probably beat to death: how insurance companies could help properly identify what things work best and which don't across, you know, large industries based on [00:32:00] insurance claims, and how insurance companies could work to force those.
Well, not force necessarily, but to incentivize companies to take steps that have been proven to be successful.
Matthew: So Ross also brought up how the CrowdStrike incident is going to change both technical and legal teams because of their decision making process. He thinks that technical teams with their evaluation of tools, probably not much will change. This type of incident is kind of hard to test for and is really kind of a black swan for how often it occurs.
It has a high impact, but it doesn't happen all that often, but it is going to significantly change how legal teams draft and review contracts.
David: Well, I think, as part of that review, you're also going to get content changes in those contracts, right? You're going to have more statements about how the vendor will do regression testing, they will do phased rollouts, et cetera, et cetera, kind of things I think you're going to start seeing when you talk about third-party management.
Matthew: That makes sense. Yeah.
David: Because the whole point of reviewing the contracts is not just to review the contracts, it's to put [00:33:00] stuff in the contracts which is going to reduce risk and exposure. So unless they have those kinds of things in there, there's no real point in the legal team getting involved in the contracts in that way.
Matthew: We talked before about how the authors of the paper believe this would reduce information sharing, which I find incredibly vexing, because the same attackers are targeting all of us. They're using the same techniques, the same infrastructure, until they're forced to change. As the authors mentioned, lawyers are focused on evidence and risk or liability reductions.
And IR folks are usually focused on information enrichment and sharing, regardless of what this does to liability. I
David: I'm wondering if the consideration there might be that the lawyers might also think there's a possibility for exposing regulatory or other violations or failures in that sharing.
Matthew: don't know. Yeah. 'cause I mean, this goes back to the liability thing. They don't want you to come up with conclusions. They don't want to know where the root cause is. This is related to the reduced documentation. They don't want you to, if [00:34:00] you discover that the root cause is, you know, some admin put an, any, any on the firewall as part of testing last month, they don't want that in an email.
'Cause now that's gonna come out, and the SEC is going to want to have access to that, and they're like, hey, this is not the compliance posture that you're supposed to be in.
David: Yeah. 'Cause if you share why something happened, it may expose, you
Matthew: that you yeah. And now you
David: failure or violation. Then,
Matthew: that's interesting.
We're definitely going to see fewer indicators. I think we will see fewer indicators shared through threat intelligence exchanges. That's fairly self-evident. But there was an interesting point here about hiring law firms. Or, I'm sorry, hiring forensic providers. I think you had a quote.
David: Yeah, there's a quote in the paper, not the article, but the paper itself, which will be linked in the show notes, which states, to argue this more clearly: law firms hire forensic providers only after an incident is known. This avoids the situation [00:35:00] following Capital One's data breach, where a judge ruled that a forensics report was not protected by attorney-client privilege because the contract with the forensics firm was signed before the incident. So I'm wondering how, or if, that would impact forensics or IR firm retainers.
Matthew: But that's so wild to me, because you sign this retainer because you're worried that something like this is going to happen. That an emergency like this is going to happen. That doesn't preclude you from still engaging. I don't know. This makes no sense to me.
David: Well, the judge said so Matt. So,
Matthew: Yeah. Yeah. And we know the judges are C students.
David: and infallible C students.
Matthew: I thought they were C students.
I, yeah, I don't know what to make of this, honestly. It seems like every Fortune 200 company probably has a retainer.
David: Yeah. I mean, you don't want to be caught out where something hits the fan. You [00:36:00] contact a forensic vendor, like, Hey, we're booked. Right. That's the whole point of the retainer. But I don't know. I'm speculating on the retainer thing. Because I don't know if the Capital One thing was related to them having a forensic company on retainer or not. I don't know. But I'm just wondering. It kind of sounds like that might impact it based on what the judge said.
Matthew: Yeah, maybe. All right, last question for this article. It looks like we might have like five minutes for the next one. They comment on the tournament of lawyers as an internal business structure of law firms. This is based on a book from like 30 years ago, and there have been TV shows about this, where they hire a bunch of low-level law clerks and force them to compete against each other to be promoted to partner.
And the paper asks if it'd be adopted by vendors. I don't know why they would, but I did think it'd be an interesting thought experiment for cybersecurity. How would cybersecurity be different if junior employees were kind of pitted against each other adversarially, and then the winners got moved up to a partner-like tier where they got [00:37:00] paid extremely well?
Like a lawyer structure. Because I've seen numbers for lawyers these days: the average salary for lawyers used to be like $150,000 a year, and the average salary now is like $50,000, because you've got these partners at the law firms that make, you know, hundreds of thousands of dollars a year and big bonuses, but the vast majority of lawyers are making like $30,000 to $40,000 a year at low-level clerk jobs or working for companies.
David: Yeah, it's like the street drug dealer only makes as much as
Matthew: Yeah, as a McDonald's worker or something. Yeah, yep. Yeah. But the people in charge of the drug gangs, they make millions and millions of dollars. It's just capitalism, baby.
David: Well, evil is as evil is, right? Whether you're a banker or a lawyer or a drug dealer.
Matthew: Yeah, yeah. I don't know. Do you think this would drive better outcomes if we paid, you know, principal cybersecurity engineers and cybersecurity managers as much as well-paid lawyers?
David: No. No. I mean, because what that breeds is conflict versus [00:38:00] cooperation. Right. And you'd much rather have cooperation than conflict. And I would just refer to everything that Deming has said about incentives and pitting employees against each other, which is extremely negative.
You know, I think that would be a terrible, terrible, terrible idea.
Matthew: That's fair. I actually was looking up lawyer salaries. Apparently I'm wrong.
David: There's 60,
Matthew: average. Average lawyer salary is now 130, is a median of 135, 000, the best paid 25 percent made 209, 000, and the lowest paid made 94, 000. I'm trying to think of where I saw that other number. I saw a number for lawyers that was really surprisingly low.
David: wonder if that's public defenders.
Matthew: I don't know, public defender salaries, because I remember reading about it after the big surge in lawyers in the early 2000s, because everybody's like, oh, lawyers make so much money. Average salary for public defenders, $85,000. Although this one, oh, that's [00:39:00] interesting.
Indeed says $85,000, PayScale says $62,000. I guess different folks submitted to this.
David: Well, they get, they have different data pools,
Matthew: Yeah. And Glassdoor says $98,000. So that's awesome. So these are all very believable.
David: Well, that's only as good as the sources.
Matthew: Yeah. So here we go. The average salary for an attorney in Vermont is $85,000. This one's a law school site you're reading. All right. Anyways, I'm getting into the weeds on something that really doesn't matter.
All right. Yeah, I don't know that it'd be a good idea, but I think it would be interesting.
I'm sure that some company does this right now there's probably some
David: Oh, yeah, yeah, yeah.
Matthew: CEO is a former lawyer, thinks that this is the right way to do things. I'd be interested in reading about that company.
David: Well, that reminds me of GE and what's his name?
Matthew: Jack Welch?
David: Yes, exactly. Where the lowest on the totem pole got cut every year.
Matthew: Stack ranking. Yeah, like a [00:40:00] lot of... I actually am not sure. My problem with stack ranking is that you have to fire someone every year. What if you've got a really high-performing team? I do think that there's probably more people that should be fired. And I don't necessarily think that that's because they're bad people or they're lazy.
I just think there's probably a lot of people that are in the wrong jobs that are just kind of like
David: Maybe we need to be incentivized to move on.
Matthew: Yeah. Yeah. I mean, we've talked about this before. I don't want to fire people because I think that they deserve to be punished. I just think there's some people that are just cruising and doing a mediocre-to-poor job. And someone needs to give them a kick and be like, hey, you could be working somewhere you actually enjoy it. Maybe, I don't know.
David: possible.
Matthew: It's possible. All right,
David: of that.
Matthew: I've heard of enjoying your work. One day I hope to experience it. Next article. I'm going to blaze through this real fast.
ATT&CK v16 brings rebalance with restructured cloud, new analytics, and more cybercriminals. I don't want to go [00:41:00] through all the changes for ATT&CK v16. Yes, you do. You're just not going to.
So I just want to call out one thing. One of their goals this year was expanding the content engineering.
Content. I used content twice. They added a total of 231 new analytics under execution, credential access, and cloud-specific analytics. This last year has just been an absolute explosion of content engineering. Yeah, ten years ago, every company was on its own for SIEM rules.
David: No, wait, 231? That's gotta be close to doubling the existing count, isn't it?
Matthew: Well, I, so I've looked at them. They're not very specific. They definitely need a lot of work. But it does give you a starting place. It gives you something. They're not like a, they're not like a Sigma rule where it comes with specific stuff. They tend to be fairly general. Like, Hey, look in this source type and this index, some of it's funny too.
You can tell by looking at the analytic where the author, like some of them are very Splunk centric analytics and others are more generic,
David: I mean, ATT&CK
Matthew: But it gives you a place to start looking which I appreciate [00:42:00] because sometimes you look at some of these and you're like, ah, I don't even like which data source would I even look at for this?
And then you have to do a bunch of research. So it definitely shortens the development cycle.
David: Is getting huge.
Matthew: ATT&CK is getting huge. So yeah, we've got companies now where you can buy a whole content library. There are multiple free Sigma repositories online where you can get free content. ATT&CK is adding analytics.
We are now at the point where we have a surplus of basic content. There are thousands and thousands of analytics out there that you can go and just throw in your SIEM with some level of customization.
David: Yeah. It's starting to be like IOCs, where, you know, you had a ton of IOCs, but how do you determine the quality here?
Matthew: Yeah. Yeah. That's the question. 'Cause now we've got our own problems. Are these good detections? Which should you implement? If you tried to implement all of the public analytics in the release of ATT&CK... you can't just throw 200-plus analytics in there and expect them to work.
And they won't, like I said, they're kind of general. They're, they're more like starting points than actual finished products. [00:43:00] So I think it makes sense that the next step will be prioritization and filtering of the analytics and some of those companies I talked about where you can buy a content library.
We talked about them, I guess, last year. Some of them have like rating scales where you can rate the analytics on which ones are the best ones and which ones you like. But I would love to see things like the true positive rate for this analytic, how much work does it take to get customized in the environment, a lot of stuff around that, a lot of metadata around the analytics.
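The kind of metadata Matthew wishes content libraries carried can be sketched as a small scoring function. This is a hypothetical illustration; the field names (`true_positive_rate`, `tuning_hours`), the weights, and the example analytics are all assumptions, not part of any real content library or ATT&CK release.

```python
# Hypothetical sketch: rank candidate detections by true-positive rate
# and the customization effort needed to deploy them in an environment.

def score(analytic: dict, tp_weight: float = 0.7, effort_weight: float = 0.3) -> float:
    """Higher is better: favor high true-positive rate, penalize tuning effort."""
    effort = analytic["tuning_hours"] / 40.0  # normalize against a work week
    return tp_weight * analytic["true_positive_rate"] - effort_weight * min(effort, 1.0)

# Made-up example analytics with the metadata we wish were shared.
analytics = [
    {"name": "Suspicious PowerShell download cradle", "true_positive_rate": 0.6, "tuning_hours": 4},
    {"name": "Generic cloud API anomaly", "true_positive_rate": 0.1, "tuning_hours": 30},
    {"name": "New local admin account created", "true_positive_rate": 0.8, "tuning_hours": 8},
]
ranked = sorted(analytics, key=score, reverse=True)
```

The point isn't the exact weights; it's that with even rough shared numbers for true-positive rate and tuning effort, prioritizing 200-plus generic analytics becomes a sortable list instead of guesswork.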
David: Right, it goes back to what we were talking about in the last one with the information sharing,
Matthew: Hmm.
David: you know, it would be nice if, you know, the detections and the true positives and false positives were automatically uploaded in a generic way to a centralized, you know, maybe MITRE ATT&CK database or whatever, and then you could get an aggregate
Understanding about the quality of that item,
Matthew: Yeah. Because you don't need all the rules. You don't have to cover all of the tactics and techniques in MITRE. Adversaries only use some of them. I did this activity myself last year where I [00:44:00] downloaded like 10 threat reports and counted up which attack techniques were mentioned in each of them. I think PowerShell and command line were mentioned in every single one of those reports, and then there were other things that were only mentioned in, you know, two or three of the reports, and others that were mentioned in one, and then like 90 percent of ATT&CK was never mentioned at all.
Like, just because somebody used it at one point in time doesn't mean that it's, you know, the easiest thing to use or the most likely to be used.
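The counting exercise Matthew describes is easy to reproduce. This is a minimal sketch with made-up stand-ins for the ten real reports he downloaded; the technique lists here are illustrative, not his actual data.

```python
from collections import Counter

# Tally which ATT&CK techniques appear across a set of threat reports.
# Each inner list stands in for the techniques extracted from one report.
reports = [
    ["T1059.001 PowerShell", "T1566 Phishing", "T1078 Valid Accounts"],
    ["T1059.001 PowerShell", "T1021 Remote Services"],
    ["T1059.001 PowerShell", "T1566 Phishing"],
]

# set() deduplicates within a report, so we count reports, not raw mentions.
counts = Counter(technique for report in reports for technique in set(report))

# Techniques mentioned in every report are the ones to prioritize coverage for.
in_every_report = [t for t, n in counts.items() if n == len(reports)]
```

With real report data, the tail of the distribution is the point: the bulk of ATT&CK never shows up, so detection engineering effort belongs on the handful of techniques that appear everywhere.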
David: right. And, you know, and this is the kind of analysis that I think, you know, threat Intel teams could be doing.
Matthew: Yeah. Which ones, which of these do we need to care about? You know, oh, look, this one is coming up, it's being used, we're seeing it being mentioned more and more. But we have minimal coverage on it, like, we need to beef up this coverage. Yeah, that's useful threat intel. Not these reports. Not these, not these strategic
level reports.
David: About whatever Russia is doing.
Matthew: Yeah, do not care what Russia is doing. Tell me how attackers are getting in, tell me what the most popular methods are and [00:45:00] then what I need to be
David: And where our gaps are.
Matthew: months. Alright, I don't know that I want to cover the rest of this. Is there anything later on here that you want to
David: No. So that's it. So thanks for joining us, and follow us at SerengetiSec on Twitter and subscribe on your favorite podcast app.
Matthew: leave us a review? Cause that's why nobody listens to us, I think. Ha ha
David: Hey, both our fans are rabid at least
Matthew: ha.