Episode Transcript
Caution, this transcript is AI generated and contains errors. I see the first ones on the first line of the transcript, so don't trust it too far.
David: [00:00:00] Welcome to the Security Serengeti. We're your hosts, David Schroeder and Matthew Kinnear. Stop what you're doing and subscribe to our podcast and leave us an awesome five star review and follow us at SerengetiSec on Twitter.
Matthew: We're here to talk about cybersecurity and technology news headlines and hopefully provide some insight, analysis, and practical applications that you can take into the office to help you protect your organization.
David: And as usual, the views and opinions expressed in this podcast are ours and ours alone, and do not reflect the views or opinions of our employers.
Matthew: I heard that you'll be able to opt out of targeting advertising if this new privacy bill passes through the House and the Senate.
David: Oh, so the ads that I get will be even less relevant?
Matthew: Ah, I mean, I always get the ads after I buy the product.
David: Well, of course, that's because they know you're willing to spend money on it.
Matthew: So, it's funny that we were just talking about this. I opened up my phone and Amazon was pushing a pair of pants that I looked at last week and bought in person. But Amazon is like, Hey, you looked [00:01:00] at this. Are you sure you want to buy it now?
I already bought it.
David: Too bad you couldn't reply to the ad: too late.
Matthew: I don't know. Do I want them to know even more about it? I mean, they already know everything as it is. Alright, our single article today, this is something that David recommended we look at: Committee Chairs Rodgers, Cantwell Unveil Historic Draft Comprehensive Data Privacy Legislation. This is on the house.gov website. Two representatives have put forth some new legislation to set, quote, clear national data privacy rights and protections for Americans. And it's bipartisan; it is a Democrat and a Republican.
David: Nice.
Matthew: This is nice. I don't know, though; if they're working together, they might actually start passing more laws.
I mean, isn't that one of the benefits of partisanship, that nothing gets done?
David: Yep, exactly. Boy, that's what they've found: the government spends less and gets less done when Congress is one party and the [00:02:00] president is another.
Matthew: Because the reality is that no matter who's in charge, both parties spend more and pass more laws.
David: like you said, both parties are two different wings of the same bird of prey,
Matthew: And we're the prey?
David: Obviously
Matthew: Yeah, that's fair.
David: It's kind of like... I've never been able to understand why, you know, big government being in bed with big business makes it so that I get ****ed.
Matthew: All right. There's one quote in here that was interesting. Quote, Americans deserve the right to control their data. Yeah, we do, but I don't think this is going to do it. So I have mixed feelings on this before we start reviewing. On one hand, I think we do need some comprehensive privacy legislation, but on the other hand, I think it's about a decade late for this type of thing.
Companies already have all of our information. So I guess it might be good for future people. And I have almost no faith whatsoever that the government will do a good job at this.
David: Really?
Matthew: Shocking. Shocking.
David: Yeah, [00:03:00] I mean, I don't know why you're so bleak on it.
Matthew: Yeah, I don't know either. So the whole thing is 53 pages. We're going to go through it in detail, line by line. We're going to be here for about eight hours.
David: So strap in.
Matthew: Now, they have an eight-page summary of the 53 pages that I went through. I only really reviewed the security section in detail; it was very small. So here are the summary points from the article and from the eight-page summary. The first one is people get control of their own data.
Now, this is funny, because they had this as a summary point in the article, but I didn't actually find anything in the eight-page summary that really backs it up. I mean, I get that it's kind of talking about the whole thing, but I don't think that we'll get as much control over the data as we need.
David: Well, I mean, they've got exceptions large enough to drive a truck through. So
that's one of the reasons why it's not exactly accurate to make that statement.
Matthew: Um, the second point they had was that it overrules inconsistent state laws and creates a single national law. My first thought related to that [00:04:00] was I imagine that pretty much every company would be happy about this, except for the companies that sell tools to help you meet each state's laws.
Have you ever talked with any of those companies?
David: I have not.
Matthew: Okay. I talked with one of them last year at RSA. And literally, I mean, I guess that's not their entire thing, but a big chunk of their business is keeping track of all the various laws in various states and various countries, so that your compliance group can make sure you're following all of them.
David: so it's a company staffed by lawyers?
Matthew: probably.
David: Yeah, I mean, I've talked to different, you know, companies that say, well, we'll help you with your regulatory burden by, you know, saying whether you're in compliance or not based on your Splunk logs and stuff like that. But not one that specifically says, oh, well, we'll keep you in check for Connecticut law and Illinois law and California law, et cetera. And this may be accurate, but there are still exceptions to the whole preemption of state laws in Section 20, including these two gems: [00:05:00] the provisions of laws that address notification requirements in the event of a data breach, so that's not superseded; and then provisions of laws that address banking records, financial records, tax records, social security numbers, credit cards, identity theft, credit reporting and investigations, credit repair, credit clinics, or check cashing services.
Matthew: Interesting, so those all take precedence. What's left?
David: Everything else. No, there's a whole, there's a whole page of exceptions. I just pulled out those two because I thought they were the most relevant to show that that's not really preemption about major things that companies have to respond to on a state by state basis now.
Matthew: Hmm.
David: But this whole thing also is a blow against federalism, where the states get to decide instead of the federal government. And that flies in the face of the 10th Amendment, which says, quote, the powers not delegated to the United States by the [00:06:00] Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.
So if it's not in the Constitution, then the federal government shouldn't be writing laws about it, basically. And yet that's what they're doing here: prohibiting the states from charting their own course.
Matthew: I am shocked. I can't tell you why they would do this.
David: Well, we'll get into that in the summary at the end, when we get down there.
Matthew: I've actually always wondered about the 10th Amendment. It seems to be generally ignored at this point.
David: Oh, it's completely ignored.
Matthew: Ha! Alright. Good times. Alright, next item. It reduces the amount of data that companies are allowed to collect, keep, and use. They say that, quote, covered entities ... shall not collect ... beyond what is necessary.
I think the problem with this is twofold. Number one, companies are going to just define what is necessary as whatever they want. And then [00:07:00] they're going to be like, oh, well, it's necessary that we have your birth name, and how many children you have, and how many rooms are in your house.
David: Well, what else are you going to feed into the algorithm?
Matthew: Yeah. Speaking of which, have you ever gotten a request from the, what is it called?
Not the annual survey; the decadal survey the government does?
David: What the census?
Matthew: The census. There we go.
David: No, I don't fill that out.
Matthew: well, I got an invitation to fill that out.
David: Oh, well, congratulations.
Matthew: And I was like, you know, I want to be a good citizen, I'll fill out some information. I started going through it, and holy crap, did they ask for so much information.
I ended up quitting because of it. They wanted to know, like, how many square feet my house was, how many bedrooms are in the house, what everybody here does for work. Just wild, wild amounts of data.
David: Yeah, I filled that out in 2000, I think was the last one I did. And it made me so [00:08:00] angry. I've never done it again.
Matthew: I ended up looking it up, because it's full of all kinds of warnings, like, you know, if you don't fill this out, you could go to jail. And I looked it up, and technically it is in the law that if you don't fill it out, they can put you in jail.
But they said that they have not done that in like 30 years or something.
David: No, I mean, I love watching the census people come to the door and knock and ring and mill around outside my house. That's the best. I get a kick out of that.
Matthew: All right, so the second thing that's interesting here is they said, and they said this in several places, the FTC shall issue guidance on what's necessary. And I know this is a big thing, and I know Reason Magazine and libertarians have been talking about the administrative state and how Congress has basically given up legislating.
David: Well, that's one of the things that's before the Supreme Court right now, the Chevron deference case [00:09:00], which according to my sources, which of course are biased, would be a serious blow to the administrative state if it gets struck down.
And possibly hamper the ATF to a large degree, which would be fantastic. I am skeptical that even if they do decide to throw out all that rulemaking stuff for the executive branch organizations, it's really going to work out the way that we hope.
Matthew: Hmm.
David: but as usual, this is Woodrow Wilson's fault.
Matthew: As usual.
David: He really ushered in the administrative state with his minion Edward Mandell House, who wrote Philip Dru: Administrator, a Story of Tomorrow, 1920 to 1935. And to quote... what's that?
Matthew: Oh, no, never mind. I was not looking at the notes. I thought you just had that memorized and were just, like, extemporaneously... Like, dang!
David: And here [00:10:00] I thought you held me in such high regard.
Matthew: I'm going to cut out all the part about you having it in the notes and I'm just going to leave in the parts about you having it all memorized.
David: Okay. But to quote Walter Lippmann about the book, it's the story of a boy of 14 who dreamed of what he would do if he had supreme power and nobody objected. And another quote from another political commentator says that eight major reforms from the book had been passed into law only four years after House became an advisor to the president.
Yeah, he wrote it in...
Matthew: Interesting. So he wrote the book before he got power?
David: 1912 before Wilson was elected.
Matthew: That's interesting.
David: Or is that the year Wilson got elected? I think that might be the year Wilson was elected.
Matthew: He was part of the campaign team, so this was kind of his, you know, "when I'm elected, I will do these things."
David: No, I don't think [00:11:00] so. He was not an official member of any part of Wilson's cabinet. He was just an advisor
Matthew: Influential outsider.
David: At the time. I think he got some part in the government later, but not as, like, a presidential advisor or something like that. He had some position in Paris or something like that.
And I'd have to look it back up, but at the time, in 1912, he had no position in the government. But Wilson and all the progressives from that time period are really what ushered in this giant bureaucracy. But of course, you know what is not a covered entity: federal, state, tribal, territorial, or local government entities, such as a body, authority, board, bureau, commission, district, agency, or political subdivision of the federal government or a state, tribal, territorial, or local government. And you can expect that to prattle on for another half a page with that list.
Or, of [00:12:00] course, any entity that acts on behalf of the same.
Matthew: Darn near all the major corporations are government contractors. So who does that leave?
David: Oh, except tick tock, tick tock.
Matthew: Yeah, this is going to affect, like, foreign companies, and that's it.
David: Yeah, maybe. There's a lot of interpretation to this. And one of the other things that I thought was interesting is that anti-fraud nonprofits aren't exempt from this. And when I did a quick Google search for anti-fraud nonprofits, what I came up with was the Anti-Fraud Coalition, and these are apparently a group of whistleblowers who expose fraud against the government and financial markets. So, other enemies of the state. So the bottom line of that whole thing is that, you know, the government that spies on you for your own good does not have to abide by this, but companies that want to use information about you for selfish profit motives, by selling you [00:13:00] stuff that you agree to buy,
do have to abide by it. Yeah. So this makes total sense in a capitalist society. I'm sure in a socialist utopia, this would be the complete opposite.
Matthew: No doubt. No doubt. All right, the next item
mentioned was it allows users to opt out of data processing upon a privacy change. This one I actually see as kind of nice, although given that giant list of exemptions you were just talking about, I guess it's not so nice; it's going to let me opt out on, like, three companies.
But as it stands now, you, you know, click through and say yes, you agree to everything, and generally speaking, that's it. Or they'll tell you your privacy policy is changing. Okay.
David: Right. Yeah. FYI.
Matthew: Yeah, you don't really, you don't really have a choice in the matter.
David: Right. Because this prevents them from really being able to do a bait and switch. But that's not going to stop them from doing it; they're still going to do it. But it's going to [00:14:00] be like, okay, you're grandfathered in, and the new privacy stuff is going to be for everybody who joins after, kind of thing, is the way it's going to go.
So they're going to have privacy statements applied to, you know, dozens or hundreds of groups, depending on how often they update their privacy statements.
Matthew: Can't have nice things. Next: allow users to prevent companies from selling and using their data. There were no details that I saw in the eight-page summary on how specifically they're going to do that. I mean, I guess there are some things later about how you can opt out, so maybe it's related to that.
David: Yeah, there's of course additional detail in the 53 pages. I didn't specifically look that one up, though.
Matthew: Yeah, me neither. The next item forces express consent before companies can transfer sensitive data to a third party. Of course, the problem is this is just going to be part of another click-through.
David: Yeah. That's going to be in the original privacy statement, I would assume because those [00:15:00] business relationships are already going to exist.
Matthew: I could see that.
100 percent see that. So this forces corpos to allow people to access, correct, delete, and export their data, and see who the data was transferred to. I can definitely see this being a new business, just like the privacy companies that exist right now that promise to, you know, submit requests to all the data brokers to remove you from their lists.
But as I understand it, most of those are not terribly effective, because you remove it and then they go out and buy a new swath of data the next month, and it just puts you right back in.
David: Hmm. You're going to miss it somewhere.
Matthew: Yeah.
David: It kind of reminds me of... what was that book about AI we reviewed?
Matthew: Was it Manna?
David: No, Avogadro Corp: The Singularity Is Closer Than It Appears.
Matthew: Oh, okay.
David: Right. Where they wipe out everything except for one tiny data center that they missed.
Matthew: And it...
David: And it [00:16:00] reinfects the entire planet. You know, it's going to be the same kind of thing.
Matthew: Yeah. That's depressing as hell.
David: I wonder how long it's going to take, though, before a software manufacturer makes a really good data management software package that does all these things. You know, you'd think there'd be one by now with the GDPR, since that went into effect in 2018. I haven't seen any company do this, any software manufacturer do this.
It seems like everybody's just having to roll their own.
Matthew: You mean like something for an end user that you could subscribe to, like a dashboard in your browser that you could log in to and be like, oh, my data exists in these five companies, and submit a request to them automatically?
David: No, I'm thinking about individual companies accepting that request, right? So maybe it has an API that that kind of thing could plug into, but it manages all the data that company has in their data lake. So when an end user requests this stuff, it automatically goes in and does the cleanup and the removal and everything,
and organizes it [00:17:00] and keeps track of it. Because one of the things they'll have to keep track of is the fact that you did opt out. So they're going to have to keep something on you.
Matthew: Yeah,
David: Or you'll get that reinfection. So I'm just wondering why everybody seems to be rolling their own, instead of some software company saying, hey, you latch this onto your data lake and we'll manage all the PII or whatever data elements are in there, and we'll expose this API to other partners or to, you know, a browser extension or whatever.
So people can go in and do this stuff automatically.
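The kind of product David is describing could be sketched roughly like this. Everything here is hypothetical (the class, method, and field names are invented for illustration, not any real product); the one real wrinkle it captures is his point about opt-outs: to honor a deletion request permanently, the company has to keep a ledger recording that you opted out.

```python
from datetime import datetime, timezone

class PrivacyManager:
    """Hypothetical data-lake privacy layer: accepts end-user deletion
    requests and screens newly ingested data against past opt-outs."""

    def __init__(self, data_lake: dict):
        self.data_lake = data_lake   # user_id -> PII record (stand-in for real storage)
        self.opt_out_ledger = {}     # user_id -> when they opted out

    def handle_deletion_request(self, user_id: str) -> bool:
        """Delete the user's PII, but keep a tombstone recording the opt-out."""
        removed = self.data_lake.pop(user_id, None) is not None
        self.opt_out_ledger[user_id] = datetime.now(timezone.utc)
        return removed

    def ingest(self, user_id: str, record: dict) -> bool:
        """New data (say, a freshly bought broker list) is checked against
        the opt-out ledger, so the lake can't quietly be 're-infected'."""
        if user_id in self.opt_out_ledger:
            return False             # user opted out; drop the record
        self.data_lake[user_id] = record
        return True
```

In this sketch, a browser extension or partner service would sit in front of `handle_deletion_request` via the API David mentions, and the ledger is exactly the "they're going to have to keep something on you" that he points out.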
Matthew: That makes sense. Yet another wonderful business idea from your favorite Security Serengeti podcast.
David: Yeah. We'll only ask for 10 percent of revenue,
Matthew: A steal at any cost.
David: But at least this may have the benefit of forcing companies to finally do some data consolidation, instead of having all this stuff in dozens of databases and file shares. But of course, [00:18:00] this also then raises the stakes for that data lake, or wherever they do the consolidation.
Matthew: Yeah. Best target. I definitely, if this ever exists, I'm planning on taking advantage of it. Like I said, I've been looking at those kinds of ID companies and thinking about it, but I've always been like, I don't know...
David: You mean, like, DeleteMe?
Matthew: Yeah, stuff like that.
Matthew: Unless they want to sponsor us.
David: So I was thinking it was ironic, you know: you have DeleteMe, and you say, okay, tell DeleteMe all of the data elements that you want them to find and delete on your behalf, so they can have all your data.
Matthew: Yeah.
David: DeleteMe is just collecting all this stuff on individuals also. So I'm wondering if there isn't some nefarious backend plan for them.
They're like, hey, we have all these names of people who've asked us to delete all their data, and we have all their sensitive data, because we had to know it in order to find it and delete it.
Matthew: And then once everybody's data is deleted, they're the only ones who have the information.
David: Yep.
Matthew: Brilliant.
David: Yep. Genius.
Matthew: [00:19:00] Hmm. Very nice.
All right. Next item: allows users to opt out of targeted advertising, as we alluded to at the beginning. So this has a provision to create a data broker registry for all data brokers with more than 5,000 individuals' data, and to provide a do-not-collect mechanism so that users can log in and tell the data broker not to collect that information on them.
So interesting, interestingly to me, I was thinking about. Another, another we, we both read this, I don't think we did a review on it, but Accelerando by Charles Strauss, where, in that book, it's set in kind of the, between now and the, I can't remember the name of it, the Singularity, there we go. And as part of it, there's, they have semi autonomous AI run corporations, and it's used as an obscuring mechanism to, you know, hide people's [00:20:00] taxes and stuff where you can spin up autonomous corporations that, you know, you can siphon off your assets to, and kind of like how how we do the anonymous corporations now, but a lot easier.
And it wouldn't surprise me if large data brokers like split themselves off into Dozens of these autonomous corporations that all have 4, 999 people.
David: Hmm. Yeah, that's kind of like, there are businesses now where, let's say you have a million dollars. You can't put that million dollars in one bank and expect it to be covered by FDIC insurance. So there are companies where you give them the million dollars and they distribute it to enough banks that your entire million dollars is insured, by having two hundred thousand dollars or whatever in five banks or whatever.
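The arithmetic behind the service David describes is simple enough to sketch. The $250,000 figure is the standard FDIC limit per depositor, per insured bank; the function name is invented for illustration.

```python
import math

FDIC_LIMIT = 250_000  # standard FDIC coverage per depositor, per insured bank

def banks_needed(total: int, limit: int = FDIC_LIMIT) -> int:
    """Minimum number of separate banks so every dollar stays under the limit."""
    return math.ceil(total / limit)

# A million dollars fits in 4 banks at $250k each; David's 5 x $200k split
# works too, it just uses one more bank than strictly necessary.
```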
Matthew: Yeah, we saw something around that when Silicon Valley Bank went under last year, and a bunch of people had way more than $250,000 in there, and the government stepped in to insure them,
David: Yep
Matthew: even if they technically were not. I can see [00:21:00] that working if you've got a couple million dollars, but are there enough banks to do that for, you know, Zuck's money?
David: Maybe not for him, but there will be for a certain number of millionaires.
Matthew: Yeah, I'm just imagining like literally every, every single company in the world or every single bank in the world gets to say that Zuck is their client because he has so much money spread out across every single one.
David: Yeah. Well, I mean, with the bank consolidation, eventually, you know, there can be only one, like Highlander.
Matthew: Just one bank left.
David: Yeah. Unfortunately it'll probably be the Fed.
Matthew: So the next item is it allows users to sue companies that don't follow the law and recover money for damages. On one hand, I kind of like that idea, because we've talked before about how a lot of times when you report a crime to the law, they just ignore it.
We've talked about this before in terms of, you know, if there's less than $10,000 in a cybercrime, they don't even care, and if it's less than a million dollars, the FBI [00:22:00] doesn't get involved, or whatever. So it is nice that you can sue them yourself if you want. But the problem I see here is, how are you going to prove that damages are related to any single company that didn't follow the law?
Like, your data is ingested by one company, then it's passed through multiple data brokers. How would you even find out...
David: Yeah. Well, as an individual, though, the return on your investment to sue is too low, because if you're expected to get $5,000 back or whatever from your lawsuit, you can't even pay the lawyer with that.
Matthew: Well, I guess the question is how much you're allowed to sue for in damages. Like, if you're allowed to sue for a significant chunk of damages, then that would change the math.
David: Well, you'd have to...
Matthew: Hmm?
David: I'm not sure exactly how it works, but I would assume that you'd have to prove a level of damage in order to expect that kind of recompense, right?
Matthew: Yeah. Which is already going to be extremely difficult to prove. So, yeah.
David: Yeah. I think [00:23:00] that may be, you know, one of those throwaway entries in there.
Matthew: They added it, but it's impossible to actually accomplish. Figures.
David: But it would be interesting to see what they call an algorithm, since this is an important part of the entire law. And I'm not going to read the whole thing, because it's a pretty lengthy definition, but there's something I wanted to highlight in here, where it says that a covered algorithm includes computational processes, including one derived from statistics.
Matthew: Hmm.
David: You know, so not just AI, not just machine learning. If you make any kind of computational decision based on a statistic, or, you know, multiple statistics, then that falls under a covered algorithm. And businesses, insurance companies, banks, they've been using statistics for decades [00:24:00] for deciding how they issue a policy or make a loan or whatever.
Matthew: Yeah.
David: So this is going to start interfering with things that we've taken for granted that companies have done for years now, and they're going to have to justify all the math that they're doing. So this is going to impact more than just AIs. And of course, when people start leveling accusations against the math, the algorithm is going to be determined guilty until proven innocent. So I just expect, you know, insurance companies and banks to be making poor decisions about how they issue stuff based on this.
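To make concrete how broad "derived from statistics" is, even a decades-old style underwriting rule like this sketch would plausibly fall under the definition, with no AI or machine learning anywhere in sight (the function, thresholds, and numbers here are invented purely for illustration):

```python
def approve_loan(credit_score: int, mean: float = 700.0, stdev: float = 50.0) -> bool:
    """Approve if the applicant is no more than one standard deviation below
    the historical mean score of repaid loans (illustrative numbers only)."""
    z = (credit_score - mean) / stdev   # a plain statistic, computed by a machine
    return z >= -1.0
```

A threshold on a z-score is "a computational process derived from statistics," so under the quoted definition it would seem to be a covered algorithm just as much as a neural network would be.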
Matthew: So, the reason that David brought that up now is there are a couple items in here. The first one is that one of the goals is to prevent companies from using personal info to discriminate. And how often does this really happen these days? Discrimination is already illegal. This is, like, making it double illegal.
David: Yeah. I [00:25:00] don't think you got a get-out-of-jail-free card before because you said, oh, well, the computer did it.
Matthew: Computer did it. Home free. Ah, they got us.
David: Dang it.
Matthew: Ah, if only we had thought of that. So there's also some verbiage in here where large enough companies must evaluate their algorithms for discrimination and then make that evaluation public. It's interesting, but I don't think it would really change anything.
Companies will choose the test that makes their stuff look totally benign, or, like so many other companies have, they'll just hide the results, or they'll bury it somewhere on their website if it's not beneficial to them.
David: Yeah, I'm not exactly sure how that's going to work out, because I'm rather surprised that they didn't say, well, you have to get an evaluation from one of the big accounting firms. And then those accounting firms could justify spinning up an entirely new division to do AI evaluations.
Matthew: That's how it's going to end up though. [00:26:00] You're right, they didn't say it, but that's, that's 100 percent what it's going to be.
It's depressing.
David: Yeah. But what they're really saying with that whole thing is that you can't use inconvenient facts against someone. But the exceptions for this are themselves discriminatory.
Matthew: They're the...
David: One of the exceptions is you can discriminate in order to diversify an applicant, participant, or consumer pool. So you can discriminate to replace white people with minorities; that's okay. And another exception is advertising, marketing, or soliciting economic opportunities or benefits to underrepresented populations or members of protected classes. So as long as your purpose is to send messages about economic opportunities or benefits, you can exclude white people from those. And this is...
Matthew: and all [00:27:00] of Target's advertising.
David: Yeah, it's bull**** is what it is. I mean, at least I agree with the one caveat they have in here, that any private club or group not open to the public is an exception. But this **** about how you can discriminate against anybody who's not a minority is ridiculous.
Matthew: Interesting. I wonder if that changes. I'm thinking, weren't they saying that the U. S. is going to become majority minority sometime in the near future?
David: Well, that's the thing: they don't say minority specifically in here. They say underrepresented or protected classes. So even when they are the majority, they will still be protected classes, or they'll be underrepresented in some way, probably.
Matthew: Yeah.
David: Because right now the United States is like 60 percent white, I think.
Matthew: I just remember seeing some states are
minority majority. [00:28:00] So, like, in Georgia, the white population's below 50 percent.
David: Yeah, Texas too. Texas is 51 percent Hispanic. But the funny thing is, if you go to the CIA World Factbook, Hispanic is not an ethnicity in there.
Matthew: It does. Yeah.
David: They say, well, Hispanics could be any race, so we're not going to add them as a minority. But if they were, they'd be 16 percent, or something like that, is what the CIA World Factbook says.
Matthew: Interesting. Yeah. They're predicting sometime around 2041 to 2046.
Oh, all right. Next item: it allows end users to opt out of algorithms for housing, employment, healthcare, credit, education, insurance, and hotels, but only for consequential decisions. There's not much more info on this one, but I know you have some thoughts on it.
David: Well, what's consequential?
Matthew: Yeah, no definition there. We all know what consequential means, right?
David: Well, actually, they may have the [00:29:00] definition in the larger document. I didn't look that up.
Matthew: Possible. Yeah.
David: But as I was saying before, when you start interfering with the algorithms that are making all these decisions today, this is not going to make the decisions better. It's only going to make the decisions even worse, as you add in caveats and exceptions, et cetera, et cetera, especially if they go into the whole exception thing about white people. So for 60 percent of the population, you know, the decisioning algorithms based around them are going to have worse outcomes.
Matthew: So that's actually an interesting thing that occurred to me about this: who is going to have to make those decisions? Does this mean they're going to have to hire a bunch of extra people to handle it? Like, let's say this gets passed, and let's say a large number of people decide that they want to be exempted from it, because they don't understand what it's going to mean.
They're just like, get me out. Does that mean the companies are going to have to hire a whole bunch of people to do this? Because if they [00:30:00] opt out of your algorithm, you can't use an AI, because an AI would count as an algorithm.
David: Yeah, it's going to be a mess.
Matthew: Yeah, sounds like it. This is really secretly a jobs program. Well, every government program is secretly a jobs program.
David: Well, yeah, and like I said, it's just going to make everything worse. We're going to have worse outcomes across the board. When you start messing around with the current way that these decisions are made with algorithms and machine learning and AI already today,
Matthew: Yeah. Can you imagine if you exempted yourself, you decided to take yourself out of the running for, uh, what am I thinking of? Insurance. They would drop you. They would say, all right, well, we're not insuring you, because we can't tell you how much it's going to cost.
David: Yeah, because we can't use statistics to determine what the risk is for the way that you live, or the way you drive, or the kind of house you have, or all that stuff that we use in order to make our [00:31:00] decisions. We're no longer able to consider it.
Matthew: Yep. So thank you very much, but goodbye.
David: You know, I mean, it's either going to be that, but if too many people opt out, they can't just drop everybody, because then they have no customers. So that means those companies are probably going to have to take on a greater risk than they did before, which means they may have to charge more.
Or we're going to see a lot more of these companies go under as they can't properly identify or plan for the risks they've accepted, and they're overwhelmed by the risk because they've made bad decisions around it.
Matthew: Well, what's funny is they're already struggling and having trouble because they've been making bad decisions, even with these algorithms. It's just going to get worse.
David: Yeah. Like I said, this, this whole thing is not going to end well.
Matthew: It's going to bring peace and joy and everything's going to be
David: Sure it is. Sure.
Matthew: better than it was. Yeah. I don't know what else,
David: Can't wait. I can't wait to see it.
Matthew: allow end users [00:32:00] to opt out of algorithms. Yeah. Like housing, I mean, I guess a lot of landlords are mom-and-pops, that probably wouldn't change. Employment, I mean, again, you probably.
If you can't do the initial screening through their HR app, you'd probably just get sloughed off into the do-not-respond bin. Credit, that basically just means you're never going to get any more credit. I wonder if current credit card companies would drop you, or if they would just freeze your credit limit where it's been at. Hmm.
I still think insurance is the big one. Insurance and credit are the two where algorithms are most dominant.
David: Yeah, I agree.
Matthew: Yeah. They did mention strong data security standards. That was what the article said, but when I reviewed it, I would disagree. It says you must have security practices appropriate for your size, and they only mention a couple of specific requirements, specifically around vulnerability assessment and management and disposing of covered data.
David: But one interesting thing to note in there, though, [00:33:00] is what covered data includes. Which is usernames, email addresses, or telephone numbers in combination with a password. So now any credential compromise is going to be a privacy breach.
Matthew: That's phenomenal.
Hmm. I wonder if that would be an improvement or not. The reason I'm thinking about this is there's a cranky article talking about how, even as painful as the credit card requirements are around data breaches and reporting, they were somewhat effective at getting companies to actually care, just because it was so much of a pain in the butt to do all the reporting and pay the money out for it.
I wonder if bringing those sorts of things under a privacy regime would increase the pain to the company enough that they might care more and spend more money on securing it?
David: Well, one can hope so. It also may drive some companies out of business. You remember when [00:34:00] Yahoo had that billion-record credential loss?
Matthew: Yeah.
David: So imagine that the FTC said, okay, well, you gotta cough up, you know, $2 a record. Or whatever. That would have folded the company.
Matthew: Hmm. Yeah.
David: So,
Matthew: Did they deserve to die?
David: Oh, well, I don't give a **** about Yahoo. I'm not a fan.
Matthew: I know, I
David: But I'm just saying that, you know, this is, well, I'll get into this later, but it's all in the implementation.
Matthew: Yeah, this section was less than a page. So "strong data security standards," I think, is an exaggeration. And I'm talking about the bill itself: I went to the 53-page document, and it was like three quarters of a page in there. So supposedly this will make executives responsible.
David: Well, I mean, it makes sense. Cause that's the whole point of being a corporate officer, right?
Matthew: I know, right? You get paid a lot of money, but you're responsible. That's the whole point of the thing. But when I looked at it, [00:35:00] it actually didn't have any information on how it would make them responsible. It said that you had to create like a privacy officer, but that just seems like a chief scapegoat officer.
David: I wonder if, I wonder if this is going to be a new ground for insurance. You know how doctors have, um, malpractice insurance. I wonder if you're going to end up with corporate officers having like data breach insurance where they're personally insured against a lawsuit from a data privacy breach.
Matthew: Yeah, so I'm looking over the actual 53-page part real quick, and basically it's just stating that you have to have a privacy officer and a data security officer, and they have to certify that you're doing these things. But it doesn't look like there are any punishments for those folks.
That's disappointing. You have to do privacy impact assessments. But again, where's the "you will go to jail if it's found that you blew off the law"? [00:36:00] Yeah, I've always hated that. I mean, we've talked about this before, and if I remember correctly, you had a thing where if a company breaks a law that's serious enough, the company's put to death. Yeah, that makes a lot of sense. Should have added it here.
David: I wish.
Matthew: Yeah.
All right. Next item: forces companies to tell you when your data has been sold or sent to foreign adversaries. Do I get like a piece of mail saying that they sold my info to North Korea?
David: Yeah, it'll come in the mail,
Matthew: Sure it will.
David: but you can see that obviously that's targeting China and TikTok. Because, you know, they say adversaries, not just foreign countries. So if they sell it to the UK, Germany, or Israel, it's all good.
Matthew: Just like our, just like our weapons.
David: Well, they give those away, though. You get an AK-47, you get an AK-47.
Matthew: I wish. Send one of those my way. All right. Small businesses that are not selling personal data are exempt. Small in this case means $40 [00:37:00] million or less in revenue and data on 200,000 or fewer individuals. Weird that the limit there is so different. When they talked about the brokers earlier, the limit on brokers was 5,000 individuals.
David: hmm.
Matthew: I took a quick look around. I didn't find any definitive numbers for companies with $40 million or less in revenue. Most places do the cutoffs at $25 million and $50 million, but Crunchbase says that there are 10,000 businesses with more than $50 million in revenue, and Ask Wonder says that there are 5.6 million businesses with revenue under $25 million.
So it seems like this would exempt like 99 percent of businesses.
David: Well, and that's probably fine as far as they're concerned, because those businesses don't have any political power. So the government doesn't need to mess with them.
Matthew: I wonder. Yeah, I thought the number $40 million was very weird. My initial thought was maybe there was a donor with a... but you're probably right, it probably is that they just don't matter. They don't make enough [00:38:00] money.
David: Yeah, they don't make enough money to influence the government, so the government doesn't care. But none of this is random, you know. There's a reason it's $40 million. I don't know what that is, but there's a reason it's $40 million. No one guesses 40 million; someone would say 50 million or 25 million, they would go with one of those numbers. They wouldn't say 40.
So yeah, there's a reason in there. Don't know what it is, though.
Matthew: All right. And the final item: companies cannot use dark patterns to distract users from exercising their rights under this
David: I love that. I love that term, dark patterns.
Matthew: Yeah. I mean, and
David: it's like a terrifying quilt.
Matthew: for people that are listening that don't know what a dark pattern is: this is things like the order of the buttons, to make you select the button they want you to select most often, or the
David: yeah. I got the definition down there.
Matthew: Oh, awesome.
David: Two, one:
Matthew: "A user interface designed or manipulated with the substantial effect of [00:39:00] subverting or impairing user autonomy, decision making, or choice."
David: Yeah, who's gonna decide that you know, I think facebook. com is a dark pattern,
Matthew: I think probably 30 percent of capitalism is dark patterns. I mean, obviously people have been manipulating other people through marketing and other measures for hundreds if not thousands of years. But once we got the marketers involved and all that other stuff, every website you go to is full of these things, where they're all trying to funnel you into buying ****.
David: Right? I mean, they're all trying to influence you. Man, what was that guy, Freud's nephew, that basically founded marketing?
Matthew: I have a book about him on
David: Dang, I forget his name now. But he invented dark patterns. The one particular thing that I remember is Virginia Slims: their marketing, or their logos or whatever, their color is green. So they did an entire [00:40:00] marketing campaign around the color green and how it was the color of the season or the spring or whatever, in order to increase the Virginia Slims market share when they were selling cigarettes in the 30s or 20s or whatever. I forget the time frame.
Matthew: Yeah, we spend an amazing amount of money. And this is something that really bugs me about capitalism: since the money right now is in tech companies and in advertising and trying to get you to buy stuff, we have so much money and so many smart people that are just absolutely focused on how to sell people more stuff, instead of, oh, I don't know, cancer research and how to get off this planet.
David: Well, personally, I'm less concerned about that, because I think there are fewer smart people trying to sell you more Virginia Slims than there are smart people working for the Pentagon trying to figure out how better to blow people up. Because if you look at science itself, most scientists work for the [00:41:00] government,
Matthew: Hmm
David: and most of the government scientists work for the DOD. So it's a shame that we spend much more of our time learning how to kill people than we do trying to keep people alive and help them live better lives.
Matthew: That's fair. So I went looking to see if I could find any responses to this. I did find an article in Ad Age with the fear-mongering subheadline "American Privacy Rights Act strikes fear in internet ad industry." And I'm warming up to it now, if the ad industry hates it, because I hate the ad industry, and the enemy of my enemy is obviously my friend. Unfortunately, you have to subscribe to read the article.
And I was on the Internet Archive, and I was able to get like 10 seconds before the, before the
David: Oh, yeah. Yeah.
Matthew: kept popping up. And I did see there were a couple of quotes from folks in the ad industry that said it would kill the ad industry as we know it, which, given all the loopholes you mentioned, I don't know if that's true.
David: You know, sometimes when that happens, you have enough time to do a copy and paste.
Matthew: I should [00:42:00] have done that. Yeah. Control-A, Control-C, and then.
David: Yep. But there's enough, you know, subjective language in here to target companies that the government does not think are playing ball to a sufficient level, like Twitter. Or, I mean, X, if that's what you call it,
Matthew: Shitter.
David: you know, this is going to increase the cost for companies that have to put everything in place, abide by it, maintain it, and fight the legal battles over this.
Matthew: Well, it sucks for them. That's their own fault.
David: I mean, well, the thing is, you can say it's their own fault or whatever, but that just means that either you're going to get fewer products or the products are going to be more expensive. So by punishing the company, you're punishing the consumers. And I mean, it may make you feel good,
forcing them to do this or forcing them to do that, but they still have to run as a business. You may cut into their profits a little bit, but ultimately they're going to try to maintain their profit level; otherwise the shareholders' money goes somewhere [00:43:00] else.
So they have to respond to that, and that means cutting back on services, raising prices, et cetera. So indirectly, you're still hurting everybody by trying to punish the companies and force them to spend more money on this regulatory bull****. I think if,
Matthew: is our word of the week, ladies and gentlemen. Regulatory bull****.
David: Yeah, personally, I think we would be better off if there were fewer, or no, regulations, and there was more money and time spent on enforcement against actual harm, with harsh penalties for those who cause it. So, like I said, if a company kills somebody, kills a dozen people because of purposeful decisions made by that company, you put that company to death. You break them up, you sell off all their things.
The shareholders, they get nothing, right? Their creditors get all the money. The government takes the profit, or whatever money was made from breaking up the company, and that [00:44:00] incentivizes companies not to do those things. Because that destroys the company. Those officers are now unemployed.
The executives are unemployed. The shareholders don't have shares, they get no dividends, et cetera. Instead of saying, well, we're going to punish you up front and try to prevent you from doing horrible things, you say: if you do a horrible thing, we're going to punish you severely, to disincentivize other companies from doing that horrible thing.
Matthew: That's actually an interesting point about where the money from the company goes. What happens to the extra after the creditors have been paid? Does the government keep it?
David: Well, I mean, it depends on how you... I mean, I've never seen a company put to death, so I don't know exactly how you would work it out. But you could do that. What you do is you say, okay, the company's being put to death; the people who were harmed by this get all the money that's made from the destruction of the company.
Matthew: Split amongst them, maybe. Yeah.
David: Instead of a class-action lawsuit, you know, if you kill a dozen people [00:45:00] and their 50 family members sue, when the company's put to death, those 50 family members get all the proceeds from the destruction of that company, selling off its assets and everything like that.
Matthew: Unlike what you talked about before, that would make it worth hunting down companies, because that could potentially be a lot of money.
David: Yeah, I mean, something like that. It's almost like a letter of marque and reprisal kind of thing. But I think, you know, everybody would be better served by that kind of mechanism instead of this regulatory mechanism, which tries to prevent companies from doing anything horrible. It's like they said in Dune:
if you can destroy a thing, then you can control a thing. So the way that America works is, if you can regulate a thing, you can control the thing. It's like all the threats against Facebook and Twitter and all those social media companies: the government was threatening them if they didn't censor.
And that court case going before the Supreme Court now [00:46:00] is about that: their ability to regulate is their ability to influence and censor and control. So if you get rid of their ability to regulate, that eliminates the government's ability to regulate and control businesses, and then businesses can act freely and not have the government basically tell them how to operate, like they do in a fascist society. We'd actually work in a free-market society. But if they did something horrible, then the courts punish them to the commensurate level of whatever horrible action they did. I think we'd all be better off in that kind of world than the one we're living in today.
Matthew: One day we can hope. All right.
David: Unfortunately, I doubt we'll see it. Maybe someday, but I don't know if we'll ever see it. But the bottom line for this whole thing is, as usual: it sounds like they're trying to be helpful, but the thing you have to keep in mind is the government doesn't do anything that doesn't give them more power. They don't care about you or your data. This is going to be a [00:47:00] political weapon. And the government, who can collect and use your data to throw you in a cage or kill you, doesn't have to follow this. And lastly, it's going to hamper development of algorithms and AI that can be beneficial to people and businesses.
So overall, this is going to be a net negative, I think.
Matthew: Tell me how you really feel because I don't think
David: I'm sorry, I shouldn't be holding back. You're right. I should be more upfront about my feelings. That's what my shrink says.
Matthew: I can't see you going to a shrink.
David: But, so, FYI, get ready. I don't know if there's anything you can really do about this proposed thing. I suppose you could write your Congressman. Good luck with that.
Matthew: What's that George Carlin joke? Even in a fake democracy, people should sometimes get what they want. What do you think about the chances of this passing? Do you think it has a high chance or a low chance?
David: I think it's got a pretty good chance, because it sounds like it's got really good language in it. So we'll see; we'll have to keep track of it as it progresses through Congress. But one [00:48:00] thing I would say: if this does pass before the Republicans take over, you can guarantee that Twitter is going to be in the crosshairs of it.
Matthew: I did see that Twitter is apparently pissing off the Brazilians
David: Oh yeah. They're probably gonna pull out. I'm guessing. Yeah. I mean, the worst part about that whole situation is the people in Brazil who work for Twitter are probably going to go to jail.
Matthew: And, yeah, of course, the folks here in the U.S. can just happily thumb their noses while the employees in Brazil are like, no, no, stop. What,
David: Bolsonaro to go **** himself or whatever, but ultimately the employees that work for Twitter in Brazil are going to pay the price.
Matthew: It always does. It's never the people on top.
David: Yeah. Well, it's the old, the old joke about when elephants make war, only the grass suffers.
Matthew: Oh, and that lovely, that lovely thing. Lovely bit.
David: Well, that's the only article we had for today. So thanks for joining us, and follow us at [00:49:00] SerengetiSec on Twitter, and subscribe on your favorite podcast app.