Redefining cybersecurity strategies with Bruce Schneier

August 20, 2024

What does it take to stay ahead in the rapidly evolving cybersecurity landscape? In this episode of SecureTalk, Bruce Schneier dives deep into his latest book, “A Hacker's Mind: How the Powerful Bend Society's Rules and How to Bend Them Back”, and explores the intersection of technology, security, and innovation. Get ready to rethink your strategies and embrace new ways of thinking. Listen now!


Book: "A Hacker's Mind": https://www.schneier.com/books/a-hackers-mind/

Essays and articles mentioned in this episode:

"The Psychology of Security": https://www.schneier.com/wp-content/uploads/2015/08/paper-psychology-of-security.pdf

"A Bold New Plan for Preserving Online Privacy and Security": https://www.schneier.com/academic/archives/2023/12/decoupling-for-security.html

"The Hacking of Culture and the Creation of Socio-Technical Debt": https://www.schneier.com/blog/archives/2024/06/the-hacking-of-culture-and-the-creation-of-socio-technical-debt.html

Full transcript

Secure Talk - Bruce Schneier 

Justin Beals: 

Hello, everyone, and welcome to Secure Talk. We're really glad to have you with us today. We have an exceptional guest I'm very excited to chat with: Bruce Schneier, an internationally renowned security technologist. Bruce is the author of more than a dozen books, including his latest, “A Hacker's Mind”.

Bruce is a Lecturer in Public Policy at the Harvard Kennedy School and a board member of the Electronic Frontier Foundation and Access Now. He's also the Chief of Security Architecture at Inrupt, Inc. Welcome, Bruce, to the podcast. Thanks for joining us today.

Bruce Schneier: Thanks for having me. 

Justin Beals: I just want to highlight a couple of ways I got to know your work. I started with your website, which is phenomenal. You have a ton of essays and blog posts up there, your academic research, as well as books like “A Hacker's Mind”, which you recently released. It's a prodigious amount of work, exceptionally authored and very insightful. So my gratitude, Bruce, for all of the information you're providing.

Bruce Schneier: How I find the time, or?

Justin Beals: That's right, how do you find the time? I imagine you're quite busy.

Bruce Schneier: You know, it's funny; for me, writing is figuring it out. When I have an idea and I want to understand it, I start writing. And that process of writing is the process of learning. So that's how all of these essays and books pop out. They're really part of my intellectual process, which I know is not normal. But it is the way I do things.

Justin Beals: I have a little background in education, and that concept of synthesis at the end of learning, although I hated regurgitating essays, was probably pretty valuable.

Bruce Schneier: But for me, writing is easy. It is not hard work; it is enjoyable. And it is not the result of synthesis. It is the process of synthesis. And that's what makes me different, I think, as a writer from the typical tech writer.

Justin Beals: Well, certainly along with your ability to express ideas comes a deep understanding of computing, security, and privacy. I'm always curious how our guests got interested in security and computing. Were there any early influences?

Bruce Schneier: You know, I've been asked that a lot. I don't have a good origin story. I've always been interested in security. I remember as a kid studying cryptograms, and that was something I took with me.

I didn't do anything with it in college, but it was part of my early jobs. Then, in the early nineties, I was laid off from a job and started trying to make a living as a freelance writer. I wrote articles and wrote books, and it was a crazy time. Those computer weeklies I remember, the big-format weeklies, would pay a dollar a word in the early nineties.

That's more than they pay today, and not just adjusted for inflation. So you could make a good living writing an article a week, and that's what I did. Then I turned that into books, and then consulting, and now it's decades later, and here I am.

Justin Beals: Well, why don't we dive into some of this work that I really enjoyed. The first question I have is from your essay, “The Psychology of Security”. We've had a number of guests talk about social engineering hacks and that particular attack surface, and certainly one of the really interesting things I've learned is the shift in healthcare data breaches toward essentially social engineering hacks and releases of patient health information.

What are your thoughts about how businesses and individuals can better align security practices with human psychology to enhance their security posture? This is a tough area of human engineering, I think.

Bruce Schneier: It's tough, but it's not. I mean, there's an enormous literature on user interface design for security, and the problem often isn't a lack of technical knowledge; it's a lack of willingness. A company like Facebook doesn't want you to have more security. It wants you to give more data to Facebook. So it's going to build its interface to nudge people in that direction. Nudging people toward security requires wanting them to have security.

And that is, first and foremost, an economic issue: security is not what companies tend to want for their customers. There are exceptions, and where there are, we are really good at interface design, building interfaces where people understand what they're doing and get good security as a result.

It's taken a bunch of years. I mean, you remember all of the, I'm going to pick on Microsoft, security warnings that were a whole mess of text and a button that said OK. The text was incomprehensible to an average user. And what are they going to do, not click OK? That wasn't written for security, right?

That was written by lawyers so that the user clicks OK and therefore can't sue us. Right? And we could do a lot better. We know a lot about warnings, and there are principles: they have to be actionable, and they have to give the user knowledge they understand. Go look at the current warning Firefox shows when you go to a website where the certificate is expired.

It lays out the problem. It says: here's a bit of information that you, as the user, might have that we, as Firefox, don't. Here's how you should respond. We're giving you the knowledge you need; now go make a choice. And the choice is biased, the buttons are, so that by default you don't click the dangerous thing.

Justin Beals: Yeah. 

Bruce Schneier: But if you want to do the dangerous thing, you can. That's a way better design of a security dialogue. And that's just one example, right? How defaults are set, what color the buttons are, where they are, how hard the configurations are to find. Apple tends to be a lot better at the privacy configurations in your browser than Google is, basically because Apple doesn't make its living spying on you; Apple makes its living selling you overpriced electronics. So they have a different incentive. But largely, we have the user interface design, we have the psychologists, great research over the past couple of decades. What we lack is the business incentive.

Justin Beals: It's interesting, because I felt like this was a thread in some of your writings. Something I like about what you're expressing is that we, as computer scientists, or as the broader community building this software, could do better work and create better incentives, not perverse incentives, for the outcomes that matter to our customers and our communities.

Bruce Schneier: And that's often long-term thinking. The problem with security is that it doesn't pay off in the short term, so it gets axed in budgets. Right now we're recovering from the CrowdStrike failed software update, right? There are a lot of questions we're going to ask of the company. Why didn't they do better testing?

Why didn't they have a phased rollout? As engineers, we can invent all of the things that should be done, but every one of those comes with a price tag. And if there's no problem, cutting them makes a lot of sense; but when it fails, it fails badly. We could also talk about CrowdStrike's customers. I was stuck in Delta Air Lines hell over the weekend because they were overly dependent.

Why don't they have redundancies? Why don't they have phased testing for the updates they accept? Why don't they do all the engineering things for security and reliability? Again, those are expensive and don't directly affect profitability, so they get axed. We've built a system that is so lean, so just-in-time, and so brittle that it is maximally profitable until it breaks.
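To make the phased-rollout idea concrete, here is a minimal sketch in Python of the kind of staged deployment Bruce is describing. The cohort sizes, the deploy_to and healthy stubs, and the health threshold are all hypothetical illustrations, not any vendor's actual pipeline.

import random

COHORTS = [0.01, 0.05, 0.25, 1.00]  # cumulative fraction of the fleet per phase (assumed sizes)

def deploy_to(hosts, update):
    # Placeholder: push the update to each host in this cohort.
    for host in hosts:
        host["version"] = update

def healthy(hosts, threshold=0.99):
    # Placeholder health check: did enough hosts in the cohort survive the update?
    if not hosts:
        return True
    alive = sum(1 for h in hosts if not h.get("crashed"))
    return alive / len(hosts) >= threshold

def phased_rollout(fleet, update):
    # Widen the blast radius only after each smaller cohort proves healthy.
    random.shuffle(fleet)
    done = 0
    for fraction in COHORTS:
        cutoff = int(len(fleet) * fraction)
        cohort = fleet[done:cutoff]
        deploy_to(cohort, update)
        if not healthy(cohort):
            return "halted at {:.0%}: roll back".format(fraction)
        done = cutoff
    return "rollout complete"

fleet = [{"id": i} for i in range(1000)]
print(phased_rollout(fleet, "7.11"))  # "rollout complete" only if every cohort stays healthy

The design choice is the one Bruce names: a bad update halts at one percent of the fleet instead of reaching all of it at once.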

Justin Beals: It was interesting to me, since we're on the CrowdStrike topic, that CrowdStrike had special access to the operating system and the ability to change it with a simple push to every Microsoft computer assigned to it. It just seemed like there was almost collusion. And of course, Microsoft used the moment to point out that, in their view, that access existed because the EU had regulated them into giving people access to their systems.

Bruce Schneier: Yeah, they're protesting too much. The real issue is, again, economics. The cheapest way to build these systems is with as much access as possible.

The most profitable way to exist as a company is to get as much access as possible. The incentives are economic and deep, and we can talk about the surface issues, but the real issues are these economics. If you are a company like CrowdStrike, you want all that access; it makes you stickier.

It makes you more of a monopoly; it makes you more essential. So that's the direction you're going to go, and it is not the safest. I'd rather see a regulatory environment where they can't get that kind of access, but that's not what we're going to see.

Justin Beals: Yeah. It's either the carrot or the stick, and I think we're feeling the stick a little. This concept was also prevalent in another one of your articles, “A Bold New Plan for Preserving Online Privacy and Security”: decoupling our identities from our data and actions could safeguard our secrets. We're talking here about the idea that we don't want one group to have access at that level of the operating system. We need to decouple these systems on some level.

Bruce Schneier: You know, I'm glad you brought up that paper. I'm really proud of it; it's something I recently wrote. And it is very much an old idea in cryptography and computer security to not put all your eggs in one basket, to decouple the algorithm from the key, so that if someone gets the algorithm, they don't break your system. It is a resilience measure, right?

We want to decouple the encryption we use for storage from the encryption we use for transit, so one doesn't break the other. There's a lot of decoupling we can do that we don't. We think of it narrowly in security, but really we can decouple data from algorithms, from processes, from applications. We could decouple authentication from applications, and some of that we do: if you think about authenticating with Apple, with Google, with Facebook, that is a decoupling mechanism. When we put our data in the cloud in a secure enclave, we're decoupling the processor, which is owned by another company, from the processes running on it, which are controlled by us, because we control the enclave.

And this kind of thinking gives us much more robust security. We're moving toward a world with a lot of centralization, and some of it happens sort of by accident. Access to your email, for example, is often root access to everything you have, because pretty much everything you have has a reset-your-password-by-email feature.

And there are horror stories of that. Someone hacks into your Google account, and suddenly they've changed the password on everything you own. Or someone grabs your iPhone in a bar when you have it open, runs away, locks you out of it, and now at leisure can attack everything else. Decoupling gives us ways of thinking about fixing those sorts of issues. I think it's a paradigm shift, and I urge your listeners to read the essay. Please put the link in the show notes, but really, search for "Schneier decoupling" and you'll find it. We talk about a new way to think about security, one that I think is just more resilient, more robust, more suited to the way we run the world today.

Another example: I'm part of a company called Inrupt, which is trying to commercialize Tim Berners-Lee's Solid project. Solid is basically decoupled data storage. It allows you to decouple your data from the applications that use that data, giving you a lot more control and a lot more security over the data.

In addition to being generative, new applications can appear, it's more reliable, you can share data better, a whole lot of benefits. But it really is the decoupling that makes me excited about it.
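Here is a minimal sketch of one decoupling Bruce mentions, separate keys for data at rest and data in transit, so that compromising one layer does not break the other. It uses the Python cryptography package; the key names and the flow are illustrative assumptions, not taken from the decoupling paper itself.

from cryptography.fernet import Fernet

storage_key = Fernet.generate_key()  # held by the storage layer
transit_key = Fernet.generate_key()  # negotiated per connection

def store(plaintext: bytes) -> bytes:
    # Encrypt for long-term storage with the storage key only.
    return Fernet(storage_key).encrypt(plaintext)

def send(stored_blob: bytes) -> bytes:
    # Wrap the already-encrypted blob for transit with an independent key.
    return Fernet(transit_key).encrypt(stored_blob)

# An attacker who learns the transit key sees only storage-layer ciphertext;
# one who compromises storage never sees transit traffic. Neither break
# alone exposes the plaintext.
wire = send(store(b"patient record"))
blob = Fernet(transit_key).decrypt(wire)
print(Fernet(storage_key).decrypt(blob))  # b'patient record'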

Justin Beals: Yeah, I was struck by two things in the article. First, as someone who has designed technology architectures for software deployment, taking security into account in the architecture is not something I was ever incented to do. Of course, that's not always what the business sees as efficiency at the end of the day. But it's not a hard concept to ingest, right? We already do a lot of decoupling between software layers and across horizontal deployments. We could just be better about the separation of information.

Bruce Schneier: Better, and also more deliberate about it. Some of our decoupling is almost incidental. Yes, software involves a lot of decoupling just because that ends up being cheaper; it's all built on libraries. But we have to decouple smartly. You don't want to make it so that everything is critical. The CrowdStrike issue illustrates that there are hundreds of companies these networks rely on, any one of which, if it fails, is fatal to the network, right?

What you want is a system with hundreds of companies where you'd need, say, half of them to fail before the network fails. You get resilience from the decoupling rather than brittleness. This is the whole point of that very famous xkcd cartoon that gets posted whenever one of these obscure random little companies turns out to be critical.

You'd never build a house that way. We'd never build an airplane that way: if any one part breaks, the airplane falls out of the sky? That's ridiculous. You build it so that if any five parts break, the airplane still flies.
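Here is a toy illustration of that resilience property: a brittle design fails when any one supplier fails, while a quorum design survives until half are down. The supplier names and the threshold are made up for the example.

def single_point_of_failure(suppliers):
    # Brittle design: every supplier is critical.
    return all(suppliers.values())

def quorum(suppliers, needed=0.5):
    # Resilient design: the network survives while a quorum is healthy.
    up = sum(1 for ok in suppliers.values() if ok)
    return up / len(suppliers) >= needed

fleet = {"vendor_a": True, "vendor_b": False, "vendor_c": True, "vendor_d": True}
print(single_point_of_failure(fleet))  # False: one outage takes everything down
print(quorum(fleet))                   # True: three of four is enough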

Justin Beals: Well, I think this is also similar to the Change Healthcare issue we've seen, right? We had a highly interconnected data source for processing all kinds of medical transactions suffer a failure.

Bruce Schneier: And again, this goes back to economics. Change Healthcare has made themselves a monopoly, so that if your pharmacy fails, the pharmacy across the street fails too, because they're using the same data source. Again, we have this brittleness, and it's not the tech; it's the economic incentives that lead us to this particular type of tech, and to failure. Consolidation, monopolization, these are big problems that are affecting security. In the United States over the past few decades, antitrust has really been gutted, so these companies are being allowed to get dangerously large. Too big to fail, right? That phrase is a security risk right there.

Justin Beals: I think we've only seen that couched in economic terms, network terms, not in data and infrastructure terms, because it's business. Yet we have only...

Bruce Schneier: You know, we techies all know what too big to fail means in the tech world.

Justin Beals: The other thing that struck me about the article, Bruce: I've not been a huge fan of blockchain tools, or their most recent applications, I should say, in the financial space.

Bruce Schneier: You know, I think blockchain is the stupidest thing in history.

Justin Beals: I might be right there with you because I keep complaining to people that this is just a database, right? 

Bruce Schneier: Right. It's a database. It's a consensus algorithm. We know how to do these things. If you remove the blockchain from any application, it tends to work better. It provides no value and adds so much insecurity, so much unreliability. Just a fricking disaster.

I get that the tech bros like to stick it to the Fed, and it's all libertarian paradise, but no, it doesn't actually work.

Justin Beals: Although I did think, in setting up what you describe as secure enclaves, especially as we're handing off data and want to anonymize some of those transactions but still be able to track them, that these might be the kinds of technologies that could glue some of that together.

Bruce Schneier: And blockchain doesn't do that; blockchain is highly centralized. We have a lot of good decentralization technologies that we can use for decoupling, but blockchain is certainly not one.

Justin Beals: Okay, good. Well, you also mention in the paper, and I think it's a great quote, that we need a belt-and-suspenders strategy: government policy that mandates decoupling best practices, a tech sector that implements the architecture, and public awareness of both the need for and the benefits of the way forward.

I'm curious about the public understanding of security issues. It has sometimes been a struggle for the public to understand what we're in for, and you've obviously been a big voice in advocating for better understanding. What do you wish you had as a tool set to help the public understand these concerns better?

Bruce Schneier: Yeah, it's interesting. I'm pretty specific about what I want. I don't want a public that needs to be technology experts. We have to build a society where non-experts can live and work and operate and not get into trouble. Just like you can walk into a drugstore without being an expert in pharmaceuticals and still buy things over the counter and not needlessly kill yourself.

We do need understanding about policy. Now, as much as we can talk about different policy needs and things that would work and would help, this is not treated as a political issue. There will not be a question in the presidential debates about this. This will not be something candidates have policies about, except at a very cursory level.

And that is really unfortunate. I want this to be an issue that people care about and campaign on, because that's how we're going to get change. We're not going to get it any other way. So what do I need? I need this to become important, important in a way that is going to matter.

Whether it's important through a national security lens, right, these things are a domestic threat, or through a public safety lens, a privacy lens, a consumer choice lens, an economic lens: these are all different areas that the things we're discussing affect, but they're not really talked about that way.

Justin Beals: I think the importance issue is hampered by a couple of different things. If I think especially about the algorithmic nature of data today, and how big an influence data brokers like Meta have on our impression of what reality is, I worry about this even being exposed, just because it runs counter to their entire business model.

Bruce Schneier: But they also have a lot of money, and they're deploying that money to make sure the laws don't change. This is the United States; money in politics is a huge thing. It makes a big difference in what happens.

Justin Beals: Well, it feels a little bit like we're back in the days of the railroad barons, or the newspaper barons, with things like the recent purchase of Twitter.

Bruce Schneier: Right. And again, we're back to monopolization. If there were a hundred social networks that were all interoperable, if it started to look more like email, it would be a very different world.

But when you have a few big companies controlling everything, you get something very different.

Justin Beals: What are your thoughts on the Fediverse, the Mastodon-style organization?

Bruce Schneier: You know, I don't know Mastodon in detail, but that idea... email is a perfect example of how you do distributed communications: we all have our own email provider, and our email just interoperates.

And you can imagine a social network like that. There are hundreds of social network providers, and you subscribe to them. A feed looks a little bit like RSS: you can subscribe to other people's feeds, build your viewer how you like, set your algorithms how you like, and then you are much more in control over what you're seeing, how it looks, and how it works.

We can do this. There's nothing technically prohibiting it. There are a whole bunch of companies, though. I mean, if you tried to do this and tried to link Facebook into the system, they would sue you out of existence. So again, we are back to the economic incentives.
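A small sketch of the RSS-like model Bruce describes: you subscribe to feeds from many independent providers and rank them with an algorithm you choose, rather than one a platform chooses for you. The feed URLs and the fetch_feed stub are hypothetical; a real client would fetch and parse actual RSS/Atom or federated feeds.

from datetime import datetime

def fetch_feed(url):
    # Placeholder: a real client would fetch and parse the feed at this URL.
    return [{"source": url, "text": "hello world", "posted": datetime(2024, 8, 1)}]

SUBSCRIPTIONS = ["https://alice.example/feed", "https://bob.example/feed"]

def newest_first(post):
    # One user-chosen ranking; swap in any algorithm you like.
    return post["posted"]

def timeline(subscriptions, rank=newest_first):
    # The viewer, not the provider, decides how posts are ordered.
    posts = [p for url in subscriptions for p in fetch_feed(url)]
    return sorted(posts, key=rank, reverse=True)

for post in timeline(SUBSCRIPTIONS):
    print(post["source"], post["text"])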

Justin Beals: That's right. And I think this is a great line, because it leads to one of the other articles I quite enjoyed.

Tech debt is something we've talked about for a long time, in my past roles as a CTO and even as a software engineer. You have a recent article titled “The Hacking of Culture and the Creation of Socio-Technical Debt”. I want to start with one of the premises of early internet culture, one I certainly remember and was deeply bought into: that all information wants to be free. I'm curious whether you still believe that holds true today.

Bruce Schneier: You know, I'm not sure it was ever true. It's a very technical statement. 

Justin Beals: Yeah. 

Bruce Schneier: You know, John Gilmore's a friend of mine, and I know what he meant when he said it: that on a technical level, data tends to flow; it tends to flow out. Whether it "wants" to is less relevant in a world where laws matter. It doesn't matter how much data wants to be free if it's illegal for it to be free.

It's not going to be free except in edge cases where someone steals it and publishes it. What that means is that once someone hacks your data and publishes it, you can't get it back. That is a factual statement, not a moral statement, because it can be both good and bad. When AT&T leaked pretty much everybody's everything, that was really bad. But when a leak is used for government accountability, that is good. Really, the social layer matters a lot here. And it is not hard to build walled gardens. The New York Times has done a great job making sure its information is behind a paywall.

And regardless of what the information wants in any abstract sense, their business model works. They might struggle financially, but they're not struggling technically, even though I can cut and paste a New York Times article and send it to you, and you got it without paying. In the main, that doesn't matter.

Justin Beals: One of the things I found very insightful in this article is that you identify this almost as a transfer of power. There was this pretext that we were building platforms that allowed people to connect and share content, but that facilitated a power shift: the state used to be the most knowledgeable entity about its population, and now corporations are much more knowledgeable about the population.

Bruce Schneier: And that's been interesting to watch, right? The biggest surveillance organizations are private companies: Google, Facebook. And a lot of these companies have governments as customers.

As we move our world online, and this is a generic trend, computers generate data as part of their processing. That data is inherently surveillance data; it's information about what just happened. The cost of data storage drops toward free, the cost of data processing drops toward free, and companies save stuff they used to throw away. Now all of this surveillance data, your location history, your purchasing history, your conversation histories, these very intimate things, is being collected by companies as a matter of course, simply through what they're doing.

Justin Beals: Yeah, and I believe they've learned to weaponize it, algorithmically, in a way that changes our fundamental perception of reality.

Bruce Schneier: Yeah, "weaponize" is a pejorative, though I would use it too. They would say they're not weaponizing it; they're using it. They've built a business around it: we're giving you all of this free stuff because we are collecting this. There is some quid pro quo here, right?

But weaponizing is not a bad term, I think. They are certainly using the data in ways that we didn't imagine and are largely ignorant of. People know their data is being collected. Maybe they know it's being used to train AIs or to serve them ads. But I don't think people fully understand what their data shadow looks like and the kind of information it reveals about them.

Justin Beals: I think we want to see ourselves as actors independent of the environment, the context in which we operate. And we're not. We fundamentally make decisions based on the information that's served to us, as opposed to what we went out and found.

Bruce Schneier: And it turns out that the ability to curate is incredibly valuable.

The ability for me to determine what you see. Companies like Google and Facebook have that, and it affects what you think. It also affects what you think others think. You see this in people's belief that Twitter reflects the world, right? What they see on Twitter is what people think.

Twitter turns out to be an extraordinarily skewed slice of the population, but we don't think of it that way. And if we live in a world where the things on social media are written by AI, which basically means they're written by companies, by money using AI to generate content, that affects what we think.

It also very much affects what we think others think; it affects what we think the conversation is. That, I think, is very worrisome.

Justin Beals: I mean, the point with AI is the scale at which they can build that information, and the specificity at a granular level: what you as an individual see can be slightly different from what another individual sees.

Bruce Schneier: Which is already true, right? That isn't new; that's your algorithmic feed, which was invented in early social media. But it can become much more finely grained and much more precisely personalized.

Justin Beals: Yeah. My struggle with the current hype, a little bit, is that if I boil it down, yes, we've invented a better user interface: natural language processing engines that can write in a way you'd expect from a human rather than a computer.

But that's as opposed to any real major shift in AI broadly, right? I don't see a new mathematical algorithm that we didn't understand in a prior period driving this innovation.

Bruce Schneier: You know, I don't know. I think about this a lot. What we know about tech changes like that is that they are discontinuous.

So we're not going to see it before it comes; that's not something that will be predictable. So yes, I think that's what we get.

Justin Beals: Okay. So, a little lighter topic here at the end. Well, maybe not; we seem to be good at diving into both sides of the coin, Bruce.

I got a chance to read “A Hacker's Mind” and really enjoyed it. One of the things I found really intriguing at the opening is that you describe kids as natural hackers. I wonder if you'd relate to us how their behavior exemplifies hacking for you.

Bruce Schneier: So in this book, I'm generalizing the term hacker.

We know it from computers, from the internet, and to me, hacking is rule-breaking in a very specific way: it is finding a loophole in the rules, finding something the rules allow but that is unintended, unanticipated by the designer. It's easy to think about in computer code: there's a mistake, and it lets you get some permissions. But you can also think of tax loopholes.

Right? It is a loophole; it is an unintended consequence. In my book, I talk about hacks of the rules in sports. Anytime you have someone wanting an advantage, they're going to look at the rules and figure out a trick: the rules allow this, even though you never thought they would. And kids do that naturally.

If you have kids, they're always trying to figure out, well, you said bedtime was nine o'clock, but you didn't say this part. Or: you have to eat your vegetables; chocolate's a vegetable, I just looked it up, it comes from a plant, it's a vegetable, what are you talking about?

So that is hacking. I think kids have a different relationship with rules than adults do; they don't really think of them the same way. They are natural loophole finders. And I find it wonderful. What I enjoy most about kids is their ability to break rules without even realizing it.

I have some stories, and in some of them the kids knew what they were doing. A kid of a friend of mine realized, during COVID, in their Zoom classroom, that they could change their screen name to "Connecting..." and then turn their camera off. I mean, that's frickin' brilliant.

Justin Beals: Yeah, that's really good. The cultural motif has appealed to me. Growing up, I learned to program by playing with computers and figuring out what I could do with them. And one of my favorite activities to this day, although I don't know how much longer I'll be able to keep it up, is skateboarding, which for me is the act of hacking a context, or an environment: figuring out what can be done.

Bruce Schneier: That's right, because you're using it in a way it was never intended to be used. And actually, in some ways, the people who designed the environment would prefer you didn't use it that way.

Justin Beals: That's right. There are all kinds of little features placed in architectural environments that, I know, are there to limit skateboarding.

Bruce Schneier: That's right. We put little bumps here and there; those are security systems designed to prevent skateboarding. All right, I will think about this. I like this extension of my metaphor.

Justin Beals: One of the things I never realized about hacking, which you correctly point out, is how pervasive it is.

One of the posits in your book is that the most prolific hacking is something the rich and powerful do, something that reinforces existing power structures. I thought that was an important example to present: hacking is a tool, and it can be used for good or bad.

Bruce Schneier: So I'm using hacking as a technical term, so it's sort of value-neutral. But even in the computer field, the most effective hackers are the governments, right? The NSA is way better at hacking than the cybercriminals are because they just have bigger budgets. Hacking takes resources.

Even more generally, think of something like a tax loophole. If I discover a new tax loophole, maybe I save a few hundred dollars; maybe, if I'm lucky, I save a few thousand dollars on my taxes. Goldman Sachs discovers it, they save millions, and they sell it to their customers. They just have more raw power. They also have more tax attorneys who can read through the tax law looking for the loopholes.

So they're better able to find the loopholes, and better able to profit from them. And this is largely true in general. We think of hacking as this countercultural thing, but if you really think of it as rule-breaking, it is something the rich and powerful do more. If you are Uber, you break all the rules about taxis. If you're Airbnb, you break all the rules about hotels.

And you make a fortune doing it. I mean, I could break those same rules, right? I could make use of those same loopholes, and I'm not going to make nearly as much money.

Justin Beals: Your book is great. The chapters literally highlight different hacks that have happened, sometimes on a large scale. 

Bruce Schneier: It was super fun to write. I really enjoyed it.

Justin Beals: It was very enjoyable to read. And I think what you get from looking at all these stories about different hacks is almost a mindfulness about when there's an opportunity to break a rule and when there isn't. Bruce, one other thing I found in the book that I thought was phenomenal, and it goes back a little to how technologists like myself can be better architects of good security, is the chapter on security by design. I think these principles are commonly understood but maybe not often shared. Would you share with us the four major areas? I'm happy to relate them in case I don't remember them all: simplicity, defense in depth, compartmentalization, and fail-safe.

Bruce Schneier: It's funny, and this goes back to the stuff we talked about at the beginning of this podcast with CrowdStrike. These are all technical measures that we, as security people, have developed to aid security. Some of them are ancient; you can see them in castle design. And we just need to do them.

Justin Beals: When I consider the CrowdStrike situation, I thought the fail-safe one was perfectly applicable.

Bruce Schneier: That's right. They didn't fail safe; they failed terribly.

Justin Beals: That's right. 

Bruce Schneier: And the word to use is brittleness.

Justin Beals: Yeah. The next chance I get to design a technology architecture, I'm excited to bring these four concepts into it; they help us build software that's better to maintain and innovate on top of. In my mind, simplicity means more people can work on the code. Compartmentalization means we can innovate in one area of the system without affecting others. Defense in depth means we're managing our data more cleanly. And on the fail-safe side, I haven't designed much of my software to fail safe. We always say we'll just buy more servers, but...

Bruce Schneier: But that is right. That's redundancy. Buying more servers is a security mechanism, right? I mean, it's the security mechanism lobsters use for their offspring: make a million eggs, and some of them will survive. It's a pretty reasonable strategy.
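A minimal sketch of the fail-safe principle discussed here: when the security check itself breaks, deny rather than allow. The policy_lookup stub is hypothetical and simply simulates an outage.

def policy_lookup(user, resource):
    # Placeholder for a real policy store; here it simulates an outage.
    raise TimeoutError("policy service unreachable")

def is_allowed(user, resource):
    try:
        return policy_lookup(user, resource)
    except Exception:
        # Fail safe: an error denies access instead of granting it.
        return False

print(is_allowed("justin", "/admin"))  # False: the failure mode is safe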

Justin Beals: Well, Bruce, my last question is a little open-ended.

Bruce Schneier: As opposed to all these very close-ended questions you've been asking me all along.

Justin Beals: What are your thoughts on the future of culture and security? Maybe: what would you like to see changed, and what do you see changing that makes you more nervous?

Bruce Schneier: Yeah. I mean, what we're seeing is the integration of computers into everything. I mean, this is the Internet of Things, that computers are no longer things you interact with via keyboards and screens.

Those have become the minority of the computers in your life. And when that happens, computer security becomes everything security, and all of the everything systems we have in place take over: planes, cars, medical devices. These all have regulatory agencies that have nothing to do with computers.

They're going to have to grapple with the computerization of all of these fields. We're seeing that with AI; we're going to see it everywhere. I think that's the interesting thing to watch. That was actually my previous book, “Click Here to Kill Everybody”, on the Internet of Things and security. And AI is making all of that even weirder.

Justin Beals: Bruce, I'm very grateful for our chat today and for your works, which I got to read and really appreciated. I highly recommend them to our listeners, and we will include in the show notes some of the articles and books we've talked about today. Bruce, we wish you well, and we're very grateful for your time.

Thanks for joining Secure Talk. 

Bruce Schneier: Thanks for having me.


About our guest

Bruce Schneier, Lecturer in Public Policy, Harvard Kennedy School

Bruce Schneier is an internationally renowned security technologist, called a “security guru” by The Economist. He is the author of over one dozen books—including his latest, A Hacker’s Mind—as well as hundreds of articles, essays, and academic papers. His influential newsletter “Crypto-Gram” and his blog “Schneier on Security” are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press. Schneier is a fellow at the Berkman Klein Center for Internet & Society at Harvard University; a Lecturer in Public Policy at the Harvard Kennedy School; a board member of the Electronic Frontier Foundation and AccessNow; and an Advisory Board Member of the Electronic Privacy Information Center and VerifiedVoting.org. He is the Chief of Security Architecture at Inrupt, Inc.
