Privacy reforms we need now with James B. Rule
It’s easy to consider privacy as a technology issue, or a legal challenge. But our concepts of privacy have a lot to do with what type of community we would all like to live in. What happens when we consider privacy a right as opposed to a commodity?
Join us on SecureTalk for an in-depth exploration of the complex world of privacy with esteemed sociologist and author James B. Rule. In this episode, he draws parallels between past and present institutional power. He discusses his latest book, “Taking Privacy Seriously: How to Create the Rights We Need While We Still Have Something to Protect”, where he delves into the intricacies of privacy laws, the implications of personal data commercialization, and the notion of "personal decision systems."
James presents 11 practical privacy reforms, highlighting the importance of informed consent and strong data protection measures. This episode offers cybersecurity experts valuable historical context, actionable insights, and thought-provoking discussions on how to balance privacy with technological advancement. Join the conversation on how we can protect what truly matters.
Join us as we examine the challenges and potential reforms related to privacy in the digital age, highlighting recent legislative successes such as California's new privacy laws. James advocates for the establishment of national institutions dedicated to promoting privacy. He also discusses the ethical dilemmas faced by technology and policy leaders in striving to find the right balance between the utility of personal data and the protection of individual privacy.
This episode is essential for cybersecurity experts interested in privacy reform and the history of personal data usage.
00:00 Introduction to SecureTalk
00:32 Exploring the Complexities of Privacy
01:21 Introducing James B. Rule
02:56 James B. Rule's Journey into Privacy
06:55 Historical Perspectives on Privacy
09:10 Modern Privacy Challenges and Solutions
15:48 The Concept of Lawful Basis
23:59 Personal Decision Systems
26:26 Proposed Privacy Reforms
36:56 Public Events and Privacy Issues
42:55 Conclusion and Final Thoughts
SecureTalk: James B. Rule
Justin Beals: Hi everyone, and welcome to SecureTalk. This is your host, Justin Beals. Today we're talking about privacy. It is a complicated subject. Privacy can have a technology perspective, a business perspective, a legal perspective, and a philosophical perspective.
And the problem of privacy is made difficult to understand by the different perspectives with which we can approach it. Certainly, across the United States especially, and among colleagues and friends I talk to worldwide, there's a deep concern about what should be kept private, what is not private, who we give that data to, and what power they have to work with that data.
So this week, we have an exceptional guest, an expert in this space. I'm really excited to introduce James B. Rule. He is a sociologist and writer based in Berkeley, California. He's a Distinguished Affiliated Scholar at the Center for the Study of Law and Society, part of the UC Berkeley School of Law.
James is the author, co-author, or editor of nine books and monographs. The book we talk about today is his new 2024 book, “Taking Privacy Seriously”. It's an exceptional book. I highly recommend it. I found it very pragmatic about what a sociologist might like us to change in the way we operate as local, national, and global communities with our privacy and privacy rights.
His scholarly writings have covered a wide range of topics, including the role of social inquiry in promoting social betterment, the causes of civil violence and militancy, and the progress and development of social science. James continues to study and write about privacy and information, along with various other subjects, for both scholarly and general audiences.
His recent articles have addressed issues such as militarism in American foreign and domestic policy, banking regulation, Israeli expansion into the West Bank, and the theories of Cass Sunstein regarding the domestication of social movements. I'd love to welcome James to SecureTalk today. I hope you enjoy our conversation.
—
Justin Beals: James, thanks for joining us today on SecureTalk. Of course, what we're interested in right off the bat is how you got interested in privacy. We'd love to hear the story of that.
James Rule: Well, I was a graduate student back in the wonderful 1960s, and I was working with a wonderful historian, now no longer with us, Charles Tilly, who was studying the evolution of protest in France, of which there was a lot going on right at that time, and the role of popular participation in the creation of French government institutions: how did France get to be a modern nation-state, with the lines of authority and the lines of protest and contest and all of these things we associate with mature democracies?
And at the same time, I was a grad student at Harvard, so I was subjected to all of the student management processes that big universities now routinely impose.
So that, for example, in the middle of the day I would be studying some riot in rural France over an issue of taxation or of government authority, and in the evening I would be fussing with Harvard about my library fines and about whether I had enough credits to qualify for this or that. And I noticed that it was quite significant in my relations with Harvard that, though it was a big institution where the two parties, university authorities and university students, often didn't know each other very well, the university had a comprehensive list.
And if you got in trouble with one part of the university, you were likely to be in trouble with all parts. In other words, if you hadn't paid your library fines, then you would find that you were excluded from certain privileges and interactions that you might like. And so I began to think, well, this is really like what Tilly was studying in the 19th century, transformed into the 20th and soon the 21st century, where the access of government institutions to comprehensive lists of subjects, or citizens in that case, is rather like being a student at Harvard.
If you get in trouble with one part of the government, if you stop paying your taxes or you have an outstanding conviction for something, you will find that trouble arises in other parts as well. And this seemed to me to be a kind of generic thing about how big institutions grow: they learn ways of providing inducements, sometimes positive inducements, but certainly also negative ones, for those who don't fit in, in one way or another.
So in the daytime, as I say, I was studying fascinating accounts, mostly from French journalism, of how the government managed to become a central, imposing feature of the French people's lives, and in the evening I was jousting with Harvard, often about whether they could hold me responsible for something that happened when my card was in the machine, whether or not I was the one pressing the card.
And that began a long series of studies of present-day institutions: credit, policing, even things like social security, which is a way of binding people to the state, or binding you to your institution if you're in a modern corporation. And so I got fascinated with records, certainly with the abuse of records, and also with the power of personal record keeping for positive things as well.
Justin Beals: Yeah. I definitely remember in college missing paying a parking ticket and trying to figure out how to get my financial aid checks disbursed at the same time. It was a challenge. It's such an interesting perspective to come at the privacy problem from, you know, French politics and history.
And then seeing, I think, systems designed for an outcome. Was there a thread, historically, and I'm ignorant of this, through the changeover of governments in France specifically, of retaining information about the population in order to, essentially, govern them?
James Rule: Well, the French are very aware of their status with the police, and the police have enormous discretion.
A few years after my time as a grad student, I was a Fulbright lecturer in Bordeaux, not during '68 but in the late '60s, early '70s, and people in my department at the University of Bordeaux, on hearing of this research, would say, well, we noticed this too. We found that when we applied for government grants, the police had to declare your record with them vierge, virgin, or you had no possibility of getting the equivalent of a Fulbright grant, which is what I was enjoying there in Bordeaux.
And the notion that personal information flowed, sort of like water, as a kind of natural force without any constraint on what could be recorded and what conclusions could be drawn, was then just coming under attack. And it's been under attack ever since, not necessarily successfully.
Justin Beals: You know, I think due to the historical nature of the privacy issues you're describing, and even to the modern day with the systems that we have, and I loved your book, I'll just mention it here, “Taking Privacy Seriously: How to Create the Rights We Need While We Still Have Something to Protect”, I have this foundational question that I struggle with, James. I'm a consumer of privacy-invading technologies. I have a mobile phone. I have a Google account on Gmail. I'm aware of certain systems that I've decided not to use anymore because of the privacy issue. But I also develop solutions where we ask for data.
We store that data, and it's oftentimes about individuals, about people. How do you feel coming at this topic area? And you must have been working on it for most of your career now. Can we put the genie back in the bottle? I fear that there is no ability to walk things back, that we've come so far.
James Rule: Well, the latest book, “Taking Privacy Seriously”, is an attempt to countervail against this attitude in the following way: not to deny that there are excruciating trade-offs between life-giving and reinforcing uses of personal information on the one hand and soul-destroying and anti-democratic uses on the other, but to say, and this is my position, that these questions are much more political than they are technological.
They're like the Democratic Party or Republican Party platform. Who gets what options? What government interests or private interests get to decide what happens to personal information? And sometimes there are disappointments and sometimes there are victories for both sides. But I think we've got to get away from the idea that technology has issued us a series of orders and that our role as historical figures is simply to act them out.
I just don't believe it. We can get along with more personal information technology. We can get along with much less of it. And above all, we can make discriminating decisions about what options we're going to choose and which we need not choose.
Justin Beals: I love this discriminating-decisions perspective, because I don't think the technology fundamentally cares at the end of the day, right?
The database has no idea how big or small it is, or what decisions we make about how to use it. But the culture in which we operate dictates what data we believe we can store and how we use it. And to that point, there are privacy codes or laws that have been implemented around the world.
And I think one of my first questions for you, James, is: in your opinion, what's the current best privacy code out there? Who's got the best protections, and why do you like them?
James Rule: Well, I don't think I have a single answer to that question, but I can point to some significant victories. And I think recent legislation in California, although certainly limited in many ways, has the virtue of enabling ordinary consumers to opt out of collection of information that is not demonstrably necessary to make sales.
If you want a lot of credit from Tiffany's, then you probably will have to do something to reassure Tiffany before you open an account. But if you are simply an ordinary consumer, in the way that most of us are, you ought to be able to fly blind in that respect. You ought to be able to set aside your records after the records are compiled.
And you ought to be able to limit how much can be ascertained about you, even in the interest of making a good credit decision. And, of course, if protections of privacy are stronger, even at these initial stages of credit use, there will be more mistakes, and conceivably credit will become at least incrementally more expensive.
I think I would be willing, as a good democratic-minded, small-d citizen, to have a world, both in terms of credit and in many other respects, that works a little bit more haphazardly, in order to have a world that gives more second chances and doesn't create such tremendous imbalances of power between big institutions and ordinary consumers.
Justin Beals: Yeah, there's an acceptance that perfection is something we need to give up on, in a way. You can't have perfect data and perfect privacy; you're going to need to give up perfect data. To your point about the credit rating, there are a lot of algorithmic tools that we develop on top of data, and we believe that the more data we have, the more accurate the projections it makes will be.
And asking for less data, do we really lose that much precision in the projections? I've certainly worked on data systems where we lost a point of precision for a huge gain in a smaller database. You know, an 80 percent data reduction.
James Rule: We also have to ask, I think, whether justice is part of our goal. You can predict an enormous amount about people's behavior in credit, or in terms of crime: who is going to become a criminal, or who is going to be a trustworthy employee.
You can predict an enormous amount by associations. In other words, people who resemble people who have done a lot of bad things could probably be shown, in many cases, to be risky themselves. Is that a good enough reason for excluding people who resemble people who've been troublesome in the past?
That would be a depressing conclusion, because it would preclude evolution in positive directions.
Justin Beals: I want to take a second to applaud part of the way your book is constructed, and also bring forward one of the topics in it that I really liked. First off, I found the book to be very pragmatic about privacy reforms.
As a matter of fact, you discuss 11 primary privacy reforms that you would like to see implemented and what their impact would be. And as an engineer, I thought that was great. It helps me break it down and scope the outcome we're working towards. But one of them that I found intriguing, that I didn't understand the concept of, and I think I could still use some explanation, is lawful basis.
And when you talk about the size of the database and the amount of data we want to collect, you mention this concept of a lawful basis. What does that mean?
James Rule: Well, you're quite right. It's not an extremely precise specification, like the speed of light or the core capacity of your computer.
But it means a rationale that's consistent with publicly defensible values. And we don't like, and we shouldn't like, laws that constrain people's lives without some form of justification, which, of course, will never be as precise as a kit for assembling your new Google Chrome equipment, but which we can't get along without.
I mean, this is the nature of public life. Some people favor measures that other people would not approve of for the same ends. Most of us don't believe in flogging for shoplifting anymore; it seems like a degrading and retrograde step. But at least it's some kind of rationale that says, you know, shoplifting is a bad thing and something ought to be done about it.
In privacy matters, people, citizens, policy types, would-be philosopher kings like ourselves perhaps, have to ask the question: is a use, particularly a compelled use, of personal data morally or politically acceptable enough to warrant its imposition? A lot of privacy talk, a lot of privacy policy, has emphasized the importance of letting people who are subjected to data collection know the reason for it, examine the quality of the data, and challenge its accuracy.
But that's not enough, I think. Think of the shocking recourse to forced examinations of virginity in some countries as a precaution, a constraint against women's promiscuity.
Apply certain privacy rules to that, and you could say, well, every woman who experiences this and feels that it's unjust could appeal to the record, cite evidence, and demand an apology if it's deemed incorrect. But a prior question that does not get asked often enough in modern privacy wonkery is: is this form of data collection warranted in the first place? If you look at the now vast corpus of privacy laws in today's world, you see that most of them do not engage what I think of as this prior question. Is it warranted for any agency to demand a medical or psychological background analysis of people before they're admitted to certain sensitive roles?
Well, it may be efficient, but maybe that's a step too far for some sorts of privileges. And Europe has pushed this need for legal basis much farther and more explicitly than the United States.
Fortunately, not to the extent of the hair-raising examples that I just cited. But, you know, if you are subscribing to a magazine and somebody thinks that you might be a former sex offender and that it might be dangerous to let you see these materials, is it worth trying to investigate that area of people's lives in order to determine their eligibility to read pornography or quasi-pornography? I certainly would say no, and I hope that there's a lot of company for that position.
Justin Beals: This has come into pretty sharp contrast, I think. You mentioned that the EU is pushing this concept of legal basis quite a bit. We've had this particular recording scheduled for a little while, but just in the last week, LinkedIn received, I believe, a $334 million fine for violating GDPR.
And one of the areas was the legal basis for collecting some of the data they were collecting. But that's teeth; it's not a small amount of money. I mean, Microsoft has a lot; I'm not sure it really hit the balance sheet that hard, but yeah.
James Rule: We live in an intellectual culture that has very different and not necessarily consistent ideas about personal information.
On the one hand, for many purposes, we don't want to stop public communication about people, about personalities. We certainly don't want to curtail the stories that seekers of public office tell about themselves, because then what would happen to democracy? If you had no more mudslinging, well, we would not want to live in that world.
On the other hand, we live in a world where personal information is worth a lot of money, above all in aggregate. And I believe this will be an issue for the children of people listening now, half a century or even a century from now, when we won't be around anymore: how much can you predicate the conduct of institutions toward people on the associations of those people with other people?
In other words, say a company wants to sell only to a subset of potential customers, and that opportunity to be a client or customer is predicated on the resemblances, or lack of resemblances, of potential customers to previous customers. That sounds like quite a profound difference in privilege being allocated to people on the basis of associations.
But people in my sort of business, sociologists and market researchers, are increasingly ingenious at noting associations that nobody ever thought of and that nobody really has control over in their lives. So in the book I say: if somebody in the CIA discovered that secret agents, spies on the other side, were major and unusual consumers of a particular kind of toothpaste, then you would not want to use that toothpaste if you're wise, whether you have any subversive intentions or not.
And it worries me. Of course, I'm a member of a profession that does a lot of prediction on the basis of similarities, but we don't want those predictions necessarily to weigh too heavily on people's lives. That is a consideration of justice, not of efficiency.
Justin Beals: Yeah. You have a concept in your book; you call it a personal decision system. And I loved the nomenclature and the definition. Could you help describe for us what a personal decision system is?
James Rule: Sure. A personal decision system is everything from the taxation machine to the machine that produces your credit rating or your marketability rating for a product.
It's a system that makes decisions about you that matter on the basis of stored data, personal data, data either about you yourself or about people like you.
Justin Beals: And then we can use that concept of the personal decision system to think about all the different technologies that we load this data into, right?
And you would say, essentially, any personal decision system should get caught up in things like GDPR: how we store the data, and also what rights we have in interacting with the owner of the personal decision system.
James Rule: Yes, well, we don't want to live in a world where access to personal information is the last word as to how people get treated.
Some decisions about how institutions treat people need to proceed with a sense of justice and social necessity. But I want to emphasize what I think of as a kind of contradiction in our culture: we want to be able to talk about people for many purposes.
We want to weigh the lives of people around us and say, you know, has this person lived well or badly? We want to weigh the personal decisions people make in terms of their family lives or their business roles, and judge their character. And I think we can't get along in life without some such judgments.
But if it's big business, if compiling information about people on a large scale and selling it has a major effect on people's life chances, as sociologists call them, then I think we have to be very careful.
Justin Beals: Yeah. Of the 11 reforms or changes that you mention, there's another one I'd like to draw out, if you're open to it.
It's in chapter four, reform number six. You write that you would like to see us establish a new national privacy-promoting institution, with two coordinated web portals offering comprehensive information on the personal decision systems maintained by American organizations, both government and private.
It's an interesting idea that people would have a technology system to go to and find out about their privacy situation. And there were two of these systems, right? Could you identify what each one does for us, James?
James Rule: Sure. This is the one recommendation for reform in the book that would admittedly cost a lot of money, and I think it would be worth it.
And these paired systems would do two things. For every legal personal decision system, that is, a system that compiles and uses a lot of personal information to make discriminating decisions about how people are treated, one ought to be able to take a guided tour, electronically, so that every personal decision system would basically be an open book. One could rapidly learn, one, whether one's own information is part of that system; two, if so, how it would be used; and three, what the most sweeping consequences of that system could be on the lives of people. And then, in parallel with that system, one ought to be able to access one's own records in it, and certainly notice anything that's inconsistent with what we hope will be even stronger privacy laws in the future.
And also challenge the uses, and ultimately even the existence, of the system if it doesn't comport with salient public values having to do with personal information. In short, the two systems would serve to open up this rather underplayed and problematic area of personal life to public discussion and debate.
And we don't agree about these things. I don't want to suggest that shedding light on these practices would make it all smooth sailing. It would make things much more contentious and lively, and certainly uncomfortable for a lot of people: both people like us, who would find things in our records that we would never suspect, and, I hope, the people who keep the record systems and would then have to defend them publicly.
Justin Beals: Yeah. Of the two systems, the one that seems more accessible, in both the technology and the ability to implement it, is a national database of the companies operating technology, with some amount of information about what data they're storing. I can think of a lot of ways technologically to gather that information.
The other one is a little tougher: understanding what data is stored. But at the end of the day, we don't even have a national registry or an ability to catalog the businesses. We spend a lot of time cataloging people, but not the companies, the incorporations, the legal frameworks that own some of this data, or where they sit.
James Rule: Indeed, but this is how technology is used, inadvertently perhaps, as an excuse for things that it doesn't deserve to be accused of. Technology, in the broad sense that we use it, could very well illuminate areas of life, particularly involving personal information, in ways that would be liberating, or at least informative, for democracy.
And I don't see why technology is always seen as the perpetrator rather than the guardian of personal information.
Justin Beals: Yeah. I'm a little curious about some of the other reforms you're recommending. One thing I thought was really interesting, James, is that it's obvious that if we were to implement these reforms, some businesses would not be allowed in the United States anymore, right?
Which ones specifically? And I think it's okay to say that in some ways we should get to decide what types of businesses are allowed to operate in our communities; we do that anyway, in a lot of different ways. Which ones do you think, if we implemented good privacy reforms, we would see die out, or at least no longer impose their will upon us?
James Rule: Well, you're quite right. The book cites a number of activities that, in my estimation, should not go ahead for profit. But, and I tried to do this systematically, it also points out things that could be changed about how those businesses are done that would make them more acceptable, or at least reduce their risks and offensive characteristics.
One big business in the United States is the wholesale of personal information from people who have not given their consent. And if the marketers and advertising companies, the data brokerages, as they're called, had to pay for the information they're distributing for a price, they would certainly have to reconsider their economics. And if people had a mechanism for simply withdrawing from those markets and saying, I don't want my data to be marketed, that would be bad news for some very profitable companies. But on the other hand, you could say, well, everybody is the owner of his or her own data profile, and anyone who wants to sell it must enter the market and offer some kind of price acceptable to the people whose data are being presented.
And I think that would have a tonic effect on people's privacy consciousness. For one thing, for the first time, many of us would realize that we are the subject of this form of commerce, and some people will certainly say, well, thanks, but no thanks, I don't want to be involved.
And I would be more than willing, I would be happy, to see the owners, how shall we say, the authors of our own data have a decisive say as to whether it's to be marketed commercially. But the larger cultural repercussion of a right like that would be to make everybody much more conscious of something that we should be conscious of anyway: that our data is worth a lot of money. And sometimes that's acceptable; if it helps make political campaigns more vibrant, I'm willing to put up with a certain amount of nonsense in the mails or on the media. But we should at least choose. We should treat this, as the title of the book says, seriously, like other serious stuff in life.
Justin Beals: Absolutely. I always struggle with this, especially when I sign up for a system and try to drive through their policies and such.
There's no way of reading them, they're so obfuscated. And it's this concept of informed consent, right? If I'm going to interact with your system, I deserve as a consumer to understand what you're doing with that interaction, especially when there are aftereffects, second- and third-order actions on the information, that I might not comprehend.
James Rule: Well, of course. To speak personally, I, like everybody else, when confronted with a sheet of paper that says sign here and we'll get this process going for you, am putty in the hands of the people I'm confronting, and I don't read all the terms.
But when anyone signs a credit agreement in this country, and I know of no exceptions, when they say we're giving you credit for this account, this purchase, or this credit card, you sign something that says the company receiving the information has the right to continue to draw on future experience. In other words, your credit account, once open, flows in just that one direction: from your actions to the credit bureau.
Well, I don't buy it. I think we should always have the right to withdraw. Even if a notification goes out that this customer has stopped the flow of credit information to his or her credit bureau, I think I could accept that, and it would put all consumers in a much stronger position vis-à-vis all credit bureaus.
And in the scenario that I spin out at some length in the book about how this would work, there would then arise brokerages that would represent consumers in their relations with credit grantors. And the brokers would in effect, if the consumer desired it, drive the hardest bargain they could for the right to use people's data for, as you say, second-, third-, and nth-order usages. I think that would be a healthy tension.
I would much prefer to live in that world.
Justin Beals: Yeah, and at least we'd be informed about what we're selling, to whom, and for what. One of the things you were open to, and I want to take you up on it in this conversation, is chatting about a recent public event, a privacy issue that has unfolded.
And the one I'm thinking of here is the case of the Georgia prosecutor. I think Fani Willis was the name of the prosecutor in Atlanta, Georgia, where some very private information about her was revealed, and it impacted a really critical legal case.
James Rule: Yes, indeed. They don't get much more critical than that.
Justin Beals: Right.
James Rule: But in this case, I think I come down a bit on the other side, because the issue under debate seems to be whether the prosecutor's judgment was subject to conflicting considerations other than the substance of the case.
And I wouldn't say it could never be true, and I don't say it is true. But that seems to me one of the things that underlines the point that damaging or distressful personal information sometimes has a place in an open society.
Justin Beals: Yeah, I think it's a tough one. But what does seem to have happened in the process, while the information about her relationship, I think with a fellow lawyer, was of course made public, is that a lot of attention and caution was paid, and a lot of review. In a way, we went from a privacy issue to a public issue, and then there was an investigation around the alleged impropriety. That does seem like the kind of area where we need to know what's happening.
James Rule: Yes. Well, as I said before, it's not like establishing the core capacity of your computer or the mileage of your car. Different people with different values will have different takes on a case like that.
And I'm not sure where I stand, but in these high-stakes matters, where there's a trial with perhaps global consequences hanging in the balance, I think even extreme demands on privacy can be warranted. But in the world where most of us live, where we're worried about appeals for products we don't want, or demands on our time, like the phone calls at dinnertime and all this kind of thing, I think the stakes are lower, and my willingness to tolerate invasions of privacy is less.
Justin Beals: Yeah, I certainly don't enjoy it. Although I've worked in companies long enough, and we've tried to sell enough product, that we certainly feel required, expected in a way, to take every advantage we can in being successful.
This reminds me of my argument around healthcare. I don't like that, as an employer, I help decide what healthcare plan to buy; it's not our business. I want the community to decide what kind of healthcare they want, and then every company to be held to the same rules. Then it's a level playing field for me, and I can go out and compete. My struggle is that when there are no guidelines, ethics cannot be your friend in winning in a business scenario, and that's not good for anyone.
James Rule: Right, and think of all of the areas of life in which big and not very visible institutions make binding decisions about health or retirement or any number of other public matters. They know what chemicals will be used in the atmosphere. These are things that ought to get the broadest possible publicity.
And of course there are a lot of them, so we don't want to be citizens who do nothing but our homework as citizens; we want to get on with our lives. But take the recent stories about these figures, whatever they're called, in the drug industry who make decisions about what kinds of drugs will be approved for sale through companies to their insured employees, right? That sounds to me more like the Soviet Union in the '50s than like the country I want to be in.
Justin Beals: I mean, at some scale, right, you're a megalith; there are so many hundreds of thousands of people who rely on the decision.
James Rule: Selling a product for a few cents more when you're dealing with hundreds of millions of people is big money.
Justin Beals: That's right. Economies of scale; that's what they teach us in every single tech startup business class we sit in. James, absolutely.
James Rule: But the details of our lives are enormously valuable. And if you could figure out, and some people can, what makes the difference between choosing Google and Yahoo for your search engine, then you're on the road to great prosperity, I would say.
Justin Beals: Yeah, I think so. James, I want to thank you for your book. I really enjoyed reading it; I thought it was very pragmatic but also brought philosophy to the concept of privacy. So I'm just going to mention the book title here for our listeners: “Taking Privacy Seriously: How to Create the Rights We Need While We Still Have Something to Protect”.
And James, I'm really grateful for the conversation today.
James Rule: Could I just say one thing about the book?
Justin Beals: Absolutely. Yeah.
James Rule: Despite the fact that I'm a sociologist, the book is written to be clear. And it's written to be argumentative. So if you have an objection, it's right out there in the open. Write me a note. I'll thank you no matter what you say.
Justin Beals: Thanks, James. I have to agree with you. It was clear. I have read some privacy books where it felt like I needed a dictionary while reading. This was really clear on why we're suffering the pain of privacy issues, how that comes forward, how to start thinking about which reforms we need culturally, and what it should look like when enacted from a policy perspective.
I thought it was very helpful. Well, thanks, everyone, for joining us today. Thanks, James. And we'll see you all again in a week.
About our guest
James B. Rule is a sociologist and writer based in Berkeley, California. He is a Distinguished Affiliated Scholar at the Center for the Study of Law and Society, part of the UC Berkeley School of Law.
Rule is the author, co-author, or editor of nine books and monographs. His latest works include “Taking Privacy Seriously” (2024), “Theories of Civil Violence” (2024), “Global Privacy Protection: The First Generation”, co-edited with Graham Greenleaf (Edward Elgar, 2008), and “Privacy in Peril: How We Are Sacrificing a Fundamental Right in Exchange for Security and Convenience” (2007). His scholarly writings cover a wide range of topics, including the role of social inquiry in promoting social betterment, the causes of civil violence and militancy, and the progress and development of social science.
He continues to study and write about privacy and information, along with various other subjects for both scholarly and general audiences. Recent articles have addressed issues such as militarism in American foreign and domestic policy, banking regulation, Israeli expansion into the West Bank, and the theories of Cass Sunstein regarding the domestication of social movements.
Justin Beals is a serial entrepreneur with expertise in AI, cybersecurity, and governance who is passionate about making arcane cybersecurity standards plain and simple to achieve. He founded Strike Graph in 2020 to eliminate confusion surrounding cybersecurity audit and certification processes by offering an innovative, right-sized solution at a fraction of the time and cost of traditional methods.
Now, as Strike Graph CEO, Justin drives strategic innovation within the company. Based in Seattle, he previously served as the CTO of NextStep and Koru, which won the 2018 Most Impactful Startup award from Wharton People Analytics.
Justin is a board member for the Ada Developers Academy, VALID8 Financial, and Edify Software Consulting. He is the creator of the patented Training, Tracking & Placement System and the author of “Aligning curriculum and evidencing learning effectiveness using semantic mapping of learning assets,” which was published in the International Journal of Emerging Technologies in Learning (iJet). Justin earned a BA from Fort Lewis College.