Creating the dark web: How the TOR browser was invented
What software do radical techno-libertarians, the CIA, privacy advocates, the US State Department, and cybercriminals use every day? The Tor Browser.
In this compelling episode of SecureTalk, Justin Beals, the founder and CEO of Strike Graph, discusses the book ‘Tor: From the Dark Web to the Future of Privacy’ with its author, Ben Collier, a Lecturer in Digital Methods at the University of Edinburgh. The episode traces the early anonymity problems that the US military and libertarian-minded computer scientists were attempting to solve, and how the two groups formed a partnership to invent a solution that could provide global privacy at the dawn of the information age. Ben provides powerful insights into the motivations behind Tor's invention and the future of our connected world.
Secure Talk - Ben Collier
Justin Beals: Hello everyone, and welcome back to Secure Talk. Uh, really glad to have you joining us today. I'm Justin Beals, your host, and we have an exceptional guest with us. I'm very excited to chat with him. Ben Collier is a lecturer in digital methods at the University of Edinburgh. He holds a master's in chemistry, a master's in criminology and criminal justice, and a PhD in law.
Ben has recently written a book, which we're going to be talking about today, titled “Tor: From the Dark Web to the Future of Privacy”. Ben, thanks so much for joining us on SecureTalk. We really appreciate it.
Ben Collier: Great to be here, Justin. Thanks for having me.
Justin Beals: I really have enjoyed the book a lot, although I realize in reading it that I'm starting to get older, because instead of reading about things that happened before I was born, I'm starting to read books that talk about things that happened while I've been alive. But it is a brilliant dive, Ben.
I think you've done a really good job of pulling together the technology, the sociological aspects, and the larger global political landscape for the book. Were you inspired by TOR itself? Is it something you use deeply? How did you even get interested in the topic area?
Ben Collier: Yeah, absolutely. I think I really started getting interested in Tor in the early 2010s, around when I started becoming interested in criminology. I'd always had an interest in computers, and later in the sort of hacker ecosystem and various aspects of that, fresh from reading old William Gibson novels. And so I really wanted to do research on that, and that initially led me into finding out a bit more about Tor when I did my criminology master's degree.
Justin Beals: Yeah, what was your breakout Gibson novel for you?
Ben Collier: As everyone else, Neuromancer.
Justin Beals: Yeah, that's a good one.
Ben Collier: But I predicted that a little bit.
Justin Beals: Yeah, I quite like Snow Crash as well. Yes, it's really good. Yeah. Were you ever a software developer or engineer? Did you spend time playing with code in that way?
Ben Collier: Yeah. So never as my main job, but the first computer I worked with was an old ZX81 Spectrum in the mid-nineties, which I learned to code on as a small child, just in BASIC, mostly developing very bad computer games. I did a lot of playing around as a teenager, just generally online, and bits and pieces of code.
Again, making very bad, this time 3D, computer games. But then I learned quite a bit of coding in my chemistry degree; that was the sort of area I specialized in, some of the more digital methods aspects. And I weirdly learned quite a lot of programming working for the Scottish government. I worked there as a statistician for a number of years, which involved learning a lot of data science and some programming.
Justin Beals: You know, the statistician side is really interesting to me. It's hard to work in mathematics, chemistry, or otherwise without needing to learn how to program a computer, I believe. Yeah.
Ben Collier: Absolutely. Yeah, I really love statistics, which I realize not many people in the world would say. It's a really interesting way of looking at the world through numbers. Although, weirdly, I mostly do qualitative work, it's been very useful at various times to be able to go back and use the numbers side of things. Particularly if I want to speak to police officers or FBI agents or government people, being able to do sums often helps, because it gives you a bit of credibility; otherwise they might think, well, why would I want to speak to another ethnographer or interviewer, right?
Justin Beals: Yeah, I had a company I was leading as a chief technology officer, and we were using some data science tools to make some predictions. And I remember having this terrible time explaining what a probability was. For a lot of customers, even some of the fundamental concepts in statistics can be really hard to wrap your head around.
Ben Collier: Yes, which is difficult for me specifically, because that's a lot of the teaching I do. So I come across how difficult that is to communicate every day of my working life.
Justin Beals: Speaking of your teaching work now, I think that tech work broadly includes a huge variety of different types of creatures, people's interests, and skill sets.
How do you consider yourself today in that style of work as a teacher? What do you like to provide?
Ben Collier: Yeah, so I have a really fantastic job. I love lecturing in digital methods, particularly because I bring an informatics skill set to social science and a social science skill set to informatics. So a lot of the methods teaching involves speaking to people who have a long history as computer programmers, and what they need is for me to teach them how to do social science and investigate the social world. A lot of them are actually interested in security, so it might be online cybercrime subcultures or things like that, and how you use social science ideas and tools to study that. The other side of my job, the teaching side, is taking people who are really, really good at words and culture and social life and theory, and basically teaching them how to write Python scripts and do a bit of data science. That's phenomenal.
Justin Beals: You know, it is quite interesting. I once hired an engineer that knew quite a bit of R because they were a social scientist. And we were like, oh yeah, we need to do some data science work here in R, and you've worked on the social sciences side. And they had a great skill set for it.
Yeah. It was a lot of fun. Well, I think one of the things we've chatted about on SecureTalk in the past, and it was a little eye-opening for me, is the fact that oftentimes metadata is used as surveillance; the data that comes on top of the photo we take or the post on social media can be really surveilled.
And I think one of the lessons that I learned in reading your book is how Tor, fundamentally as a solution, is meant to anonymize that metadata.
Ben Collier: Yeah, it's really interesting. Tor is really unique in that it anonymizes metadata basically at the most fundamental level. The particular form of metadata that Tor mostly, though not solely, focuses on is the routing information, the signals your computer sends out, that allows your internet traffic to make its way across the internet.
And that is quite radical, because it's the equivalent of sending a postal message but having the post office not be able to see the to and from address. It's a very clever technical design that essentially manages to do that. And since the original design, it now protects you in a lot of other ways, particularly in the browser: it protects against a lot of the cookie tracking and other forms of metadata surveillance. But at its core, it's that really fundamental form of metadata, the basic way your traffic gets on the internet.
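Ben's postal analogy corresponds to the layered encryption at the heart of onion routing. As a rough sketch only (real Tor builds telescoping circuits with per-hop key negotiation and AES in counter mode; the toy XOR cipher and relay names below are purely illustrative):

```python
import os

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher, for illustration only; real Tor uses
    # AES in counter mode with keys negotiated per hop.
    keystream = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, keystream))

# Each relay in the circuit shares a symmetric key with the client.
relays = ["guard", "middle", "exit"]
keys = {name: os.urandom(16) for name in relays}

# The client wraps the message in one layer of encryption per relay:
# innermost layer for the exit, outermost for the guard.
message = b"GET /index.html"
onion = message
for name in reversed(relays):
    onion = xor_encrypt(keys[name], onion)

# Each relay peels exactly one layer as the cell travels the circuit.
for name in relays:
    onion = xor_encrypt(keys[name], onion)

assert onion == message
```

Because each relay can peel only its own layer, the first relay sees who you are but not where the message is going, and the last sees the destination but not the origin.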
Justin Beals: I worked in networking early in my career. We had logs for everything; we knew what bit got sent throughout the network. Sometimes the packet information almost reveals more about the conversation than the actual media.
Ben Collier: Yes, absolutely, that is definitely true. There are very particular types of websites where, if you're accessing, for example, an adult website at a particular time of day, or a particular newspaper, you can get a lot of that just from the communications metadata.
It's also particularly revealing for building out patterns and forms of group activity. So you can see, well, we've got this person, we know who they are, let's find out everyone they speak to, everyone they message, everyone in their local network. And for police this is, to them, very useful, because they can say, well, we've got one member of a drug gang, let's find out everyone else they know.
Right. But there are also other, more nefarious ways that people can use this to harm people or stop people participating in democracy, or whatever it might be.
Justin Beals: Yeah. One of the interesting social aspects of developing Tor was some of the early groups that were inspired to work together on it. I wonder if you'd help illuminate that story for us a little bit.
Ben Collier: So this is, I think, one of the most interesting bits, the bit I always used to use to sell the book to people, essentially, and to get people to read it. The birth of Tor is really odd.
So across the 1980s and early 1990s, you have all these sort of radical technology people who've got a real commitment to sort of libertarian, techno-utopian political values. And they're writing manifestos, they're building encryption technology, they're often taking technologies that were originally used by the military and making consumer versions of them, things like this. So they've got a real political commitment, but they're also quite scrappy, computer-hackery people, right? And they're often very anti-government. And in the mid-1990s, you have this group of researchers at the Naval Research Lab who decide to essentially build this early version of Tor.
The idea is called Onion Routing, and it still is. And they realize very early on that if you're going to build a big privacy system, like a big anonymizer, it can't just be the US military that uses it, because you could just say, well, okay, all the connections are going to the CIA or the military's magic anonymizing service, so we know whoever is using it is a member of the military or is an asset, for example.
So they say, well, we need the general public to use it. In every country around the world where we'd want to get an asset or a military user, you need a decent number of members of the general public using it for all sorts of different things, so that you can't tell anything about a person just from the fact that they're using this service, which becomes Tor.
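The cover-traffic argument Ben describes can be put in simple base-rate terms: if only military users connect, any observed connection is certainly military; as the general-public share grows, the same observation reveals almost nothing. A hypothetical back-of-the-envelope sketch (the user counts are invented for illustration):

```python
def p_military_given_observed(n_military: int, n_public: int) -> float:
    # If all users connect at similar rates, seeing "someone uses the
    # network" tells an observer only the base rate of each user class.
    return n_military / (n_military + n_public)

# A military-only network: any observed user is certainly military.
assert p_military_given_observed(1000, 0) == 1.0

# With a large general-public user base, the same observation
# reveals almost nothing about any individual user.
print(p_military_given_observed(1000, 999_000))  # 0.001
```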
And because these are essentially academics working for the military, they're going to conferences and events in the anonymity and privacy space. And it's through this kind of connection that the two groups come together, and the military researchers essentially say to this group, well, okay, so you want an infrastructure, a set of technologies, that makes the internet a private space for everyone in the world.
And we want everyone in the world to use our privacy infrastructure to give us cover traffic.
And so actually, we've got more in common, really, than you might think. And this all culminates, they start working together, and then they meet up for something called the Onion Dinner in Oakland, where they all get together and make an entire meal of onion-based food and chat about privacy and what you'd actually need in a private system to get general use and adoption.
And yes, it's a fun little, very odd story of some of the most radical and, shall we say, some of the most establishment people you can imagine working together.
Justin Beals: The anonymization strategy is really intriguing to me. There are lots of data sets where we've had to look at it and think, how can we essentially anonymize this data set from a privacy standpoint? And one of the approaches we don't think of very often is: let's fill it with a bunch of junk data so that no one can tell the difference, which is essentially one of the fundamental strategies here, right? I think there was a lot of discussion in the early groups that we've got to get people using it so that the things the government wants to hide are hidden.
Ben Collier: Yeah. And this is a really interesting aspect of it, I think, because actually it took a very long time. It's a really weird way of doing cryptography. Usually when you're building a cryptographic system, you want to make it as secure as possible and anything you can do to make it more secure, any design aspects you can do to make it more secure, you would usually switch those on.
You'd usually choose to make it more secure. But in the Tor design, there are actually lots of ways that you could make it more secure that they choose not to use, because it would make the system slower and harder to use. And in the Tor system, the easier it is to use, the more users you get.
So the network gets bigger and bigger, your cloud of cover traffic gets bigger and bigger, and so you become more anonymous. And what's interesting is that it actually took until about 2009 before they figured out a way of describing this mathematically, in a way that cryptographers would accept.
It took a long time before they could prove to cryptographers that, yes, this was a good way of doing privacy.
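The formalization Ben alludes to isn't named in the conversation, but the standard way the anonymity literature expresses "more users means more anonymity" is to measure the entropy of an attacker's probability distribution over possible senders. A minimal sketch of that metric:

```python
import math

def anonymity_bits(probabilities):
    # Shannon entropy of the attacker's distribution over candidate
    # senders: higher entropy means a larger effective anonymity set.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Ten equally plausible users: about 3.32 bits of anonymity.
print(anonymity_bits([0.1] * 10))

# A million equally plausible users: about 19.93 bits.
print(anonymity_bits([1e-6] * 1_000_000))
```

A perfectly identified sender is zero bits; every additional doubling of the (equally plausible) user base adds one bit, which is the quantitative version of "the bigger the crowd, the better the cover."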
Justin Beals: One of the other stories you tell in the book is, of course, the government researchers trying to get funding for this particular project. And of course, they didn't have a mathematical proof of why it needed to happen, and so they described this social story about the Pentagon Pizza Channel. Can you tell us a little bit about this? Was this real life or metaphor, I guess?
Ben Collier: No one really knows. So this was a story that was going around on kind of late night talk shows at the time. There is probably a bit of truth to it.
And actually what's really interesting is that a real-life example of this has happened in the last couple of months. So the Pentagon Pizza Channel is basically this idea that before the U.S.-led invasion of Iraq in the first Gulf War, you've got all these journalists who are trying to figure out when the boots are actually going to go on the ground.
And so what one enterprising journalist, so the story goes, decides to do is not to use their sources, because no one's talking, but instead to park outside the nearest Pizza Hut to the Pentagon and just wait. And then one day, a hundred pizza vans leave the Pizza Hut, hundreds of pizzas going straight to the Pentagon, and so they know, ah, right, okay.
Essentially, everyone there is having a late night, and so we know for a fact this is when it's going to happen. This is what's called a side channel in security research: something that's not the thing itself, but another thing that gives you a bit of information you can use to crack it, essentially.
So this actually happened recently. Some people were talking about this on Twitter: Google tells you when a business is busier than usual, and people noticed a correlation between some U.S. military activity and the nearest pizza place to the Pentagon suddenly being way, way busier than it usually was.
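The "busier than usual" signal is a textbook side channel: the observer never sees the secret event itself, only a correlated public measurement. A toy anomaly detector over hypothetical nightly pizza order counts (a crude z-score test, not anything Google actually exposes):

```python
from statistics import mean, stdev

def unusually_busy(history, tonight, z_threshold=3.0):
    # Flag tonight's order count if it sits far outside the
    # historical distribution (a simple z-score anomaly test).
    mu, sigma = mean(history), stdev(history)
    return (tonight - mu) / sigma > z_threshold

# Typical late-night orders near the hypothetical office...
baseline = [12, 9, 14, 11, 10, 13, 12, 11]

# ...versus the night a hundred pizzas head out the door.
print(unusually_busy(baseline, 100))  # True
print(unusually_busy(baseline, 13))   # False
```

The detector never touches the classified planning itself; it infers the event entirely from the correlated side signal, which is exactly why the NRL researchers argued that plugging individual leaks one at a time is hopeless.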
So why this is related to Tor is that the good people at the NRL went to the funders internally and said: well, look, this is a good example of a side channel. Now, no one can do this yet, but imagine a distant future where you can order pizza on the internet. Suddenly we'll have all these different side channels, loads of different aspects of digital life, and it will be impossible to plug up all these holes.
We'll be leaking information in all these different ways. And so what we need to do is make the core infrastructure of the internet as secure as possible and find ways to stop this additional metadata leaking out. And that will allow us to use it more securely for military communications.
Justin Beals: Yeah. In our work, there are oftentimes a lot of questions about privacy.
You know, we help companies get through security and the right scope of security, but privacy is oftentimes part of the conversation, whether it's regulatory issues or cultural expectations. I found that your book had a discussion about privacy that I thought was really interesting, and one of the concepts that you illuminate is that privacy only happens in context, and depending on the context, we have wildly different expectations of privacy.
Could you help us unpack that a little bit: how do we shift our concepts of privacy depending on how we're operating?
Ben Collier: Absolutely. So the classic example of this is privacy in the home versus privacy out in the world. The things we share within our immediate family, and the situations we're likely to see each other in, are radically different to when you're out in public space, interacting with the world.
Things that you say in a public space you wouldn't expect to be private; we have really different expectations of those compared to the things we say in a private space, where the spread of that information is much more controlled.
There's a really famous privacy researcher called Helen Nissenbaum who's written a lot about this in her book Privacy in Context, and it actually applies to lots and lots of different situations in life. It's not just these two things, the private home and the public realm.
Actually, we have lots and lots of different contexts in our lives where we really expect different things. So if you're on an operating table in surgery, for example, the people in that room will probably be able to see you not wearing any clothes, and they'll know lots and lots of really intimate detail about your private life. And that's fine, we accept that, because we're in the context of the operating theatre. Now, if I were to take all that information, the video of you in surgery and all your medical data, and give it to the person that runs your local 7-Eleven, you'd be pretty angry, right? Because it's a different context. So what Nissenbaum talks about is that when we're studying privacy as sociologists and trying to understand what different types of privacy we expect, a lot of that is about looking at the context the information has been generated in and figuring out what the cultural expectations are around each different context.
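Nissenbaum's contextual integrity can be read almost as an access-control model: an information flow is appropriate only if it matches the norms of the context the information was generated in. A deliberately simplified sketch, with invented contexts and roles:

```python
# Map each context to the roles allowed to receive information
# generated within it. These norms are illustrative, not exhaustive.
CONTEXT_NORMS = {
    "operating_theatre": {"surgeon", "nurse", "anaesthetist"},
    "family_home": {"family_member"},
    "public_square": {"anyone"},
}

def flow_is_appropriate(context: str, recipient_role: str) -> bool:
    # A flow is appropriate if the recipient's role belongs to the
    # norms of the context the information came from.
    allowed = CONTEXT_NORMS.get(context, set())
    return "anyone" in allowed or recipient_role in allowed

# Medical data staying inside the operating theatre is fine...
print(flow_is_appropriate("operating_theatre", "surgeon"))     # True

# ...but the same data reaching a shopkeeper violates the context.
print(flow_is_appropriate("operating_theatre", "shopkeeper"))  # False
```

The point of the model is that the data itself never changes; only the (context, recipient) pair determines whether the flow feels like a violation.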
Justin Beals: I mean, this was illuminating for me, because I tend to think about privacy in such black and white terms instead of asking which context I'm moving in, right? But it's very true, we do that naturally. I have different conversations professionally than with my family, and that can really change what's appropriate to talk about or share. Yeah.
Ben Collier: Yeah. And it's also about what privacy practically means as well, because that varies by context too; a good example here is even just what data means. If you're in a medical situation, for example, there are lots of different people who will understand the data that are generated about you in really quite different ways.
So the doctor who's coding up your medical records will understand you in a very different way to the surgeon who's operating on you, and they'll understand you very differently to the janitor who's maybe cleaning that room, and the nurses, and so on. So you have to understand each of those contexts quite differently when you're deciding whether or not to share information and whether it's appropriate to do so.
Justin Beals: And when you approach it this way, this understanding of privacy, I think what I came to realize is that digital data, once it's set, as long as it's maintained, stays the same, right? But as it changes context, so if we take data from a professional situation and it moves into a public situation, nothing changes about the data itself. That's what's new about the digital age, in a way, I feel.
Ben Collier: Absolutely. And I think it really changes how you design and maintain your IT systems as well, how you design and maintain systems of information. Because when we design systems, we're trying to create an idea of the user and what they expect.
There's obviously legal compliance. But we're also trying to think about how people are going to use this system, and they might use it in totally different ways that you would never have expected at all. And then stuff starts to get quite tricky.
Justin Beals: I think you're exactly right. And as the book states, we've got these two teams, the researchers and the cypherpunks, a name I hadn't heard in a while.
Thanks for bringing it back. It was really fun. I'm going to quote the book here: Tor "demarcates a separate space in which government and corporate control and power, whether exercised through influence, coercion, surveillance or censorship, are removed from our private online lives." It's like Tor is trying to define a privacy space, a context, in which we've got an adversarial situation, right? Yeah.
Ben Collier: Yeah, yeah. So, this is something that I think is often slightly misrepresented about the Internet. And I completely agree with you. I think that's exactly right. I think when we think about the Internet, we often think about, well, you know, why shouldn't the government have access to all this data, or whatever it might be?
And I think we don't often recognize that it's a huge transformation from what they would have had beforehand. Before the internet, part of the inspiration for one of the big developers of Tor, Roger Dingledine, was an essay written by the security academic Professor Ross Anderson, a huge figure in the privacy space more generally, who passed away recently, unfortunately. He wrote about this idea called the Eternity Service. And he said, well, the internet has created a new type of publishing and social space and method of communication that's very, very dangerous, because the way it's designed means it's very, very easy for government to surveil just about every aspect of your life, things that they would have never had access to before: the conversations you have over coffee with your friends, your innermost private feelings and thoughts, your political ideas, things like this.
It gives them a huge amount of new power they would never have had before. And so what it essentially says is that you need things like what will become Tor to reclaim a bit of that privacy that the design of the internet loses.
Justin Beals: Yeah, I remember the days when it felt very anonymous, right? Like, you could build whatever random website you wanted. It had very few ties other than maybe the network logs for how you got to and from the system. And certainly, I think there's a lot of corporate interest in knowing who is doing what, if quite simply for targeted advertising. But it's obviously used in other ways to change our perspective.
Ben Collier: Absolutely. So there's actually another stream of research that I'm doing quite a lot of work around at the moment, which involves how governments are starting to use targeted advertising. And this is more generally the case, as you say: when you get a new technology, initially no one's using it, and it feels very exciting, full of possibilities, but then as more and more people start building on it, some of the more negative aspects come out.
So now we're increasingly seeing governments start to use targeted advertising for behavior change campaigns in public policy, whether that's preventing crime, encouraging health behaviors, all this sort of stuff. So that's another bit of research I do, about how governments are starting to take up these private sector targeted advertising tools themselves.
Justin Beals: You know, broadly, one theme I felt throughout the story of Tor and the book, and I think it's something we discussed a lot, is that these tools don't necessarily have an inherent moral value to them. We use them for different outcomes, right?
Ben Collier: Absolutely. So I think Tor is very interesting, because you would immediately think, looking at Tor, that it must have some central political idea, because it's such a well-known technology. The people that like Tor seem to have very strong political views about it, and you'd think it would have a defined political idea at its heart. It really doesn't. When I started speaking to the people that develop Tor, the people that maintain Tor, its original designers, the people in the wider community, it's just this incredible diversity of different politics. You can see some trends and big groupings, but I spoke to everyone from really far-right people to, really, the whole political spectrum, basically.
So, yeah, it's very interesting the ways that people can find different politics in the same technology.
Justin Beals: Yeah, I liked another part of the book where you discuss the privacy worlds, as contexts in a way, and I thought it also worked alongside the growth of Tor and the types of feature sets that were being put into it. Can you help us break down privacy as a structure, privacy as a service, and privacy as a struggle?
Ben Collier: Yeah, absolutely. So with the privacy worlds idea, I don't go too much into the super dense social theory in the book, by design, because it would be unreadable, and I wanted to write a book that people would want to read.
Justin Beals: I loved it, man. I obviously hit the tip of the iceberg, so I haven't read enough about it, but. It's funny because I saw the technology architectures in each of these worlds and that was very intriguing, yeah.
Ben Collier: So what I was really interested in, a lot of the work that has been written about TOR looks at the different worlds of TOR's users.
So that might be people that use it for journalism, people that see it as a drug dealing tool, people that use it for everyday privacy. But because I was writing about the technology itself, I was really interested in the worlds of the people that actually build it, the people that actually make it work day to day.
And so you can see a chronology of these different perspectives that emerge over time, and they tend to be associated with different parts of the job of making Tor actually work. So initially, you see this very strong world that I call the engineer world, and that comes from a fusion of the ideas of the cypherpunks, this quite radical view of privacy, with the military engineers, who have an essentially strategic view of privacy. They see privacy in terms of structures of power in the internet that you can engineer in or engineer out, which is a very military kind of view. And that's where I got the idea that, essentially, they see privacy as a structure. When they think about privacy, they think about the structures of information and control in digital networks, and they see creating privacy as being about using clever engineering tricks and design tricks to build technologies that remove those concentrations of power, so that it's a much flatter ecosystem of power. So, for them, privacy is really about engineers coming in and making big interventions in power and online space.
But when you build a technology and infrastructure like Tor, it's not like a rocket launcher that you can just give to someone and they use it, right? It's actually an infrastructure, so you need people to maintain it. You need people to maintain the Tor network, and that's actually quite a lot more people than the engineers: thousands of volunteers all around the world who run Tor nodes and maintain and administer the network. What's interesting about them is they tend to come from the hacker scene, places like the Chaos Communication Congress, and as they start using Tor, seeing other people use Tor and use their nodes, they get a very different understanding of what privacy is. For them, the work of building a privacy technology isn't about these big engineering decisions that radically alter the structure of power online. For them, it's privacy as a service. They want to provide privacy as a service for the users of the network, and that means they are essentially service providers.
So they start to get quite queasy about telling the users what they should be doing. They say, well, we shouldn't take a political view on the network; we shouldn't be saying we're engaging in some big radical political movement. Instead, we should be as neutral as possible, we should just run the infrastructure, and the politics should all come from the users, whoever wants to do that.
Now, those two groups rub along pretty well, because, you know, you need to maintain the network and you need to design the network. The final group is quite interesting, because actually maintaining a network, running a network, engineering bugs out of it, continuing to develop it, requires money.
And building stuff that's sort of grey and neutral, that doesn't have any desired users, where you don't care who uses it, is not a really good way of making money. It's not a good way of getting a project funded, because when you go to funders, you want to say, well, this is what it's going to be used for.
These are the amazing things we're going to do with this, right? And so Tor, quite early on in its history, in about 2006-7, really engages with the digital privacy, digital rights ecosystem. Now, these people tend not to be technologists. They tend to be lawyers, people involved in political lobbying, activists. And so then you get the activist world, and they have a very different understanding of what privacy is: they see privacy as a human right. So, as I call it, privacy as a struggle. Mostly because I wanted three words that all start with the letter S.
Justin Beals: It writes really well, Ben. Yeah.
Ben Collier: I thought I was really lucky that there were relevant words. But yeah, this third group sees privacy as a human right, and more than that, they say it is a political struggle. What they essentially say is, well, privacy does have a politics. We want some groups to use this technology, and we want other groups, which tend to be criminal groups, not to use it. And we want to say that this is about a Western value of liberal democracy and political rights; people should be using this for journalism, for freedom of speech, and we want to promote that, and we want to link up with other social movements that are going on as well. So, if privacy is going to be a social movement, it needs to be linked to the feminist social movement, and it needs to link to civil rights in other places as well.
And this fits very well with the funders, which tend to be NGOs, the US government, etc., who want to promote digital democracy, who want to use the internet and social media to encourage liberalization and democratization around the world. Again, not an uncontroversial idea, but this sets up a bit of a conflict, which I talk about in the book, between the people in the organization who run the network, who often want it to be a bit more neutral, a bit more like just a big open field where whoever wants to use it can use it, and we shouldn't be telling people what to do.
And on the other hand, a group that really wants to articulate a big case for TOR having a set of values. And it's interesting because you have one group that's got the money and one group that's got the kind of infrastructure and so it kind of, yeah, it comes together in an interesting way.
Justin Beals: And each of these areas has a technological underpinning, right? Like, we could look at privacy as structure as being the architecture and the browser itself, privacy as service as the points of presence in a network, right? People are literally operating software routers on top of hardware routers across the network. And then those people that are trying to push content and material over that network.
Ben Collier: A hundred percent. And what I've really tried to do across the book, when I speak about each of these worlds, is to anchor it in a particular aspect of TOR as a complex series of technologies, but also a particular aspect of technical work. It's also interesting that the activists do technical work too, in that when you're building something like TOR, and TOR increasingly understands this across the course of its history, when you're building a technology for everyone, you're basically building it for no one. And so the activists are the people who are actually going out and speaking to particular groups of users, finding out what they want from the technology, finding out the features they want, finding out how they use it, because TOR has a kind of Achilles heel here.
Most web browsers, most IT products, collect information on you so that the designers can figure out how to make them better. TOR doesn't do that, for pretty obvious reasons. So actually the activists are one of the main ways TOR understands what people want from the tech. That is a form of quite interesting technical work in its own right, that kind of almost market development piece.
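The "software routers on top of hardware routers" picture rests on onion routing's core mechanic: the client wraps each message in one layer of encryption per relay, and each relay peels exactly one layer, so no single relay sees both the sender and the content. A toy sketch of that layering (XOR with a per-relay key stands in for real cryptography; the function names are invented for illustration, not from Tor's codebase):

```python
# Toy onion routing: the client adds one "encryption" layer per relay,
# and each relay in sequence strips exactly one layer. XOR with a
# per-relay key is a stand-in for real layered cryptography.

def xor_layer(data: bytes, key: int) -> bytes:
    """Apply or remove one layer (XOR is its own inverse)."""
    return bytes(b ^ key for b in data)

def build_onion(message: bytes, relay_keys: list[int]) -> bytes:
    """Client wraps the message: last relay's layer goes on first,
    so the first relay's layer ends up outermost."""
    onion = message
    for key in reversed(relay_keys):
        onion = xor_layer(onion, key)
    return onion

def route(onion: bytes, relay_keys: list[int]) -> bytes:
    """Each relay, in path order, peels exactly one layer."""
    for key in relay_keys:
        onion = xor_layer(onion, key)
    return onion

keys = [0x1F, 0x2A, 0x3C]            # guard, middle, exit (toy keys)
wrapped = build_onion(b"hello", keys)
assert wrapped != b"hello"           # fully wrapped message is obscured
assert route(wrapped, keys) == b"hello"
```

In real Tor, each layer uses keys negotiated separately with each relay, so only the exit sees the plaintext and only the guard sees the client's address; the XOR here is purely to show the wrap-and-peel shape.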
Justin Beals: Yeah. Of course we have to talk about the dark web a little bit. But the early designers of the system were aware that these types of tools could be used for criminal activity. It must have been a difficult discussion. Yeah.
Ben Collier: I think it was. One of the funniest emails I saw was really early on, in 1997.
Someone says, well, what about people using the technology for crime? Like, surely that's going to be an issue. And the response they get is: that's a political issue, and as soon as we start dealing with political issues, the whole project is going to fall to pieces. I would say that they're definitely not in favor of people using the network for crime, and they've certainly seen it as a bigger and bigger problem, often for image, as time has gone on. Obviously they don't want people using the network for crime, but additionally, the more people use TOR for crime, the more it becomes associated with crime, and that can put off other people from using it. But actually, part of the design of TOR is that people will use it for everything, including criminal activities, just all the different sorts of activity that go on online.
The more of that that moves onto TOR, the better, because it gives you more and more cover traffic. It makes the network better, it contributes resources to the network, and it actually improves the privacy you get from TOR. One of the sort of ultimate goals, I think, of a lot of people in TOR is that it stops being a separate technology of its own and ends up being more a set of protocols, or just an aspect of how the internet is designed, in the same way that, you know, encryption in the 1990s was this super controversial technology, but now HTTPS is just a fundamental aspect of the internet.
Justin Beals: Yeah. And, you know, it comes back to this: the tool has no moral value, right? At the end of the day, as a matter of fact, we might design a tool that's trying to live inside the cultural changes in how we think about our own moral values, which we've seen shifts in since Tor was first designed.
Some things that used to not be okay are acceptable, you know, in certain cultures. Yeah.
Ben Collier: Yeah, indeed. And I think it's one of the sort of paradoxes of developing internet technologies, because they're, almost by definition, global technologies. So they get into all these different cultural environments.
They're adapted by different users in different ways. And it's one of the really nice things about actually doing research on technology, because a lot of how we used to think technology worked was: you have this set of morals, you build those morals and values into a technology, and it goes out and sort of embeds those in the world.
But actually, when you see how people really use tech, it's in so many weird and wonderful different ways. That's not to say that how you design the technology doesn't have any effect; it's obviously important. It sets the kind of playing field for what you can do with it. But the users will always find weird and wonderful and interesting ways of hacking things together. And, yeah, I think that's quite an optimistic way of looking at tech.
Justin Beals: Of course, we have a lot of security professionals that we count as listeners. And one of the things they deal with all the time is threat modeling, you know, in their own security work. And I was wondering if you'd tell us a little bit about the threat modeling that the Tor team did early on in developing their solution. Sure.
Ben Collier: Absolutely. So TOR, as Onion Routing, as it's initially designed, has this quite simple threat model, which has basically two types of user. It's essentially a user model where you say, okay, we're assuming people are spying on user traffic and using that to identify them.
And you've got two types of users. User one, whose threat is the government: you're in real trouble. You're a high-security user; any information that gets out about you could really be quite identifying, could put you in a lot of danger. You're probably not doing the sort of stuff you're doing all the time.
It's these really highly sensitive, rare types of interaction, and those are very weird and strange and idiosyncratic. So, you know, you can see them a mile away when you look at network traffic, and you're in a lot of trouble. The other sort of TOR user is an everyday user. They use TOR basically to provide everyday privacy, because they maybe don't like the government very much, or they like looking at slightly odd things online.
And so they're just using it day in, day out for everything. So you've got these two different types of threats, well, user models, essentially. Unfortunately, that doesn't give you a lot of information on how to design your system. So when the design process for TOR really kicks off, the implementation process, when they figure, let's build a real system that works in the real world, they're trying to think about what different types of adversary might be attacking the Tor network. Largely because they're trying to figure out whether they should add padding. Padding traffic is basically flooding the network with lots and lots of fake cells of data, so that if you are spying on the network, it's really, really hard to trace people through it. So they come up with essentially three different types of adversaries, and they see if they can beat them, conceptually, using padding traffic.
The first one is the global active adversary. This is someone who can see all traffic on the internet. They have a completely global view of the internet, and they can also delay signals, so they can hold up signals at an exchange and add their own timing signatures to traffic. You cannot beat that adversary.
There's nothing that can beat that adversary and remain a usable system. So even if TOR did mixing, where it introduced lots of random delays, or if they added padding, that doesn't really get you much against the global active adversary, because the people you're trying to protect are so odd, and their behaviour is so strange and distinctive, that it's very hard to protect them against the global active adversary.
The second type, and this is the trickiest adversary for TOR, is the global passive adversary. This is essentially the NSA, or at least the NSA as it looked in some of the Snowden documents. This is an adversary that can see the whole internet but can't necessarily do much of that timing-delay thing. That really is a bad one for TOR, because essentially, if you can see the whole internet, you can see packets travelling through the TOR network, so you can trace them from source to destination. It's quite expensive to do that. It requires quite a lot of money, time, and compute, and it's not always easy, partly because of the physical geography of the internet and the way that autonomous systems network together. It actually turns out to be quite hard to do this, but it's still a bit of a threat to TOR, and it's the one everyone always mentions, this kind of global adversary.
Yeah. They decide not to do anything about it. Well, not to do nothing about it, but not to worry too much for the purposes of adding things like padding traffic and mixing. Because they do a bunch of tests, they do a bunch of conceptual work, and they say, well, okay, if we try to beat this adversary, essentially every design we can think of makes TOR unusable for day-to-day traffic.
And so you lose all those everyday users. You're just left with the really high-security ones. And then you know they're using TOR, so you know you're interested in them. And they ask, well, okay, should we be deploying TOR at all? Is the global adversary realistic? And they essentially say, well, not really.
They say, if you're a global adversary, you can already be active. You've already got access to all these other attacks, including just going into someone's house and sort of beating them up, and all sorts of things. It's not really realistic to protect against that type of adversary. But additionally, in practice, most adversaries are not the NSA.
Most of the people you want to protect yourself against, well, you're not on the NSA's radar. And so they say, well, what can we do about a realistic adversary? Is it worth developing TOR? And they say, well, the realistic adversary is what they call the roving adversary. This is an adversary that has their own domestic view of the internet.
So you can beat them essentially just using the basic TOR design, where you hop between countries. They've probably got an international estate as well: machines they've compromised, machines they're surveilling, machines they hack. But that'll come and go. People will realize they've got a bug on their machine, or they'll switch it off, various things like this.
And so their picture of the internet will change, and, you know, they'll have to spend money to see more of the internet. And that means we can make their life a lot more expensive. This is really good for Tor, because it means you move from a position where you can spend a penny to de-anonymize someone to one where you have to spend £10,000 to de-anonymize someone.
And that takes most people out of the reach of security services, because most adversaries really aren't going to spend £10,000 per person. So it essentially makes mass surveillance much less effective and moves a lot of the energy towards targeted forms of surveillance, which don't really have anything to do with Tor in that kind of way.
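The economics Ben describes can be put in rough numbers with a standard back-of-envelope model from Tor analysis (not spelled out in the conversation, so treat the model and numbers as illustrative): because a circuit's entry and exit relays are chosen roughly independently, an adversary observing a fraction f of the network sees both ends of a given circuit with probability about f squared, so modest coverage yields very few end-to-end catches.

```python
# Back-of-envelope: an adversary watching a fraction f of relay
# bandwidth sees both the entry and the exit of a given circuit with
# probability ~ f**2, since the two ends are chosen roughly
# independently. Illustrative model only, not a precise Tor analysis.

def both_ends_probability(f: float) -> float:
    """Chance the adversary observes both ends of one circuit."""
    assert 0.0 <= f <= 1.0
    return f * f

# Watching 1% of the network catches roughly 0.01% of circuits
# end-to-end, so per-target surveillance stays expensive.
assert abs(both_ends_probability(0.01) - 1e-4) < 1e-12

# Doubling coverage quadruples the catch rate but also doubles the
# bill, which is the "make their life more expensive" dynamic.
assert both_ends_probability(0.02) / both_ends_probability(0.01) == 4.0
```

The quadratic shape is the point: an adversary has to buy a very large view of the network before end-to-end correlation becomes cheap per target, which is what pushes surveillance from mass collection toward targeted operations.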
So they essentially build this quite strange threat model, where all the high-security options you would expect for a system like TOR, they basically turn off. And they say, well, the best security we can give people is just encouraging so many people to use the network, by making it so easy to use, that you've got so much data it's very, very expensive and difficult to do the timing attacks.
Justin Beals: Yeah. Yeah. I thought the discussion around the global actor was really interesting, because I can only think of a singular global network owner back when it was, you know, the DARPA net, or the early days when I was using it on the library system with IP addresses. But once it became a commercial endeavor and we started internationalizing it, I think you do get that kind of domestic view for the spy agency.
When, from TOR's perspective, you're doing that threat modeling and working out who your adversarial actors are, it was a really interesting story. Yeah. So what do you think about the future of TOR? Are we in maintenance mode, or what are the challenges coming up for this team?
Ben Collier: This is really interesting. I think there's a lot going on.
So TOR has been through a lot, but it's definitely not just in maintenance mode. And I actually think it's poised to do some really, really interesting stuff in the next few years. There's a lot of technological innovation currently going on in TOR. TOR is still a really big site of the development of new technology, of innovation. In particular, something I'm particularly interested in is Arti, which is a re-implementation of TOR.
So TOR was developed in C, which is a memory-unsafe programming language, and they're putting a lot of effort into redeveloping it. I think we had a first release of Arti last year, which is written in Rust, a memory-safe language. And so one of the side effects of this is that Tor will become a lot more modular, modularizable.
Sorry, I'll find a different way of expressing that.
Justin Beals: No, we're going to find you a VC to pitch that for that. That's a great word.
Ben Collier: I think so. One of the side effects of this reimplementation in Rust is that it makes Tor much easier to cut up and put into lots and lots of different things. So it will make it much easier to embed aspects of Torification into a crypto wallet, into other forms of technology; it'll make it a lot more reusable.
And I think when that starts to hit the tech ecosystem, it'll be really interesting to see how people adapt the Tor technologies and incorporate them into their own businesses, into other technologies, particularly in the kind of crypto space. I think there's a lot of interesting stuff to come there.
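One concrete way applications already talk to Tor today, and can talk to Arti, is by speaking SOCKS5 to a locally running Tor client's proxy listener (port 9050 by default for the C Tor daemon). A sketch of the two client messages from RFC 1928 that an embedding application would send; only the byte construction is shown, and no sockets are opened:

```python
# SOCKS5 (RFC 1928) client messages an application sends to a local
# Tor SOCKS listener to open a connection through the Tor network.
# Byte construction only; no network code.

def socks5_greeting() -> bytes:
    """Version 5, offering one auth method: 0x00 (no authentication)."""
    return b"\x05\x01\x00"

def socks5_connect(host: str, port: int) -> bytes:
    """CONNECT request using address type 0x03 (domain name), which
    lets Tor resolve the name itself rather than leaking a local DNS
    lookup outside the tunnel."""
    name = host.encode("ascii")
    if len(name) > 255:
        raise ValueError("hostname too long for SOCKS5")
    return (b"\x05\x01\x00\x03"            # VER, CMD=CONNECT, RSV, ATYP
            + bytes([len(name)]) + name    # length-prefixed hostname
            + port.to_bytes(2, "big"))     # port, network byte order

msg = socks5_connect("example.com", 443)
assert msg[:4] == b"\x05\x01\x00\x03"
assert msg[4] == len("example.com")
assert msg[-2:] == (443).to_bytes(2, "big")
```

Passing the hostname (address type 0x03) instead of a pre-resolved IP is the important design choice here: it keeps DNS resolution inside the Tor circuit, which is exactly the kind of detail a reusable, embeddable Arti library can get right once for every application built on top of it.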
In terms of the kind of politics of TOR and how that's evolving, I wrote about this at the end of the book. I think there are a couple of interesting directions it could go. One of which is to push really far into the crypto ecosystem and get really into cryptocurrency projects and things like that.
I think that has become less likely over the last year. There are some people in the TOR community who are really big fans of cryptocurrency, but there was an experiment last year where a crypto network called ATOR started setting up their own TOR nodes and tried to reach out to the TOR project.
And TOR made the decision not to go down that route, because they didn't want the health of the TOR network to be pegged to a volatile currency. They wanted it to sit on a really solid base. And that makes a lot of sense, because, you know, if digital freedom activists and dissidents are using it, you don't want anything that could compromise that rock-solid base of the technology.
The other side of things is where new TOR users and new TOR funders come from. And we're really seeing this now. Iran, China, and Russia are all countries with quite large proportions of TOR users. And there's a really interesting technology within the wider TOR ecosystem called Snowflake that allows you to essentially set your computer or your phone up as a kind of bridge into the TOR network for people who live in countries where it's hard to access TOR.
So I think there are some interesting things to come for, like, the sorts of alliances that Tor will make, and its position in the global internet ecosystem in the future.
Justin Beals: Yeah, you know, one thing that definitely speaks to me is the concept of modularization of TOR itself, in that I think we see this in the open-source community broadly,
which I have taken huge advantage of in my own career, and attempted to be a good contributor to. It's almost like we're building our own building blocks, languages for the things we want to invent in the future. And it feeds the cycle, right? To be like, I can pull this package down today.
We do it in data science all the time. I don't need a PhD designing a custom algorithm as much as I need the right Python package, for the type of modelling I want to do, from the open-source community now.
Ben Collier: Yeah, absolutely. And I think there's a big bit of TOR that really does see itself in that way, as a kind of big open-source project that's got this core bit of tech it's developing, but loads of other stuff fountaining off it. So a lot of the browser protections that TOR has developed have been incorporated into other technologies: ways of stopping tracking, ways of stopping profiling, bits that have gone off into Signal, or bits of, you know, other tech. And I think one of the great things TOR has been very smart about is not just being a single thing, but being a much wider ecosystem of standards, other bits of technology, weird things like OONI. Yeah, I think it's quite an exciting and interesting space.
Justin Beals: Ben, I am super grateful for your book. I thoroughly enjoyed reading it, and I want to recommend that all of our listeners get a copy. Your book captures this technological innovation, and it certainly looks at security, and how security works, from a very different perspective. And finally, I love the social stories.
At the end of the day, these are all people trying to work together with different motivations. I thought it was incredibly well thought out, and you dive deep, and I appreciated that. It took me a little bit of work to get through the reading, which I liked. Ben, thanks for joining us today on Secure Talk.
We hope to have you back in the future as you're working on other projects. Please let us know.
Ben Collier: Brilliant. Thanks so much.
About our guest
Lecturer in Digital Methods in the Department of Science, Technology, and Innovation Studies at the School of Social and Political Science, University of Edinburgh.