I am Dr. Gabe Kaptchuk, a Research Assistant Professor in Computer Science and Center for Reliable Information Systems & Cyber Security Affiliate at Boston University. I earned my PhD in Computer Science from Johns Hopkins University in 2020. I have worked in industry, at Intel Labs, and in the policy sphere, in the United States Senate in the personal office of Sen. Ron Wyden. Now I'm focusing on privacy research to spread provably secure systems beyond the laboratory setting. As part of Cyber Security Awareness Month, ask me anything about:

  • What is data privacy?

  • On an individual level, what can I do to protect my data?

  • On a national level, what can the government and/or companies do to protect private data?

  • On a systemic level, what changes are needed to reclaim our data privacy?

  • What are the biggest cybersecurity threats right now?

  • How should we think about balancing privacy and accountability?

  • What is the relationship between cryptography, security, and privacy?

Proof: https://i.redd.it/us7nr4ykk4s71.jpg

Thank you everyone for asking questions -- this has been lots of fun! Unfortunately, I am not able to respond to every question, but I will plan to revisit the conversation later on! In the meantime, for more information about cybersecurity, cryptography and more, please follow me on Twitter @gkaptchuk.

Comments: 85 • Responses: 25  • Date: 

lonepairs9 karma

How can an undergrad interested in security reach out to professors for research besides cold emails?

kaptchuk14 karma

- Taking a course is the best way to get a foot in the door for doing research. Oftentimes faculty (myself included) will basically insist that you take their course first so that they have an idea of your strengths. This is a survival mechanism for faculty, who often have too much going on.

- Office Hours! Especially the week after a homework assignment/project is due, office hours tend to be empty. Faculty have usually already committed to using that time to talk to students, so it's the perfect time to start a conversation and get to know each other. Maybe look a little bit at what kind of work the faculty member does and read the introduction sections of a couple of papers. When you show up at office hours, starting a conversation about the kind of research they do -- even if you don't totally understand it -- will be easy! Faculty love talking about the stuff that they work on.

- Especially with larger research groups, building relationships with PhD students can be another helpful avenue in. TAs/TFs are often doing research and they probably also want to talk about it! As an undergrad you will often get paired with a PhD student anyhow to do research, so skipping the very busy middleperson and talking to the PhD students directly can be great.

- Seminars and reading groups. Many research groups already have speakers come in on a semi-regular basis. Send a faculty member an email asking where it's happening and if it's OK to show up. Then actually show up for a couple of talks in a row. Faculty will notice when new people are showing up, and you will get an idea of what kind of research the group is interested in.

Ambitious_Tooth_5598 karma

Hi Gabe, what are the biggest cybersecurity threats right now?

kaptchuk15 karma

It depends on the object of the sentence -- cybersecurity threats to whom? One answer might be something like ransomware that cripples critical infrastructure and companies. Alternatively, you could claim that the subversion of cryptographic algorithm standards by nation-states poses a huge threat to the security ecosystem that is very difficult to understand or spot.

My personal answer might be the collection and aggregation of incredible amounts of information about individuals by companies and governments. I personally think that this poses a bigger existential threat to real people and the communities that they are a part of than something like ransomware. Unfortunately, addressing this problem is much more complicated than "software engineers need to be better at their jobs."

TedaToubou7 karma

not really privacy related but i'll bite, would you rather fight 100...just kidding.

I recently heard Andrew Ng analogize (probably not a real word) computer languages to written language in our history, when priests were the only individuals versed in writing and the rest of the population would listen and absorb. The world changed once literacy reached the general population. He theorized that, similarly, we'll get to a point where everyone has a basic understanding of some computer language, and that will also create a large shift in society (maybe not now, but decades later) as technology becomes more and more infused into everyday life.

What are your thoughts on this as an academic professional?

kaptchuk11 karma

I think what would be more helpful is widespread understanding of data processing paradigms at a high level. Programming languages aren't really necessary for that. I don't think everyone needs to understand how pointers or recursion work, which are classics of programming 101. Instead, it would be great if folks could understand how technology works at a high-level, schematic level. For instance, when I type "reddit" into my Google search, what happens? (1) Some information about who I am and my query goes to Google, (2) Google uses everything it's learned about everyone to figure out what to send me back, and (3) Google sends me some data.
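To make that schematic concrete, here is a toy sketch in Python of those three steps. Everything in it (the field names, the stored profile, the single hard-coded result) is made up purely for illustration; it is not how any real search engine is implemented.

```python
# Toy model of the three steps above -- entirely made up, just to show the data flow.

def send_query(user_id, query):
    # Step 1: my query leaves my device bundled with information about who I am.
    return {"query": query, "user": user_id, "ip": "203.0.113.7", "device": "phone"}

def search_service(request, profiles):
    # Step 2: the service combines my request with everything it has already
    # learned about me (and everyone else) to decide what to send back.
    history = profiles.get(request["user"], [])
    return {"results": ["https://www.reddit.com"], "ranked_using": history}

profiles = {"gabe": ["searched 'reddit' 412 times", "location: Boston"]}
response = search_service(send_query("gabe", "reddit"), profiles)
print(response)  # Step 3: some data comes back to me.
```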

I think having this level of understanding would make a big difference when it comes to meaningful social commentary about the role of technology in our lives. It would mean that folks could start to imagine all the actors involved in surveillance capitalism. Understanding the problem would mean we could start working towards a society that is less extractive and harmful.

LadyPiaget5 karma

How should we think about balancing privacy and accountability?

kaptchuk7 karma

What's cool about crypto is that this doesn't even need to be a balancing act (at least from a technical perspective)!! The tools developed by the cryptographic research community allow for fine-grained control over information leakage. For example, other folks at Boston University have been working on deploying cryptography to compute the wealth gap in the city of Boston without requiring companies to ever lose control of their data. This does a great job balancing the privacy of individuals (e.g., not releasing their salaries) and city-wide accountability (e.g., documenting the problem is a critical first step to rectifying it).
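To give a flavor of how the math sidesteps the trade-off, here is a minimal sketch of additive secret sharing, one building block of secure multiparty computation. It is a toy illustration of the general idea, not the actual protocol used in the Boston study: each company splits its private number into random-looking shares, and only the aggregate is ever reconstructed.

```python
import secrets

MOD = 2**64  # arithmetic is done modulo a large number so individual shares look random

def share(value, n_parties):
    """Split a private value into n shares that sum to the value mod MOD."""
    shares = [secrets.randbelow(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

# Three companies, each with a private payroll figure they never reveal directly.
private_values = [90_000, 120_000, 75_000]
all_shares = [share(v, 3) for v in private_values]

# Each of the three computation parties receives one share from every company
# and publishes only the sum of the shares it holds.
partial_sums = [sum(company[i] for company in all_shares) % MOD for i in range(3)]

# Combining the published partial sums reveals the aggregate and nothing else.
aggregate = sum(partial_sums) % MOD
print(aggregate)  # 285000 -- the total, with no individual figure disclosed
```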

I think the key is asking "privacy for whom" and "accountability for whom." We have the technical means (thanks to crypto) to keep companies accountable without forcing them to reveal all their information to the public. In that sense there's no balancing act -- just adding more accountability doesn't detract from anyone's privacy. Additionally, I think we need both legal and technical means for guaranteeing individuals more privacy (e.g., there are some cool data fiduciary bills that have been introduced over the last few years).

In all, I would like to see us tilt towards privacy as a society. It's very possible that this course correction could go too far, but I think we are way too far in the "no privacy" direction right now. A change would be good.

sugarcoatedcat4 karma

When someone says "I don't have anything to hide" in response to privacy issues, what is the best response? The best I can do is ask them if they would want to live in a glass house.

kaptchuk8 karma

I was this kid in my first security/privacy course, I'm ashamed to say. I start out my own course by discussing this exact issue.

I usually take a three-pronged approach:

(1) You might not need privacy now, but you don't know if you will need it in the future. It's hard to get rid of information about you once it's out there. Something might happen in your future, and it's easy to prepare for that future now.

(2) Even if you as an individual don't need privacy, data privacy isn't always about you. We need to be building systems that protect the most vulnerable among us. You should care about data privacy and invest in systems that promote data privacy because that means that folks who desperately need the privacy can have it. Note that what these folks might have to hide isn't illicit -- it's just that systems of oppression mean that the value of personal information varies from person to person.

(3) Even if you don't care about the privacy of folks with marginalized identities, you should care about building a functioning society. Living in the panopticon, powered by a few companies and governments conducting massive amounts of digital surveillance, is bad for humanity as a whole. Promoting data privacy can be part of your effort to make society better.

sugarcoatedcat3 karma

What do you think is the long term effect of the rust programming language on cyber security, assuming it becomes mainstream?

kaptchuk4 karma

I think it is already becoming mainstream? But maybe I'm just surrounded by early adopters. I'm not great at programming languages, but my understanding is that Rust is an important part of how we actually achieve meaningful cyber security. We can do all the protocol development we want, but we need high-quality, secure software that actually realizes these protocols. Rust is the way to make that happen.

capivaral3 karma

How do cryptographers think about accountability when developing new tools and protocols? You can't really control who decides to use privacy-preserving tools and (I don't think?) you can prevent them from being used for harm or abuse. But given they can also be used for "good," how do you try to find a balance?

kaptchuk9 karma

This is a great and challenging question. Here are some quick thoughts.

  1. Have the application you imagine designed into the protocol from the get-go. When you write your definitions, don't be afraid to consider the social context in which your protocol could be used. Trying to cut out all the "politics and social considerations" from your protocol analysis will likely just mean that you have a protocol that risks causing harm.

  2. If you are designing a protocol with the intention of actually having it deployed, it's critical to actively engage with the community that you think will use it. Parachute crypto development will just yield protocols that don't match needs and get misused.

  3. For me, I try to always stay aware of the double-edged sword that is cryptographic protocol development. It's easy to fall into a narrative about the destructive nature of technology or to be absorbed by techno-saviorism. As a matter of practice, staying in the middle helps highlight the potential for abuse and harm.

  4. It's always possible for your math to get used for something you didn't intend it to be used for. Once it is out in the universe, you can't control it. But you can stay active in advocating for its use in positive ways. This political aspect of designing cryptographic protocols can't just be ignored. I think that producing a protocol -- even just as research -- means that you are accepting the role of continuing to comment on the development and use of that protocol.

OptxPrime3 karma

Are cryptography-related jobs becoming more and more common? I am an undergrad student with a passion for maths and CS (especially algorithms and data structures) and I would like to explore job opportunities. What kind of cryptography-related jobs are there in industry?

kaptchuk5 karma

There are tons and tons of security jobs everywhere, and an increasing number of them are cryptography related. This is a great time to be into this! I'll also just say that the future seems to be Rust, when it comes to jobs in crypto and security, so that might be a valuable thing to pick up in undergrad.

Implementing cryptographic protocols is tough. There are two flavors: (1) coming up with your own protocol, and (2) implementing a protocol designed by someone else. #1 is a serious, serious HAZMAT zone. Even folks with PhDs in cryptography get this wrong. If you are interested in designing protocols, grad school still seems like the way to go. #2 is more accessible without grad school. There are an increasing number of cryptography-related startups these days, doing things like secure multiparty computation, cryptocurrency-related protocols, etc. Oftentimes there will be a research team that designs the protocols, and they are in desperate need of strong software engineers to make their whiteboard nonsense real.

My understanding is that even the big tech companies are increasingly interested in different kinds of cryptography as privacy becomes a bigger part of the public consciousness. Differential Privacy has been deployed by most of the big tech companies and the US Census. Every internet connection you make (hopefully) has lots of crypto happening to keep the communication authentic and private. Infrastructure teams need crypto to make sure secrets don't leak out.
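For a feel of what differential privacy does, here is a minimal sketch of the classic Laplace mechanism applied to a counting query. This is a textbook-style illustration over toy data, not how any particular company or the Census Bureau actually implements it.

```python
import math
import random

def dp_count(records, predicate, epsilon):
    """Release a count with Laplace noise; the sensitivity of a counting query is 1."""
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) noise by inverse transform sampling.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

ages = [34, 29, 41, 52, 38, 27, 45]
# A noisy answer to "how many people are 40 or older?" -- close to the truth (3),
# but no single person's presence can be confidently inferred from the output.
print(dp_count(ages, lambda age: age >= 40, epsilon=0.5))
```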

If you understand crypto basics and have strong Rust programming skills, my understanding is that your future will be bright.

lacks_imagination3 karma

The little I know about cryptography is that much of it still hinges on the inability to factor large semiprimes in a quick way. Is this still true or are there other foundations for cryptography?

kaptchuk3 karma

The RSA encryption algorithm was one of the first cryptographic algorithms to make it big. The security of the RSA algorithm comes down to factoring large numbers. Since then, we have started building cryptography on all kinds of other assumptions. For instance, a very popular assumption is the discrete log assumption, which states that given some number g^x, it's hard to find x. This doesn't work if you are just dealing with real numbers, but in certain mathematical structures this assumption appears to hold up. More recently, the cryptographic community has been working a lot with new assumptions for which we don't know of any attacks powered by quantum computers (there are fast quantum factoring algorithms and quantum attacks on the discrete log assumption). For instance, one of the most popular ones is the "Learning with Errors" assumption (https://en.wikipedia.org/wiki/Learning_with_errors).
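To make the asymmetry concrete: the forward direction of the discrete log problem is fast even for enormous numbers, while the reverse direction has no known efficient classical algorithm in well-chosen groups. A toy sketch (using the Curve25519 field prime simply as a convenient large prime; real protocols choose their parameters much more carefully):

```python
# Forward direction: computing g^x mod p is fast thanks to square-and-multiply,
# which Python's built-in pow() implements.
p = 2**255 - 19          # a large known prime (the Curve25519 field prime)
g = 5
x = 123_456_789          # the secret exponent
y = pow(g, x, p)         # easy, even though the numbers are astronomically large

# Reverse direction: given only (g, p, y), recover x. The naive attack tries
# exponents one by one; for real parameters no known classical attack is feasible.
def brute_force_dlog(g, y, p, limit=10**6):
    acc = 1
    for candidate in range(limit):
        if acc == y:
            return candidate
        acc = (acc * g) % p
    return None  # gave up -- which is exactly the point

print(brute_force_dlog(g, y, p))  # None: a million guesses isn't anywhere near enough
```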

DCMcDonald3 karma

Hi Gabe! 👋 What are some simple steps people can take to secure their data online?

kaptchuk5 karma

- Password Managers are your friend! I got my mother, who is a non-technologist in her 60s, to use a password manager, so I fully believe that most people can make the switch.

- Doing a quick check on haveibeenpwned would also be valuable to see if your passwords have been leaked (see the sketch after this list).

- Check your privacy settings on your device! Think about what kind of information you are letting out there into the universe and consider if it's actually necessary. Does an app really always need access to your location, or only when the app is open?
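For the haveibeenpwned point above, here is a minimal sketch of checking a password against the Pwned Passwords range API, which uses k-anonymity so the full password (and even its full hash) never leaves your machine. Treat it as illustrative and check the current API documentation before relying on it.

```python
import hashlib
import urllib.request

def breach_count(password):
    """Return how many times a password appears in known breaches (0 if not found)."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    # Only the first 5 hex characters of the SHA-1 hash are sent to the service.
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    # The response lists hash suffixes and counts; the match happens locally.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0

print(breach_count("password123"))  # a depressingly large number
```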

RothkoRathbone3 karma

Are you seeing more women in computer science? Are the women treated equally?

kaptchuk6 karma

I'll just preface this by saying that I'm a man with a ton of privileges, so it can be difficult for me to see all the ways that women (and folks with other marginalized identities) are mistreated in computer science. I also sit inside the crypto/security research community, which almost certainly isn't representative of CS as a whole.

Personally, I've been excited to see more women in the computer science classrooms and research groups that I've been a part of. In particular, it's been awesome to see people who are not men excelling within my research communities. That said, the computer science community has a long way to go if it wants to shed its white tech-bro reputation. My understanding is that this is particularly true within industry (but again, that's a view from the outside). Smaller parts of the community are starting to adopt an equity mindset and see changes. This gives me some hope, but I certainly don't think we are there yet.

mgluck_232 karma

What are the main ways we're compromising our data privacy on a day-to-day basis without realizing?

kaptchuk4 karma

Check your phone's privacy settings!! The way lots of apps make money is to collect data on you and sell it to aggregators. If you want to feel afraid, take a look at this article from the NYTimes a couple years ago: https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html

I regularly go through and look at what in my phone is gathering my location and lock that shit down!

Also if you have the stomach for it, I like to run NoScript on my browser because the internet is dark and full of terrors.

FamousMoose15962 karma

How do you think higher education does at teaching students about data privacy? How would you change the way people teach and learn about data privacy in schools?

kaptchuk4 karma

Right now, I think there is too big of a divide between the teaching that higher education does about (1) the social implications of not having data privacy, and (2) the technical means we have to prevent data from getting out in the world. Computer science is lacking on (1), and virtually no one outside CS is teaching (2). Technologists need to be socially literate, or we are totally screwed. Also, non-technologists will be able to make better policies if they have a basic working understanding of technology paradigms.

I spend a lot of time talking about social implications in my Network Security course for exactly this reason. Hopefully all my students now feel prepared both to understand important protocols like TLS/HTTPS and also why these technologies are important. Hopefully...

Cypher9752 karma

Hi Gabe, I'm doing my master's in cyber security and I find the cryptography subject hard to grasp and understand. Are there any good online resources (YouTube channels, websites, etc.) available that a beginner can use to learn about the subject of cryptography? I am not particularly good at mathematics; maybe that's why I find it hard.

kaptchuk6 karma

Cryptography can be difficult, especially if math isn't your strength. To be fair, a lot of this is because the field was designed by mathematicians and was only taught to mathematicians for a long time. I've heard Dan Boneh's online crypto course is great and takes a high-utility approach -- that is to say, teaching the useful stuff and not only the theory. There are lots of other open-access textbooks out there, like The Joy of Cryptography https://web.engr.oregonstate.edu/~rosulekm/crypto/, but these are only helpful if you are good at learning from a textbook.

Part of the problem with cryptography is that there are lots of layers of abstraction -- you can spend lots of time doing pure number theory or you can learn about the algorithms that are used in practice like AES and SHA256. That makes it hard to figure out where to start.

My advice: when you come across a new primitive, look at the definition and figure out **why** it is a good definition. Once you understand why the definition works, you can start playing around with constructions or looking at how the primitive is used in larger protocols. Getting a feel for the definition is probably more important than getting all the details right. At least, this is how my brain works.
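As one concrete example of "playing with a definition" (my own toy framing, not from any particular textbook): the indistinguishability game behind most encryption definitions can be simulated in a few lines. The adversary picks two messages, the challenger encrypts one at random with a one-time pad, and the adversary tries to guess which. Running strategies through it and watching them all hover around 50% builds a feel for why the definition works.

```python
import secrets

def otp_encrypt(key, message):
    # One-time pad: XOR the message with a fresh random key of the same length.
    return bytes(k ^ m for k, m in zip(key, message))

def ind_game(adversary_guess, trials=10_000):
    """Toy indistinguishability experiment for the one-time pad on 4-byte messages."""
    wins = 0
    m0, m1 = b"\x00" * 4, b"\xff" * 4          # adversary's two chosen messages
    for _ in range(trials):
        b = secrets.randbelow(2)               # challenger flips a secret coin
        key = secrets.token_bytes(4)           # fresh key every round
        ciphertext = otp_encrypt(key, (m0, m1)[b])
        if adversary_guess(ciphertext) == b:
            wins += 1
    return wins / trials

# Any strategy -- always guessing 0, or staring at the ciphertext bytes -- wins ~50%.
print(ind_game(lambda ct: 0))
print(ind_game(lambda ct: ct[0] & 1))
```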

happiness77342 karma

Bruce Schneier no longer considers himself a computer security expert or even a cryptologist; he now likes to call himself a "public interest technologist". Not everyone in the field of CS sees that shift in outlook as a healthy development. What are your thoughts on being a public interest technologist and do you think that outlook is something that should be taught in universities or not?

kaptchuk3 karma

Moving beyond the specifics of Bruce Schneier, I think it's awesome that there are folks who are technologists by training spending their time looking at difficult social problems -- problems that are probably more difficult and interesting than the technical questions they might have looked at in the past. Importantly, I want people who have this kind of deep technical background to be the leading voices on regulating technology and dissecting technology's social impact. What's the alternative?? That folks without that kind of background are going to lead the conversation?

In my class I try to have students think about both the social dimensions and technical specifications of a protocol. I'm lucky -- Crypto and Network Security lend themselves well to this approach, so I don't have to force anything. I hope that folks who teach ML find it similarly easy to talk about the social dimensions of learning. I don't think this is watering down computer science education at all -- if anything it enriches the experience and encourages students.

For the moment, I'm thrilled to have computer science departments teaching the skills that will be valuable to people interested in becoming public interest technologists. I'll note that students probably also need to be taking coursework in other departments so that they have a strong social science or philosophy background as well. Maybe we will get to the point where this becomes a discipline of its own at the university level, but I think we are a long way away from that.

tksn2 karma

I'm wondering if there is any safe way at all to place backdoors into encryption. In my opinion it's quite stupid, as it could allow bad-faith actors to crack open pretty much everyone's private communication and track those who oppose them. But does that track technically? Is there some kind of encryption exchange or anything at all that could make secure backdoors?

My money's on no.

kaptchuk3 karma

tl;dr we need to define what a secure backdoor means before we can answer the question. In some sense this is the harder part of the question. We actually tried this in a recent paper and found that maybe in the limited case where law enforcement is OK with only being able to decrypt messages that were sent *after* the target was under surveillance, it might be possible (according to our definition). Being able to retroactively decrypt messages while also providing good abuse-resistance properties appears to be impossible.

--

So I know I'm not supposed to shill for my own work, but in this case I can't resist.

We recently wrote a paper that appeared at Eurocrypt 2021 on this exact topic (available here: https://eprint.iacr.org/2021/321.pdf). It's rather technical and aimed at the cryptographic community, so I'll just give a quick rundown of our arguments here. Also, it's worth noting that everything below is basically aimed at end-to-end encrypted systems like WhatsApp and Signal. The solution space for encrypted devices looks a little different (although many of our ideas apply in that setting as well).

- Before we can even answer the question "Is it possible to have a safe backdoor," we need to define the problem more clearly. If you look carefully at the proposals that have been made in the past, you will notice that they implicitly make claims about what the author considers a "safe" backdoor. But as a cryptographer, I need a clear definition before I can even begin to analyze if something is secure.

- My co-authors and I came up with security properties of a backdoor system that we think are a clear minimum -- without these properties, any backdoor system is simply too much of a liability to deploy. (Even with these properties, it still might be a bad idea, but that's a slightly different conversation.) Here are some of the issues we hope to address:

  1. Transparency: Any system should have a cryptographic mechanism (i.e., not just a departmental policy, but actual math) that requires some degree of transparency. The exact information that gets leaked by the transparency mechanism is a bigger question, but imagine something very simple: we want the public to learn how many times the backdoor capability is being used. This would allow for some amount of public oversight of how the backdoor is being used and facilitate an important policy discussion about its use. (You could also imagine something more advanced, e.g., leaking the aggregate demographics of individuals being targeted for surveillance.)
  2. Global Warrant Policies: As a society, we might want to place some boundaries on the types of search warrants that the judiciary can issue. For example, we might want to specify that warrants have to target individuals and ban the use of surveillance dragnets. But again, this is an important policy conversation, and its outcome should somehow be enforced by the cryptographic mechanism itself.
  3. Detectability of catastrophic failure: One of my big fears about prior backdoor proposals is the catastrophic failure mode. For instance, key material that is supposed to remain secret is stolen by a foreign government and used to conduct mass-scale surveillance. There's no way to even detect that this is happening in most prior proposals. Given the rate at which we see data breaches, this failure mode is somewhat inevitable. We want a system that can at least notify everyone when this has happened so we can start to re-key the system.

Hopefully it makes sense why these properties are an important minimum.

- We gave a formal definition of a backdoor system that addresses these issues and then went about seeing if it's possible to build such a system. We found the following:

  1. Prospective Surveillance: In the case where law enforcement only needs to use the backdoor to get access to messages that were sent after the individual in question was already under surveillance, we can kinda make this work. That is to say, we could build such a system from standard cryptographic tools that we can implement today. It would be very, very, very inefficient (remember: whether we should deploy such a system is a separate question from whether it's possible). But possible.
  2. Retrospective Surveillance: The case in which law enforcement wants to use the backdoor on messages that were sent before the individual was under surveillance is trickier. Essentially we are saying that the messaging system should be totally secure, but then retroactively we want to make a backdoor appear. Hopefully it makes sense why this is strictly harder than case 1. We actually found that achieving such a system implies the existence of a theoretical kind of encryption that is widely believed to be impossible to implement (not only inefficient -- actually impossible).

So where does that leave us? Maybe we could build a system that gives good abuse-resistance properties in just the prospective case, but this doesn't appear to actually be the ask from law enforcement. Retrospective surveillance probably is impossible, and that is the ask from law enforcement. But I think the bigger takeaway is that this conversation as a whole skipped a step: we need to have a discussion about what a "safe backdoor" actually means in explicit terms before we can go about figuring out if we can build one.

henrikbedst2 karma

Has GDPR made a difference with regards to data privacy or are companies like Facebook, etc. too powerful to be significantly taken on through legislation now?

kaptchuk5 karma

My understanding is that GDPR has made some kind of difference. If nothing else, GDPR has helped us understand the scope of the problem we face as a society by allowing individuals to request all the data about themselves. Personally, I think we need the US to take a step when it comes to regulation in order to really start putting pressure on the logic of surveillance capitalism. Most of these companies are based in the US and seem to understand the world through a US-centric lens.

I'm not willing to give up on legislation as a route to change. The only alternative I can think of is organizing within the major technology companies. I think we have seen the start of this -- with walkouts and even a glimmer of collective bargaining within tech. But there seems to be a long way to go on that front.

AMillionMonkeys2 karma

Is it too early to think about a move to quantum-computer-proof encryption for everyday use? The transition will probably happen some time but do you see it happening smoothly given the coordination needed between so many (often competing) entities?

kaptchuk6 karma

I can't remember who framed the issue this way (but I owe credit to someone...). The question is the value of information over time. If the data you are encrypting is going to be valuable in 50 years, maybe it's not too soon to switch to post-quantum primitives. If it's something that is probably only valuable for the next 5 years, you are probably fine ignoring the problem. NIST is currently running its post-quantum competition and is finalizing its algorithm choices. We will probably see post-quantum algorithms in mainline distributions very soon -- which is great!

FWIW, I'm not convinced we are going to have quantum computers of the necessary size any time soon. But also I'm not an expert so /shrug

sugarcoatedcat2 karma

What do you think is the best way to inform people about data privacy issues?

kaptchuk3 karma

Using language that is approachable and makes the risks/harms real for people. It's not just saying "your privacy is important," because most people don't seem to care very much about their privacy. But I think if you frame it in terms of collective responsibility and collective harms, it starts to feel more real. For instance, all the major tech companies in the US make their (billions of) dollars from surveilling people and selling access to those people. This is a data privacy problem.

When it comes to keeping the information that people already want to keep private private, I want to flip your question on its head. We shouldn't need to inform people about how to keep their data private. Applications should be privacy-by-default and warning messages should be clear. We've seen huge jumps in the percentage of web traffic that's encrypted with TLS thanks to the warning-message research from the browser companies. This same methodology should be applied everywhere.

Mr69Niceee1 karma

What do you think about the future of cryptocurrency?

Do you think the government will eventually regulate it and it will just become the same as fiat currency?

kaptchuk2 karma

Personally, I think regulation feels inevitable. There's just too much snake oil getting thrown around in the space for it to be left alone. Guessing as to whether it becomes fiat currency is well beyond my oracle abilities.

I will just say this: I really hope that the impact of cryptocurrencies is that we live in a world where it's cheap and easy to send money across the world (which is incredibly important for all kinds of marginalized communities, like migrant workers) and that we end up with more privacy when it comes to our payments than we currently have. I'm not sure if we get to that world by cryptocurrencies just being ubiquitous or by current monetary systems adopting some of the lessons we are learning from the cryptocurrency space.

laundry_writer1 karma

What would be a palatable Silicon Valley solution to data privacy?

kaptchuk2 karma

Palatable to me or to Silicon Valley? I imagine that right now there is no overlap in those two things but perhaps I'm wrong. I'm not convinced the current business models based on mass data collection are compatible with meaningful data privacy and data autonomy. Even the "privacy preserving" versions of this business model (where these companies can still learn what they want to learn and sell ads in the same way without learning anything about individuals) still seem to be extractive in a way I'm not comfortable with. I'm not sure where that leaves me personally (besides perhaps a little scared), but I'm not giving up on trying to find a solution yet!

CapitalAioli1 karma

What was it like working as a computer scientist in policy and specifically in Wyden's office?

kaptchuk2 karma

It was a lot of fun! I was there with another cryptographer, which made navigating the change in scenery more manageable. The folks in Wyden's office were amazing and made the entire experience both enjoyable and productive. We had the latitude to work on issues that were important to us personally, as they aligned well with what the office was working on anyhow. And I got to spend lots of time debating with lawyers, which was a whole new life skill that I didn't have previously.

epicurus42711 karma

How do you think the FBI is seizing stolen bitcoin (Colonial Pipeline being the most recent)? Do you think some crypto wallets could have backdoors?

kaptchuk2 karma

I'm not sure -- and to be honest, I haven't been following the technical details of that story closely enough to make a good guess. If I had to guess in the dark, it's something along these lines: https://xkcd.com/538/. That said, it's also possible that the FBI found/bought an exploit against someone else's software.

ppgrox1 karma

Hi gabe, what was the greatest week/night of your young life?

Also how do i throw a flick?

kaptchuk1 karma

Let's go Hop!

And I would have to say "gracefully"

BertramGuilfoyl0 karma

Did we go to grade school together?

kaptchuk2 karma

I guess it's possible? On the internet, no one knows you're a dog!