Ari Juels has written a novel about a cult of neo-Pythagoreans, has fenced competitively, and reads Latin (albeit rustily) — Renaissance skills, he says, that enhance the way he tackles his chosen field of cryptography. The way he sees it, cryptography is about human behavior as well as pure mathematics, which means it draws on many other forms of knowledge, including psychology and neuroscience.

A fan of collaboration, Juels is particularly thrilled to be part of Cornell Tech’s growing security team. “There’s quite a bit of excitement about the strength of the security and privacy and crypto group across the two campuses. We’ve got a really phenomenal group of faculty,” he said.

With his broad philosophical approach, it is no wonder that Juels has become a leading mind on topics of ever-increasing national urgency like privacy. In his work, Juels is already contemplating a post-privacy world where computer algorithms will have to be designed to protect the public from bias, intentional or not, by corporations and the government.

Juels is also thinking up ways to redesign the current financial system with electronic currencies. As usual, he is trying to stay one step ahead of others by anticipating weaknesses in such systems before criminals do. Together with a team of peers from Cornell’s Ithaca campus, Juels recently established the Initiative for CryptoCurrencies and Contracts (IC3) at the Jacobs Technion-Cornell Institute.

He is eager to forward Cornell Tech’s mission to make academic research on campus relevant in the real world, and has an ongoing cloud security project with Amazon and a project with Verisign on puzzles that can stop automated attacks from clogging up servers.

Cornell Tech: What made you go into computer science?

Ari Juels: Basically I’m a failed professor of literature. At Amherst, I studied both Latin literature and mathematics, so for grad school I wasn’t sure whether I wanted to pursue literature or theoretical computer science, which is basically a mathematical discipline. So I applied for all kinds of fellowships, and I didn’t get any in literature, so I ended up going to Berkeley for computer science.

I was interested mainly in the theory of computation, which deals with some pretty deep questions, like what is fundamentally knowable in the computational sense.

For example, a natural question to ask about a computer program is: will it at some point stop running? It seems like this should be a fairly easy thing to determine, but it turns out to be an undecidable problem. So that is a very concrete way of saying that there are limits to our knowledge that can be defined in a computational sense.
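The one-sidedness of that limit can be made concrete with a small experiment (the function names below are invented for illustration, not any real API): any practical "does it halt?" check can confirm that a program stopped, but can never prove that it won't.

```python
import threading

def halts_quickly():
    return 42

def runs_forever():
    while True:
        pass

def try_halts(fn, timeout=0.5):
    """Best-effort halting check: run fn in a daemon thread for at
    most `timeout` seconds. A True answer is definitive, but False
    only means "did not halt yet"; no finite timeout can prove
    non-termination. That asymmetry is the undecidability showing
    through in practice."""
    t = threading.Thread(target=fn, daemon=True)
    t.start()
    t.join(timeout)
    return not t.is_alive()

print(try_halts(halts_quickly))  # True
print(try_halts(runs_forever))   # False, though really "unknown"
```

No matter how large the timeout is made, the second answer never upgrades from "didn't halt yet" to "will never halt" — which is the undecidability result in miniature.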

You seem to have a philosophical approach to numbers.

Computer science has a considerable overlap with philosophy. If you talk to my colleague Rafael Pass, who studies the theory of cryptography and game theory, he regards these as philosophical disciplines.

So how did you become interested in cryptography?

A fellow student at Berkeley circulated a manuscript about electronic cash, and I was really struck by how cryptography was involved — it is a deeply mathematical discipline but with direct practical application. It’s rare in computer science that serious mathematics gets used in deployed systems. Typically, theoretical constructs get heavily watered down in practice, but in cryptography, the genuine mathematical core gets preserved when systems are built.

It also overlaps with so many other disciplines.

What fields in particular?

One example I like to give is an article I published a couple of years ago, which I coauthored with my wife, who is a neuropsychologist. We studied the possibility of incorporating secret passwords into people’s memories using what’s called implicit or procedural memory — as opposed to explicit memory. It’s the type of memory you use to ride a bike rather than memorize a poem.

There are lots of surprising connections of that kind. Security is relevant to pretty much every other field of computer science, and it draws on techniques from other fields, so I like the fact that it’s naturally ecumenical.

What are some of your specific research areas now?

I’m focusing on cryptocurrencies. Bitcoin is the best-known example, but I’m not sanguine about it in particular. However, there are technical elements of Bitcoin and its successors that I think will provide very interesting capabilities. One of these is what’s called “smart contracts,” which are programs that run autonomously. What makes Bitcoin distinctive is the fact that it’s basically a peer-to-peer currency. There’s no one entity in control of the system — it’s decentralized.

And if you build on that principle and start running programs that manipulate money, you end up creating self-enforcing contracts that don’t come under the oversight of a judicial system, and that’s very interesting.
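The self-enforcing quality he describes can be sketched as a toy escrow in ordinary Python (a simplified model with invented names, not real blockchain code; on an actual blockchain the same logic would run identically on every node, so no single party could alter or ignore it):

```python
# Toy model of a self-enforcing escrow "smart contract": once the
# buyer's funds are locked, the payout rule executes mechanically,
# with no third party needed to enforce it.

class EscrowContract:
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.funded = False
        self.released = False

    def deposit(self, balances):
        """Buyer locks funds into the contract."""
        if balances[self.buyer] < self.amount:
            raise ValueError("insufficient funds")
        balances[self.buyer] -= self.amount
        self.funded = True

    def confirm_delivery(self, balances):
        """When the agreed condition holds, payment is automatic."""
        if not self.funded or self.released:
            raise RuntimeError("nothing to release")
        balances[self.seller] += self.amount
        self.released = True

balances = {"alice": 100, "bob": 0}
contract = EscrowContract("alice", "bob", 40)
contract.deposit(balances)
contract.confirm_delivery(balances)
print(balances)  # {'alice': 60, 'bob': 40}
```

The interesting (and, as the interview notes, legally novel) part is that nothing outside the contract's own code decides whether the money moves.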

A colleague in Ithaca and I have been looking at one application of smart contracts and trying to figure out what risk of illegal or malicious behavior would arise if such smart contracts were to see widespread use.

So you’re trying to stay a couple steps ahead of the criminals.

That’s it. Another interest is deception as a defensive tactic, and the use of what are called “honey objects” in computer science. Honey objects are basically just fake data used to lure adversaries for observation or to divert them away from real targets.
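One well-known instance of the honey-object idea is "honeywords": decoy passwords stored alongside an account's real one, so that a login attempt with a decoy signals that the password file was probably stolen and cracked. A minimal sketch, with invented function names:

```python
import hashlib
import secrets

def make_honeywords(real_password, decoys):
    """Shuffle the real password in among decoys and hash them all.
    In a full honeywords design, the real one's index would live on
    a separate hardened server (the "honeychecker"), not here."""
    words = decoys + [real_password]
    secrets.SystemRandom().shuffle(words)
    hashes = [hashlib.sha256(w.encode()).hexdigest() for w in words]
    return hashes, words.index(real_password)

def check(attempt, hashes, real_index):
    """Return 'ok', 'wrong', or 'ALARM' (a decoy was used,
    suggesting the password database has been compromised)."""
    h = hashlib.sha256(attempt.encode()).hexdigest()
    if h not in hashes:
        return "wrong"
    return "ok" if hashes.index(h) == real_index else "ALARM"

hashes, idx = make_honeywords("correct horse", ["tr0ub4dor", "hunter2"])
print(check("correct horse", hashes, idx))  # ok
print(check("hunter2", hashes, idx))        # ALARM
print(check("letmein", hashes, idx))        # wrong
```

An attacker who cracks the stolen hashes sees several plausible passwords and cannot tell which one is real — guessing wrong trips the alarm.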

I am also working on what I like to call “post-privacy” technologies. My colleagues usually object to that label. The starting point for this third line of research, however, is the question of what happens if privacy erodes to the point of being indefensible. We seem to be on that trajectory.

Many of my peers work on technologies to shore up privacy, but I’m not very optimistic about our ability to do this. So I’m asking: what happens if it basically evaporates? I’m beginning to look at how to detect the presence in algorithms of biases or inappropriate use of sensitive personal information.

What’s a concrete example?

Staples recently implemented a pricing algorithm that offered discounts to customers who lived near competitors. This seems reasonable on the face of it, but it ended up imposing higher prices on socio-economically disadvantaged groups.

How does the graduate education you received differ from the one students are having at Cornell Tech today?

I was interested in theoretical computer science, so I was almost proud of the fact that I never got my hands dirty. But my interest in computer science has shifted over time and become increasingly applied. One of the reasons I’m here is because I’m interested in seeing the research I do get deployed in the real world, and that’s a fundamental part of Cornell Tech’s mission.