
By Grace Stanley
Thomas Ristenpart, professor of computer science at Cornell Tech and the Cornell Ann S. Bowers College of Computing and Information Science, has received the Association for Computing Machinery Conference on Computer and Communications Security (ACM CCS) Test of Time Award for his influential 2015 paper on privacy risks in machine learning.
The paper, “Model Inversion Attacks that Exploit Confidence Information and Basic Countermeasures,” was co-authored with Matt Fredrikson, associate professor at Carnegie Mellon University, and Somesh Jha, professor at the University of Wisconsin-Madison.
The award recognizes research that has had a lasting impact on the field of computer security and privacy. The paper was among the first to show how machine learning models — especially those made available through online services — can inadvertently leak sensitive information.
“This was an exciting time period for machine learning, with deep learning systems taking off in terms of functionality,” said Ristenpart. “One reason this generation of models was thought to be so powerful was that they might memorize, in some form, data from the training set. In turn, if the training data is sensitive, there becomes a privacy question: Is it safe to give access to a model?”
The research demonstrated how adversaries could reconstruct private data, including recognizable facial images, simply by repeatedly querying a model and analyzing the confidence values it returns. It also proposed early countermeasures, such as modifying training algorithms to reduce privacy risks.
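To illustrate the general idea, here is a minimal, hypothetical sketch of a confidence-guided inversion attack. The toy two-class linear model, its weights, and the hill-climbing search are all illustrative assumptions, not the paper's actual algorithm or data: the attacker only calls the black-box `query` function and uses the returned confidence for a target label to steer the search toward a representative input for that class.

```python
# Hypothetical sketch of a confidence-guided model inversion attack.
# The toy model, weights, and labels are illustrative assumptions,
# not the actual models studied in the paper.
import math

# Stand-in for a model trained on sensitive data: one linear scorer
# per identity, exposed only through a prediction API.
WEIGHTS = {
    "alice": [0.9, -0.2, 0.4],
    "bob":   [-0.5, 0.8, 0.1],
}

def query(model_weights, x):
    """Black-box API: return per-label confidence scores (softmax)."""
    scores = {label: sum(w_i * x_i for w_i, x_i in zip(w, x))
              for label, w in model_weights.items()}
    m = max(scores.values())                      # for numerical stability
    exps = {label: math.exp(s - m) for label, s in scores.items()}
    total = sum(exps.values())
    return {label: e / total for label, e in exps.items()}

def invert(model_weights, target, dims=3, steps=200, step_size=0.1):
    """Greedily adjust the input to maximize the target label's
    confidence, recovering a representative input for that class."""
    x = [0.0] * dims
    best = query(model_weights, x)[target]
    for _ in range(steps):
        for i in range(dims):
            for delta in (step_size, -step_size):
                candidate = x[:]
                candidate[i] += delta
                conf = query(model_weights, candidate)[target]
                if conf > best:
                    x, best = candidate, conf
    return x, best

recovered, confidence = invert(WEIGHTS, "alice")
print(recovered, round(confidence, 3))
```

The attacker never sees the weights; only the confidence scores leak enough signal to reconstruct an input the model strongly associates with "alice".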
“The paper was fortunately timed,” Ristenpart said. “Researchers were actively seeking to understand privacy issues in machine learning. This paper, together with others from this time period, showcased the need for better ways of evaluating privacy risks ahead of making models accessible. It also influenced follow-up work investigating other types of privacy leakage, how to build privacy-preserving machine learning algorithms, and more.”
The paper has influenced both academic and industry practice, prompting deeper evaluations of privacy risks before machine learning systems are deployed.
In addition to the Test of Time Award, Ristenpart received a Distinguished Paper Award at ACM CCS 2025 for a new paper titled “The OCH Authenticated Encryption Scheme.” His co-authors include Sanketh Menda, who recently completed his Ph.D. at Cornell Tech, as well as researchers at Florida State University, the University of North Carolina at Chapel Hill, and the University of California, San Diego.
Grace Stanley is the staff writer-editor at Cornell Tech.