As the use of non-fungible tokens (NFTs) has skyrocketed within the art world over the past year, and as other industries such as media, sports, and entertainment have adopted them as revenue generators, so has the use of automated bots that make this new revenue model less valuable for artists and companies.

These bots aren’t new, though – many originated in the high-end sneaker market and in recent months have spread to the blockchain. Not only do these bots deepen inequity within the marketplace, but they also ultimately manipulate resale prices. In January 2021, the Yeezy ‘Sun’ drop retailed at approximately $250, but the shoes could be found on resale sites for around $600 that same day.

In September 2021, Time Magazine launched thousands of NFTs, but their launch was taken over by scalper bots, despite precautions put in place to limit the number of NFT purchases per person. This drop also resulted in inflated transaction fees on the blockchain network.

A new tool developed by researchers at Cornell Tech, led by PhD student Yan Ji and research engineer Tyler Kell, aims to enforce a one-NFT-per-person drop policy. A successful demo took place at ETH Denver this week, where attendees established unique identities to participate in a free NFT drop with prominent digital artist Zach Lieberman.

The smart contract technology tool the team developed establishes unique identities for participants to confirm they are not bots. The tool verifies legal names against trustworthy sources such as the Social Security Administration (the data is confidentially collected and processed through a trusted execution environment so that the information is not accessible to anyone, including the developers) or, alternatively, through decentralized identities based on the Proof of Attendance Protocol.
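The core policy described above – each verified identity may claim at most one NFT from a fixed supply – can be illustrated with a minimal sketch. This is a hypothetical simplification for illustration, not the team's actual smart contract: the class name, the salted-hash identity commitment, and the `mint` interface are all assumptions standing in for the real verification pipeline.

```python
import hashlib

class OnePerPersonDrop:
    """Toy model of a one-NFT-per-person drop policy.

    A privacy-preserving identity commitment (here, a salted hash of a
    verified name, standing in for credentials checked inside a trusted
    execution environment) may mint at most one token. Illustrative only;
    not the actual Cornell Tech contract.
    """

    def __init__(self, supply):
        self.supply = supply
        self.minted = set()  # commitments that have already claimed a token

    @staticmethod
    def identity_commitment(verified_name, salt):
        # Only a salted hash of the verified identity is recorded, so the
        # raw name never needs to leave the trusted environment.
        return hashlib.sha256(f"{salt}:{verified_name}".encode()).hexdigest()

    def mint(self, commitment):
        if len(self.minted) >= self.supply:
            return False  # drop is sold out
        if commitment in self.minted:
            return False  # this identity already claimed its NFT
        self.minted.add(commitment)
        return True
```

Because only the commitment is stored, a bot running many wallets still maps to one verified identity and is refused a second token, which is the property that wallet-count limits alone cannot provide.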

“NFTs were designed to democratize the future of art-buying, but for the most part, only people whose skills are technologically advanced enough have been consistently successful at acquiring these works,” said Cornell Tech research engineer Tyler Kell. “This new technique that our team developed will allow for NFT drops to be equitable and fair and ultimately improve the market overall.”

Previous attempts at preventing bots from taking over NFT drops have included a variety of solutions that allow sellers to target specific individuals, including white lists, limiting unique crypto wallets, and capping the number of purchases per person. Each of these has been circumvented in different ways: social media accounts are not strong credentials, and users can purchase accounts to get on white lists.

The researchers intend to further this work by potentially developing other desirable tools and techniques such as sophisticated collector analysis and bot identification that can be used to establish a more equitable NFT market.


By Adam Conner-Simons

Researchers at Cornell Tech have created a new approach to helping survivors of domestic abuse stop assailants from hacking into their devices and social media to surveil, harass and hurt them.

The model focuses on “continuity of care,” so clients experience a seamless relationship with one volunteer tech consultant over time, similar to a health care setting. It matches survivors with consultants who understand their needs and establish trust, offers survivors multiple ways to safely communicate with consultants, and securely stores their tech abuse history and concerns.

“Personal data management in tech abuse is a complex thing that can’t always be ‘solved’ in a single half-hour visit,” said Emily Tseng, M.S. ’19, a doctoral student and lead author on a paper about the model. “Most of the approaches that exist in tech support are limited by a one-size-fits-all protocol more akin to an emergency room than a primary care provider.”

Tseng will present the paper, “Care Infrastructure for Digital Security in Intimate Partner Violence,” in April at the ACM CHI Conference on Human Factors in Computing Systems in New Orleans.

Tseng and her colleagues at Cornell Tech’s Clinic to End Tech Abuse developed the new approach, in partnership with New York City’s Mayor’s Office to End Domestic and Gender-Based Violence. Their research draws on eight months of data, as well as interviews with volunteer technology consultants and experts on intimate partner violence (IPV).

“This work provides an honest look at both the benefits and burdens of running a volunteer technology consultant service for IPV survivors, as well as the challenges that arise as we work to safely provide computer security advice as care,” said co-author Nicola Dell, associate professor at Cornell Tech’s Jacobs Technion-Cornell Institute. “Our hope is that our experiences will be valuable for others who are interested in helping at-risk communities experiencing computer insecurity.”

Survivors can experience many forms of gender-based violence, including technology-facilitated abuse, said Cecile Noel, commissioner of the Mayor’s Office to End Domestic and Gender-Based Violence. “Cornell Tech’s groundbreaking program not only helps survivors experiencing technology abuse but is also working to better understand how people misuse technology so that we can create better protections for survivors,” Noel said. “We are proud of the critical role our longstanding partner Cornell Tech plays in improving the lives of survivors.”

Tech abuse often exists within a larger web of harm, Tseng said. “In an ideal world, the people on the ‘Geek Squad’ would be able to treat tech abuse with the sensitivity of a social worker.”

Assailants can abuse their victims through tech including spyware, also known as stalkerware, and through inappropriate use of location-tracking features in phones and other devices. They harass their former partners on social media, such as by posting private photos and posing as their victims to alienate family and friends. Abusers can also hack into email accounts and change recovery emails and phone numbers to their own, potentially devastating their victims’ careers.

In previous models, counselors remained anonymous, impacting their ability to build trust with survivors. Short, one-time appointments were not long enough to address clients’ needs. And appointments took place at a specific time; survivors who could not leave their homes or find a safe, private place to take a call were unable to access services and couldn’t reach counselors at other times. It can be frustrating and even re-traumatizing for survivors to share their stories with new consultants at each appointment, Tseng said.

One of the team’s larger goals is to offer survivors more peace of mind and feelings of empowerment – that they have the tools to handle future challenges. “With technology, there are so many ways to remain entangled with your abuser even after you’ve physically and romantically left the relationship,” Tseng said.

One tricky element is determining how much support is realistic. While a one-time “urgent care” visit is probably insufficient, prolonged engagement would be unsustainable for consultants and the clinic as a whole. “In several cases, consultants ended up working with clients over many appointments stretching on for weeks or months,” Tseng said.

As a next step, she wants to explore additional ways to evaluate ongoing security-care relations from the perspective of survivors, particularly people from marginalized communities.

Dell co-created the Clinic to End Tech Abuse with Thomas Ristenpart, associate professor at Cornell Tech; both Dell and Ristenpart are also affiliated with the Cornell Ann S. Bowers College of Computing and Information Science.

Other co-authors on the paper are Ristenpart, postdoctoral associate Rosanna Bellini, doctoral student Mehrnaz Sabet and Harkiran Sodhi, MBA ’21. The research was funded in part by a gift from Google and support from the National Science Foundation.

Adam Conner-Simons is director of communications at Cornell Tech.