
By Tom Fleischman

How do online crowds form, grow and behave? How do they wield influence? What distinguishes desirable crowd activism from mob harassment?

In the summer of 2022, Cornell Tech and Cornell Law School professor James Grimmelmann and postdoctoral fellow Charles Duan hosted a virtual workshop in which participants attempted to answer these questions and more.

At the end of the two-day online workshop Grimmelmann, the Tessler Family Professor of Digital and Information Law at Cornell Tech and at Cornell Law School, and co-organizer Duan, now an assistant professor of law at American University’s Washington College of Law, asked participants to reflect on the conversations and identify important themes about platforms and crowds.

The result: “The Barons and the Mob: Essays on Centralized Platforms and Decentralized Crowds,” an introduction to the complexities of online crowds and the importance of understanding their nature in the context of efforts toward online platform regulation.

The introduction references a pair of online user “revolts.” In 2007, a user of the news aggregator Digg posted an encryption key that could be used to circumvent copyright protection on Blu-ray discs. Sixteen years later, Grimmelmann and Duan wrote, “history rhymed with itself” when Reddit, in preparation for a rumored IPO, started charging developers to access its previously free application programming interface. Users of both platforms rose up in revolt. In Digg’s case, the crowd won the revolt – not so with Reddit.

“The Digg disruption and the Reddit rebellion,” they wrote, “demonstrate the conflict between the two great sources of power on the Internet: the centralized platforms that control the infrastructure of online communities, and the decentralized crowds of users who come together in them.”

In all, a dozen experts share their perspectives in “The Barons and the Mob,” tackling topics that include what makes an online crowd, the influence of money on crowds, identifying misinformation, authenticity, and network economics.

Grimmelmann spoke with the Chronicle about the essay collection:

Question: What was the impetus behind the workshop?

Answer: The idea came out of seeing some of the ways that crowds were self-consciously being weaponized for political and commercial purposes. The “to the moon” sentiment of the wallstreetbets subreddit wasn’t all that different from the kind of online energy associated with political movements or with influencer beefs. But platforms seemed to consider some of these crowds to be serious problems they had to block, and some of them to be benign intended uses. That paradox led us to look more closely at how platforms and crowds related to each other.

Q: Are there other moments in history that radically changed the dynamic between the “barons” and the “mob” – perhaps the invention of the printing press?

A: The printing press definitely helped catalyze new kinds of distributed groups, from scientific collaborations to journalism for “the public.” The age of revolution – starting especially with the French Revolution – demonstrated the dramatic power of the mob on the street compared with old aristocratic hierarchies. The mobs prevailed in the long run: Old forms of centralized power were swept aside and gave way to new political forms that were more responsive to mass public sentiment.

Q: Has the power dynamic between platforms and crowds morphed over time? Have crowds learned how to wield greater power over the last 20 to 30 years?

A: The Reddit moderator revolt last summer was a really striking moment, because Reddit explicitly decided that it was willing to take on the full power of a highly organized user group. It was a big bet, and Reddit basically won: Its IPO went ahead, and today the site has much more effective power over its user base. The pendulum seems to have swung in the direction of the platforms; they’re better able to predict and steer crowd dynamics than they were a few years ago.

You can see TikTok as an extreme example of this trend: The site harnesses crowd energy and enthusiasm but systematically works to prevent crowds from forming and sustaining themselves in ways that would form durable power alternatives.

Q: Do you see an ultimate “winner” in this push-and-pull between the “lords” (platforms) and the “commoners” (users)? Can there ever be a winner?

A: No – the tension is eternal. Without either of these forms, social media wouldn’t function. Platforms need crowds in order to be viable businesses, and crowds need platforms as a place to gather. They each have an interest in the other’s existence.

Q: Are there other big questions to be addressed in this space?

A: Yes – we don’t know how online crowds are catalyzed and controlled. We don’t know how to think about them as groups with agency for economic purposes. We don’t know what legitimate and effective forms of moderation to respond to them look like. We don’t know how regulations will go awry when crowds respond to them. And there are many more: This report is just a starting point, and a way of inviting people to think seriously about these issues.

Tom Fleischman is a Senior Writer/Editor for the Cornell Chronicle.


Cornell Tech today announced the newest cohort of 13 startup companies that will enter its established Runway incubator program this September. The program is run by the Jacobs Technion-Cornell Institute on the Cornell Tech campus and will welcome its largest cohort since the annual program began in 2014.

The 13 teams include eight Ph.D. founders, selected from 358 applications received this year for the Runway Startup Postdoc Program, and five “Spinout” teams, which are made up of 2024 Cornell Tech graduates who each won a $100,000 Start-Up Award at the conclusion of the 2023-2024 academic year.

Founders who participate in the program come from academic backgrounds and are focused on accelerating their young companies under the guidance of Cornell Tech faculty and advisors. To date, the program has launched more than 100 startups, including baby sleep monitor Nanit, real estate construction intelligence platform OnsiteIQ and infectious disease diagnostic Biotia. In total, the companies launched from the Runway Startups program have a valuation of more than $660 million and have created more than 500 new jobs in New York City.

Runway participants come to the Institute with early-stage ideas and potential markets for their product(s). To help launch their startups and propel them into careers in the tech industry, they receive a package valued at up to $325,000 over two years that includes a salary, a research budget, workspace on campus, and IP registration and use, as well as mentorship from academic and business experts in fields including connective media, health technology, security and privacy, and computer vision.

“Runway is a proven catalyst for New York’s tech ecosystem. It creates entrepreneurship opportunities beyond those traditionally offered at universities and helps founders address the roots of real world issues and tackle them head on,” said Fernando Gómez-Baquero, Director of the Runway Startup Postdoc and the Spinout Programs at Cornell Tech. “The proposals from this year’s cohort have the potential to join alumni companies as they grow into fully realized startups, driving economic development, job growth, and New York’s leadership in innovation for years to come.”

Runway is part business school, part research institution, and part startup incubator. It helps tech founders translate their academic skills and mindset into entrepreneurial ventures. The incoming cohort plans to turn their ideas into startups utilizing tech, engineering and artificial intelligence, solving problems in fields ranging from healthcare to finance. The selected companies include:

  • Cipher, a marketplace that facilitates music licensing deals by connecting businesses to the biggest players of the music industry, tracking negotiations, and automating payments and licensing agreements.
  • Iriscience, integrating AR and VR into slit lamps to enable remote eye examinations, along with an AI-assisted user interface that allows ophthalmologists and primary care providers to diagnose and treat patients in family clinics and underserved areas.
  • MercuryVote, a marketplace/auction house that provides a way for large and small retail investors to sell their unused and undervalued proxy votes for corporate elections, enabling activist investors to purchase the proxy votes to impact boards and proposals at the next corporate election.
  • MyophonX, a device consisting of cutaneous electrodes embedded in a thin, flexible film that capture electromyography (EMG) signals from facial articulatory muscles. These signals are then processed by artificial neural networks to produce speech, which can be transmitted via Bluetooth to an external device such as a phone, speaker, or headset, allowing a person without a voice to speak.
  • mPulse-O2, a platform that enables accurate measurement of blood oxygen levels and aims to transform pulse oximetry by addressing the lack of inclusivity in the design and validation of biomedical devices.
  • Onda Labs, addressing key challenges in water management, assisting utilities in boosting their revenue and improving overall efficiency by leveraging AI technologies.
  • Prendo, a digital platform for endometriosis that tracks patients’ symptoms individually on a daily basis, identifies the unique cyclic patterns and correlations of each symptom by generating “monthly symptom maps,” and predicts the onset of symptoms.
  • PsyFlo, a platform that enables personalized, collaborative care for integrated behavioral health settings.
  • RapidReview, a platform that uses machine learning to accelerate research productivity by building tools that understand documents and help researchers navigate thousands of academic papers for literature review.
  • SensVita, a company that develops non-invasive sensors for wearable and furniture-integrated heart and lung monitoring by prioritizing no skin contact and broad application to clinical, at-home, and veterinary monitoring applications.
  • Simulacrum, an AI software venture that allows enterprises and institutions to make effective operational decisions in complex markets by providing them with behavioral market models learned from data to accurately predict future economic trends.
  • Vinci AI, a new ad format that combines the scale of interruptive ads with the engagement of in-video sponsored ads, enabling brands and creators to source sponsorship deals and automatically inserting brand advertisements into the background of creator videos.
  • WAVED Medical LLC, a medical software company developing technology that enhances breast cancer screening by using proprietary biophysical measurements that identify pre-cancerous “at-risk” dense breast tissue most likely to progress to life-threatening disease.

Applications for the next cohort of Runway Startups, to begin in September 2025, will open on October 15, 2024 and close on February 15, 2025. For more information, go to https://tech.cornell.edu/programs/phd/startup-postdocs/ 


Cornell Law School and Cornell Tech are pleased to announce that David Reiss joined their faculties on July 1, 2024, as Clinical Professor of Law and Research Director of the Blassberg-Rice Center for Entrepreneurship Law. Based at the Cornell Tech campus on Roosevelt Island, Reiss will co-teach Cornell Law School’s Entrepreneurship Clinic, allowing the law school to provide a clinical offering in New York City for the first time. Reiss will also teach in Cornell Law and Cornell Tech’s program in Law, Technology and Entrepreneurship.

Reiss’s hire represents a significant milestone in the growth of the Blassberg-Rice Center for Entrepreneurship Law. Created with the support of a transformative gift from Franci J. Blassberg ’75, J.D. ’77, and Joseph L. Rice III, the Blassberg-Rice Center for Entrepreneurship Law will deepen Cornell Law School’s commitment to supporting entrepreneurship initiatives through clinical education. Reiss will teach alongside Celia Bigoness, Founding Director of the Blassberg-Rice Center and the Entrepreneurship Clinic. With Bigoness in Ithaca and Reiss in New York City, the Blassberg-Rice Center will provide pro bono legal services to entrepreneurs and small businesses across New York State.

“Cornell Law has a distinguished record of clinical service to the community and we could not be more grateful to the Blassberg-Rice family for the support to extend our students’ experience and pro bono assistance to New York City,” noted Jens Ohlin, Allan R. Tessler Dean of Cornell Law School.

“We are thrilled to welcome David to our campus as he joins the new Blassberg-Rice Center for Entrepreneurship Law as well as Cornell Tech’s Law, Technology and Entrepreneurship degree program,” said Greg Morrisett, Jack and Rilla Neafsey Dean and Vice Provost for Cornell Tech. “David’s legal background, extensive research and academic experience, and New York City network will be a great inspiration to our students and will serve to enhance these unique offerings on our campus.”

Reiss joins Cornell from Brooklyn Law School, where he taught for over 20 years and founded the Community Development Clinic. In addition to his teaching, Reiss is active in research, scholarship and professional service. Reiss served as the Research Director for Brooklyn’s Center for Urban Business Entrepreneurship, and he is a research affiliate at the NYU Furman Center, a collaboration between NYU Law School and the Robert F. Wagner Graduate School of Public Service. He serves on the New York State Bar Association’s Task Force on Emerging Digital Finance and Currency as an expert on the intersection of real estate and blockchain technology. Reiss is the author of the forthcoming book, Paying for the American Dream: How to Reform the Market for Mortgages (Oxford University Press).

“Training excellent 21st century lawyers requires law schools to keep up with technological innovation while also maintaining a focus on lawyering fundamentals: research, writing, and advocacy,” said Reiss. “Cornell Law School’s clinical program is keeping these two goals front and center as it educates tomorrow’s lawyers.”

“David is the perfect person to expand our clinical program to New York City,” said Bigoness. “He has a wealth of clinical teaching experience, and a strong professional network with government agencies, community organizations and businesses across the five boroughs. He’ll provide top-notch legal training to our students at the Cornell Tech campus, and will support the local community by empowering entrepreneurs and small businesses.”

Reiss earned his B.A. degree from Williams College in 1989, and his J.D., magna cum laude, from NYU Law School in 1996. After law school, Reiss spent five years as a corporate associate at pre-eminent firms – first at Morrison & Foerster, and then at Paul, Weiss, Rifkind, Wharton & Garrison. He then spent one year teaching at Seton Hall Law School, before joining the faculty of Brooklyn Law School in 2003.


The lines between two of New York City’s defining industries – artificial intelligence/technology and design – are increasingly blurry. Many designers, who lead the process of centering a new product on user needs, are beginning to adapt their work to feed into the growing nexus of these markets, while also ensuring that their new tech-enabled designs are working to improve human wellbeing and social good.

Angela Chen, M.S. ’22, an alumna of both Cornell in Ithaca and Cornell Tech, pioneered and launched two AI healthcare design products during her time at Cornell Tech, bringing to life the value of tech for good. Her AI designs, Calmspace and Argo Data Marketplace, recently won the 2024 A’ Design Award (Italy), 2024 MUSE Design Awards, 2024 New York Product Design Awards and London Design Awards for their ingenuity in helping address problems faced by those working in the healthcare industry.

Chen found her passion in the interdisciplinary study of design and technology in her undergraduate years in Ithaca. As she was graduating, she saw how the rapid development of AI could be integrated as a tool in her design work. She decided to apply to Cornell Tech to learn and refine the software and technical skills she needed and to help connect her to New York City’s tech industry.

“I wanted to enrich myself in becoming a UX designer, a more creative technologist, an entrepreneur, particularly by empowering myself in one of the largest tech communities in the world, working to advance New York City’s economic development,” Chen explained. “Cornell Tech helped me integrate my design values into technology and break into the tech industry, doing things that were more experience-driven and providing a bridge between my studies and the industry that allowed me to work on real world challenges.”

Chen developed Calmspace during the COVID-19 pandemic when she was a first-year student at Cornell Tech. She noticed, in speaking with the faculty and students around her, that there was a high prevalence of anxiety relating to digital devices among young people. To help address this and promote better mental health, she designed the Calmspace app, which offered a wide range of wellness features for both individual self-care and group activities, including meditation and yoga, encouraging young people to utilize technology in a way that is beneficial to their wellness.

Argo Data Marketplace was a Cornell Tech Product Studio project that Chen developed in collaboration with The MITRE Corporation. Their work together was also in response to the pandemic, as more experts and decision makers began to need access to confidential health-related data. She and her team aimed to deliver a software solution that facilitated secure sharing among enterprises and institutions. Chen designed user interfaces powered by cutting-edge blind learning technology (a machine learning technique), playing a pivotal role in making Argo the first platform to support commercializing confidential datasets for healthcare experts and enterprise users while addressing the complex challenges inherent in data security and privacy.

Both of Chen’s projects are focused on responsibly addressing the social needs she recognized to improve the health and wellbeing of her community and society at large. In every aspect of her design, Chen emphasizes the importance of taking on projects that are created ethically with an eye toward the future.

“It is important that my mindset as a designer is cemented in building a bridge that makes technology more accessible and exploring complex real world applications,” Chen said. “In addition to considering the product needs and business value, it’s about ensuring that the design values of my projects are impactful, sustainable, and create user friendly experiences.”

Much of this mindset was influenced by Chen’s time in Cornell Tech’s Connective Media Program, which allowed her to take human-centered design courses on social perspective and technical classes focused on AI and machine learning simultaneously.

The Connective Media Program is one of Cornell Tech’s many cross-disciplinary programs that provide students with hard technical skills while fostering social and economic awareness. Other examples include the PiTech initiative, aimed at building a commitment to responsible tech and public interest technology, as well as the MBA program, which focuses on the impact of tech and AI innovations in transforming the business landscape.

Through Cornell Tech’s integrated approach to education, tech entrepreneurs and innovators like Chen are able to manifest their design ideas with highly complex engineering and technology while also ensuring that human interest on the user end is kept front and center to maximize public good.


The Cornell Tech Council, the primary governance group for Cornell Tech and a subsidiary body of the Cornell University Board of Trustees, has announced the appointment of Howard Morgan Ph.D. ’68 as its new chairman and Adam Jacobs ’09 as its newest council member. Their appointments began on July 1, 2024.

The Cornell Tech Council comprises 15 business and technology leaders who oversee the mission and strategic goals of Cornell Tech, a graduate campus and research center of Cornell University founded in 2012. Located on Roosevelt Island in New York City, Cornell Tech develops new technologies through research, educates tech leaders, and builds new ventures through its business startup programs. The council advises the dean and senior leadership of Cornell Tech and its members serve as active champions and supporters.

Morgan succeeds David Siegel, who had served as chairman since 2018 and who will continue to serve as a member of the council. “My term as Chair of the Cornell Tech Council has been immensely gratifying,” said Siegel. “Not only was I able to work with an exceptional group of leaders and civic-minded individuals, but I also had the opportunity to collaborate with them on ensuring the success of an institution that will continue to play an important role in Cornell and New York City’s tech ecosystem. I can’t think of a better successor than Howard, and I look forward to partnering with him as I continue to serve on the council.”

Morgan, who has served on the Cornell Tech Council since 2021, co-founded First Round Capital in 2004, a firm that has been instrumental in funding early-stage technology startups. He retired in 2017 after leading a career that nurtured over 200 high-tech ventures. He currently chairs B Capital Group in New York, continuing his commitment to fostering innovative technology companies. An alumnus and longtime supporter of Cornell, Morgan has also served on the Cornell University Board of Trustees since 2019.

“Howard’s extensive academic, business, and tech ventures experiences, as well as his longtime support of Cornell Tech and leadership at Cornell University, make him the ideal person to lead the council as chairman at this critical time of growth for our campus,” said Greg Morrisett, Jack and Rilla Neafsey Dean and Vice Provost of Cornell Tech. “I look forward to working with Howard as we launch new initiatives that support our mission to educate tech leaders and contribute to the New York City tech economy. We are also thrilled to welcome Adam Jacobs, another esteemed Cornell alumnus, to the council this year. Adam’s focus on societal impact for businesses and entrepreneurial success make him a perfect fit, and we appreciate the influence and insight he brings to our campus.”

Morgan’s academic tenure includes professorships at the Wharton School and the Moore School at the University of Pennsylvania. He was pivotal in advancing user interface technology and optimizing computer networks. His work with ARPAnet in the 1970s laid the groundwork for modern internet technologies, influencing corporate and government agency communications. Beyond academia, Morgan’s leadership at Renaissance Technologies Corp. and his role in founding Idealab have underscored his influence in the tech industry. His contributions to public and private boards, including Cold Spring Harbor Laboratory, Math for America and the New York Public Library, reflect his dedication to science, education, and service.

“I’m delighted to continue my service to Cornell by serving as the next Chairman of the Cornell Tech Council, especially as we embark on this next phase of critical growth and expansion of the campus,” said Morgan. “I’m looking forward to building upon what David Siegel and other councilors have achieved.”

Adam Jacobs

New Cornell Tech councilor Adam Jacobs graduated from Cornell University in 2009 with a B.S. from the Dyson School of Applied Economics and Management at the Cornell SC Johnson College of Business. Today, as a founding partner of the San Diego-based Jacobs Scheriff Group, Jacobs works to enhance the value of businesses and organizations committed to societal impact. He is also the COO of a biotech startup, an investor in several startups focused on social change, and a real estate entrepreneur. His philanthropic engagements in San Diego include volunteer leadership at the Lawrence Family Jewish Community Center, La Jolla Playhouse, and the Carlsbad Chamber of Commerce. Jacobs is actively involved with the San Diego Food Bank, San Diego Symphony, and the San Diego Zoo Wildlife Alliance, among other organizations.

His family’s legacy at Cornell includes his grandparents Irwin M. Jacobs ’54, BEE ’56, and the late Joan K. Jacobs ’54, who founded the Joan and Irwin Jacobs Technion-Cornell Institute at Cornell Tech with a transformational gift of $133 million in 2013. Irwin and Joan Jacobs also contributed to the education and aspirations of more than 500 Cornell University students through more than 1,300 awards over the past two decades.

“I am honored to join the Cornell Tech Council,” said Adam Jacobs. “As a proud alum of Cornell, I look forward to the opportunity to learn from the distinguished leadership, faculty, colleagues, and most importantly, students; contribute to the mission of creating lasting positive impact on our society; and continue my family’s connection and legacy with this esteemed institution.”

About Howard Morgan

Howard Morgan has more than 30 years of experience with over 200 high-tech entrepreneurial ventures. He is the Chairman of B Capital Group in New York, a venture capital fund that, in its own words, backs brash entrepreneurs building the next generation of groundbreaking technology companies. Dr. Morgan co-founded First Round Capital, a seed-stage venture capital firm; is president of the Arca Group, Inc., which nurtures early-stage companies and takes them from seed stage through initial public offerings; and serves as a Director of Idealab, where he was a founding investor.

Previously, Dr. Morgan served as President of Renaissance Technologies Corp. in New York, where he supervised venture capital investments in high technology companies and was a founding board member and technical advisor of Franklin Electronic Publishers, one of the first manufacturers of personal computers. Dr. Morgan is a respected author and a frequent speaker at major industry conferences and has worked with many Fortune 100 companies and numerous government agencies.

Dr. Morgan also has an illustrious academic career; he was a professor at Cornell University and at the Wharton School and Moore School of the University of Pennsylvania. He has been a Visiting Professor at the California Institute of Technology and the Harvard Business School. Because of his early involvement with the internet, he advised many corporations and government agencies on the uses of electronic and voice mail, implementing them throughout the Wharton School in the mid-1970s.

In addition, he has served on a number of public and private company boards and is a dedicated volunteer, leader and philanthropist. Dr. Morgan has been a member of the Cornell Tech Council since 2022, has been a member of the Cornell Board of Trustees since 2019, and is an advisor to the Jacobs Institute Runway Program at Cornell Tech. He is also a Trustee of the Cold Spring Harbor Laboratory, Math for America, and the New York Public Library.

Dr. Morgan received a Ph.D. in operations research from Cornell University and a B.S. in physics from City College of the City University of New York. Dr. Morgan and his wife, Eleanor Morgan, have made generous contributions to Cornell, including establishing the Howard and Eleanor Morgan Professorship at Cornell Tech and the Eleanor and Howard Morgan Professorship, both part of the School of Operations Research and Information Engineering.

About Adam Jacobs

Adam Jacobs is a founding partner of the San Diego-based Jacobs Scheriff Group, a business consulting company with a mission of enhancing the value of businesses and organizations that have a strong focus on societal impact, and the COO of a biotech startup. A graduate of Cornell University and three-year captain and starting catcher of the Big Red baseball team, Adam received a B.S. in applied economics and management from the Dyson School at the Cornell SC Johnson College of Business in 2009. He is part of a multigenerational Cornell family, also including his grandparents Irwin M. Jacobs ’54, B.E.E. ’56, and the late Joan K. Jacobs ’54, who together founded the Joan and Irwin Jacobs Technion-Cornell Institute at Cornell Tech. The Jacobs Institute advances graduate tech education, academic entrepreneurship, and innovation, and spans the Cornell Tech campus in New York City and the Technion-Israel Institute of Technology in Haifa.

In addition to his role with the Jacobs Scheriff Group, Adam is an investor in several startups focused on social change and a real estate entrepreneur. He is also highly involved in San Diego, enhancing and extending his family’s long tradition of engagement and philanthropy in the city. Adam serves as treasurer and executive board member of the Lawrence Family Jewish Community Center, as co-chair of the La Jolla Playhouse’s Innovation Night, and as past chair of the board for the Carlsbad Chamber of Commerce. He is also involved with the San Diego Food Bank, San Diego Symphony, San Diego Zoo Wildlife Alliance, Jewish Family Service, and other organizations. He actively supports initiatives addressing housing, homelessness, and food insecurity, and is a member of The Giving Pledge Next Gen cohort and a former member of Forward Global.

Adam and his wife Amy, a trustee of the Salk Institute, have established the Amy and Adam Jacobs Family Philanthropic Fund through the Jewish Community Foundation and have a young daughter and son.


In an era where digital threats are ever-evolving, the need for advanced education and research in cybersecurity, trust, and safety is paramount. Cornell Tech’s new Security, Trust, and Safety (SETS) Initiative, a cutting-edge program designed to revolutionize these fields, aims to address these challenges head-on. The director of the SETS program, Google alum Alexios Mantzarlis, brings a wealth of experience and a vision to this critical endeavor.

We spoke with Mantzarlis to hear directly about what drove him to pursue this field, why Cornell Tech is the right institution for this type of initiative and what he hopes SETS will be able to achieve.

Can you tell us a bit about what led you to the fields of cybersecurity, safety, and trust?

I fell into this field by accident. After I launched a fact-checking startup in Italy, I became passionate about access to information and a fact-based public discourse built on transparency and mutual understanding.

I then moved to the United States to start the International Fact-Checking Network, an industry coalition of journalists combating misinformation. It was in that position that I realized the fundamental role our digital spaces played in setting the stage for a healthy information ecosystem. I saw what governments could do, serving as one of the experts in the European Union’s High Level Group on online disinformation and advising the European Commission on scoping the phenomenon of fake news, as well as what platforms could do, helping Facebook launch its Third-Party Fact-Checking Program.

Both of these experiences ultimately led me to join Google, where as Principal of Trust & Safety Intelligence I was the lead analyst for misinformation and generative AI, working on information quality policies across Google products like Search and Gemini.

What makes Cornell Tech uniquely positioned to lead in the security, trust, and safety fields?

Its people. Cornell Tech has an extraordinary set of experts across the privacy, cybersecurity, and digital safety fields – and they care deeply about the role of technology in society. This shows in their research and teaching, but it also shows in some of the projects that have emerged from this campus, such as the Clinic to End Tech Abuse, the Digital Life Initiative and the Public Interest Tech Initiative.

And if all that wasn’t enough, Cornell Tech’s faculty and students are an integral part of a larger ecosystem of researchers and educators, including equally stellar colleagues in Ithaca, N.Y., and Haifa, Israel, that allows for collaboration and knowledge-sharing across disciplines, geographies, and lived experiences.

How do you envision the SETS Initiative impacting the broader field of cybersecurity and technology safety and trust?

A central goal for SETS is to leverage the collective expertise at Cornell Tech and build a new approach that addresses problems across systems security, data privacy, and trust and safety more holistically. The latter field has boomed over the past decade and now employs approximately 100,000 people worldwide, but it remains less systematically a focus of academic inquiry. Such inquiry is often disconnected from security and privacy, and SETS will be a place where we can tackle problems across all three topic areas in a closely integrated way.

What types of research projects or areas of study will SETS focus on initially?

One of the first things we’ll do is use the convening power of Cornell Tech to bring practitioners and researchers together on specific risk areas. To start, we’ll focus on the scourge of synthetic non-consensual intimate imagery, which is easier to generate and proliferate today, and on data center security.

Building on those initial efforts, topics that SETS might address include stalkerware, doxxing and other forms of harassment that exploit cybersecurity or privacy vulnerabilities; anti-abuse features for end-to-end encrypted messaging; the consequences for privacy of new adtech developments; and LLM safety.

How important is interdisciplinary collaboration in the work you plan to do at SETS, and which industry or civic partners do you see Cornell Tech working with?

SETS aims to be interdisciplinary by default – leveraging experts in computer security, digital safety, policy, ethics, law and beyond – to assess the harms it studies. This is one of the things Cornell Tech is uniquely qualified to do as an institution that is inherently interdisciplinary and blends fields in its teaching, research, and programmatic offerings.

We aim to advance these issues that are strongly aligned with public and societal interest alongside partners who seek to inform and spur positive change. On something as complicated as synthetic non-consensual intimate imagery, for example, that might mean a group as diverse as public school educators, platform representatives, and victims. Ultimately, SETS will have succeeded if we have activated communities of practitioners fighting common harm together.

What do you see as the biggest challenges currently facing the technology industry when it comes to online safety and trust, especially with developments like artificial intelligence?

New technologies can end up harming vulnerable populations first and foremost. We’ve seen that with “nudifiers” – AI technology specifically designed to remove clothing from photos and create a fake nude image – making their way insidiously onto school grounds. Generative AI has been used as a kind of productivity tool by bad actors seeking to conduct influence operations and scam individuals and companies out of valuable information and money. Beyond that, we have yet to see the full impact of prompt injections, data poisoning, and interactive 1-to-1 social engineering. But those are coming.

Why is it important for academia and industry to work together to improve security and trust in technology and artificial intelligence?

Academia has the expertise, frameworks, and independence to study security and trust challenges with the public interest front and center. Industry has the infrastructure, resources, and moral obligation to engender systemic impact on our online infrastructures. Interaction between these two sectors is essential. Collaborative efforts can accelerate the development of robust security measures, foster innovation, and ensure that advancements in AI are both ethical and beneficial to society.

How will SETS involve and educate students, the broader Cornell Tech community, and even the general public about security and trust issues?

Cornell Tech already has a strong presence in many of these areas of focus. So first and foremost, we aim to bring it all together under a unified umbrella for those who want to concentrate on these topics. Beyond that, we will bolster it with additional courses and practical modules that we will start experimenting with soon.

When it comes to the general public, we are planning to launch a newsletter that translates and contextualizes research findings, as well as convene fireside chats and other public forums for conversations with leading practitioners and researchers. But we’re only just starting, so watch this space!

What are some of the long-term goals you have for SETS over the next five years?

Our North Star is to catalyze and harness the power of Cornell Tech and to provide world-class education that equips future leaders of this field to guide technological progress with strong ethical frameworks and a deep commitment to secure systems and safe online citizenship. We aim to foster a culture of continuous learning and innovation, ensuring that our graduates and the broader community are well-prepared to tackle emerging challenges and drive positive change in the technology landscape.


With an increased focus on mental health and growing understanding of its complexities, new research led by Cornell Tech Ph.D. candidate Dan Adler finds that there’s no one-size-fits-all for how we experience mental health symptoms in everyday life. Using artificial intelligence, Adler is identifying trends that advance our understanding of the field to make symptom detection and treatment more effective.

New research led by Cornell Tech highlights the complex challenges and opportunities of using artificial intelligence to support mental health tracking and precision medicine. While the study found AI currently unreliable for such tracking, it raised important questions for future research, including the potential for bespoke, tailored solutions for targeted populations, and the challenges that are inherent when attempting to implement broad-stroke diagnoses and solutions to large and diverse groups of people.

Adler’s research paper, published in npj Mental Health Research, looked at how technology, such as smartphone data, can aid in measuring behaviors related to mental health. For instance, smartphones can track GPS data to monitor mobility, which is closely associated with depression symptoms – prior researchers have published papers showing that those who are more mobile throughout the day are less prone to depression symptoms than those who are more sedentary.
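
As a rough illustration of the kind of mobility measure described above, a smartphone’s GPS log can be reduced to a simple daily-distance feature. The sketch below is illustrative only and is not the study’s code; the data schema and column names are assumptions.

    import numpy as np
    import pandas as pd

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance in kilometers between two (lat, lon) points.
        lat1, lon1, lat2, lon2 = map(np.radians, [lat1, lon1, lat2, lon2])
        a = (np.sin((lat2 - lat1) / 2) ** 2
             + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371.0 * np.arcsin(np.sqrt(a))

    def daily_mobility(gps: pd.DataFrame) -> pd.Series:
        # Total distance traveled per day from a GPS log with assumed columns
        # ['timestamp' (datetime), 'lat', 'lon']; lower totals indicate a more
        # sedentary day, the pattern prior work links to depression symptoms.
        gps = gps.sort_values("timestamp")
        step = haversine_km(gps["lat"].shift(), gps["lon"].shift(),
                            gps["lat"], gps["lon"])
        return step.groupby(gps["timestamp"].dt.date).sum()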

Adler’s research also uses AI to find correlations between behaviors and mental health. He explains that while some studies argue for the consistency of such measurements, his team, which includes faculty advisor Tanzeem Choudhury, Professor in Computing and Information Sciences and the Roger and Joelle Burnell Chair in Integrated Health and Technology, focuses on a larger, diverse population. Their research reveals that no single set of behaviors uniformly measures mental health across all individuals, a finding that emphasizes the importance of personalized measurement in mental health care.

Despite the dedication of mental health clinicians who strive to support their patients, Adler points out significant challenges within care, particularly with regard to measurement. Traditionally, mental health diagnoses and assessments rely heavily on self-reported information, clinician observations and collateral information from family and friends. This approach often complicates accurate diagnosis and treatment evaluation.

Mental health measurement is inherently complex and often lacks objective tools for clinicians to utilize because patient progress looks different to everyone. Adler notes the limitations of the historical pursuit of more objective measures, such as biomarkers in the brain, or the smartphone measurements he researched. “Research continues to emphasize that mental health isn’t that simple,” he said, emphasizing that while data-driven methods are promising, mental health remains a deeply personal and subjective experience.

“We used AI tools to find associations between behaviors and mental health, and we found that these tools are not very accurate,” Adler says of the paper. His research suggests conflicting signals in the data, indicating that a one-size-fits-all approach to mental health measurement is ineffective. Instead, Adler advocates for precision medicine and personalized tools, which can tailor care to individual triggers or needs.

For example, his paper shows that high phone use might be associated with depression for older adults, while low phone use might be associated with depression for younger adults, showing that additional context is needed to understand how behavior precisely impacts mental health.
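
To make that context dependence concrete, here is a hedged sketch using synthetic data rather than the study’s: a single pooled model can average away behavior-symptom relationships that run in opposite directions for different age groups, while a model with an age-group interaction recovers them.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000
    age_group = rng.choice(["younger", "older"], size=n)
    phone_use = rng.normal(4.0, 1.5, size=n)  # hours of phone use per day (synthetic)

    # Simulate the sign flip described above: more phone use raises risk for
    # older adults and lowers it for younger adults (illustrative numbers only).
    slope = np.where(age_group == "older", 0.6, -0.6)
    p = 1.0 / (1.0 + np.exp(-(-1.0 + slope * (phone_use - 4.0))))
    depressed = rng.binomial(1, p)

    df = pd.DataFrame({"depressed": depressed, "phone_use": phone_use,
                       "age_group": age_group})

    pooled = smf.logit("depressed ~ phone_use", data=df).fit(disp=False)
    by_group = smf.logit("depressed ~ phone_use * C(age_group)", data=df).fit(disp=False)

    print(pooled.params["phone_use"])   # near zero: the two groups cancel out
    print(by_group.params)              # opposite-signed phone-use effects by group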

Choudhury says that “the promise of wearable sensors and smartphones may lie in their ability to account for differences, track symptoms, and support precision treatment for individualized symptom trajectories.”

Adler’s engineering background and the interdisciplinary environment at Cornell Tech create a unique setting in which solutions can be explored in the context of multiple disciplines and perspectives. His work, influenced by personal experiences with the mental health care system, is driven by a passion to advance technological solutions to these challenges and create a more effective care system for patients and providers alike.

He stresses the importance of real-world impact in academic research, a principle deeply ingrained at Cornell Tech and in Choudhury’s People-Aware Computing group, which focuses on advancing the future of technology-assisted well-being.

For future research, Adler still sees significant potential in using AI to address access to care challenges. For example, Adler mentioned that new large language model tools could bridge gaps in mental health services. However, he cautions against the uncritical adoption of such technologies. Technologists, he argues, must implement guardrails to ensure these systems offer helpful, not harmful, guidance.

Adler envisions a balanced approach to AI in mental health care, where AI serves both as a way to fill known gaps in the health care system and as a way to supplement existing care practices. Adler believes that using AI to handle administrative tasks or summarize information can improve efficiency, but that it’s crucial to evaluate whether these tools genuinely enhance care delivery.


By Kate Blackwood

Using experiments with COVID-19 related queries, Cornell sociology and information science researchers found that in a public health emergency, most people pick out and click on accurate information.

Although higher-ranked results are clicked more often, they are not more trusted, and misinformation does not damage trust in accurate results that appear on the same page. In fact, banners warning about misinformation decrease trust in misinformation somewhat but decrease trust in accurate information even more, according to “Misinformation Does Not Reduce Trust in Accurate Search Results, But Warning Banners May Backfire” published in Scientific Reports on May 14.

Internet users searching for medical advice might be vulnerable to believing, incorrectly, that the rank of the search result indicates authority, said co-author Michael Macy, Distinguished Professor of Arts and Sciences in Sociology and director of the Social Dynamics Laboratory in the College of Arts and Sciences (A&S). “When COVID hit, we thought this problem was worth investigating.”

The relationship between search result rank and misinformation is particularly important during a global pandemic because medical misinformation could be fatal, said Sterling Williams-Ceci ’21, a doctoral student in information science and the paper’s first author.

“Misinformation has been found to be highly ranked in audit studies of health searches, meaning accurate information inevitably gets pushed below it. So we tested whether exposure to highly ranked misinformation taints people’s trust in accurate information on the page, and especially in accurate results when they are ranked below the misinformation,” Williams-Ceci said. “Our study provided hopeful evidence that people do not lose faith in everything else they see in searches when they see misinformation at the very top of the list.”

Mor Naaman, professor of information science at Cornell Tech and the Cornell Ann S. Bowers College of Computing and Information Science, also contributed to the study.

Williams-Ceci designed a series of online experiments to measure how result ranking, the presence of misinformation, and the use of warning banners affect people’s trust in search results related to COVID-19.

The researchers built an online interface that showed participants a search engine results page with a question about COVID-19. The researchers randomized the rank of results that contained accurate information and manipulated whether one of the top three results contained misinformation. Participants were asked to choose one result that they would click, then to rate some of the individual results they had seen on a trustworthiness scale.

The experiments showed that misinformation was highly distrusted in comparison with accurate information, even when shown at or near the top of the results list. In fact, contrary to assumptions in prior work, there was no general relationship between search results’ ranking on the page and how trustworthy people considered them to be.

“Misinformation was rarely clicked and highly distrusted: Only 2.6% of participants who were exposed to inaccurate results clicked on these results,” the researchers wrote.

Further, the presence of misinformation, even when it showed up near the top of the results, did not cause people to distrust the accurate information they had seen below it.

Another experiment introduced warning banners on the search pages. These banners appeared at the top of the page for some participants and warned that unreliable information may be present in the results without identifying what this information said.

Google currently uses banners like these, but few studies have explored how they affect decisions about what information to trust in online searches, Williams-Ceci said.

The researchers found that one of these banners had an unanticipated backfire effect: It significantly decreased people’s trust in accurate results, while failing to decrease their trust in misinformation results to the same degree.

Overall, the results assuage fears that search engines diminish people’s trust in authoritative sources, such as the Centers for Disease Control and Prevention, even if these sources’ information is not at the top of the page, the researchers concluded. Macy said this is among the first studies to show that combatting misinformation with warning banners in search engines has mixed outcomes, potentially harmful to getting accurate results in front of internet users.

“The backfire effect of warning labels is very alarming, and further research is needed to learn more about why the labels backfire and how misinformation can be more effectively combatted, not only on Google but on other platforms as well,” Macy said.

Kate Blackwood is a writer for the Cornell University College of Arts and Sciences.


By Louis DiPietro

Amid the unpredictability and occasional chaos of emergency rooms, a robot has the potential to assist health care workers and support clinical teamwork, Cornell and Michigan State University researchers found.

The research team’s robotic crash cart prototype highlights the potential for robots to assist health care workers in bedside patient care and offers designers a framework to develop and test robots in other unconventional areas.

“When you’re trying to integrate a robot into a new environment, especially a high stakes, time-sensitive environment, you can’t go straight to a fully autonomous system,” said Angelique Taylor, assistant professor in information science at Cornell Tech and the Cornell Ann S. Bowers College of Computing and Information Science. “We first need to understand how a robot can help. What are the mechanisms in which the robot embodiment can be useful?”

Taylor is the lead author of “Towards Collaborative Crash Cart Robots that Support Clinical Teamwork,” which received a best paper honorable mention in the design category at the Association of Computing Machinery (ACM)/Institute of Electrical and Electronics Engineers (IEEE) International Conference on Human-Robot Interaction in March.

The paper builds on Taylor’s ongoing research exploring robotics and team dynamics in unpredictable health care settings, like emergency and operating rooms.

Within the medical field, robotics are used in surgery and other health care operations with clear, standardized procedures. The Cornell-Michigan State team, however, set out to learn how a robot can support health care workers in fluid and sometimes chaotic bedside situations, like resuscitating a patient who has gone into cardiac arrest.

The challenges of deploying robots in such unpredictable environments are immense, said Taylor, who has been researching the use of robotics in bedside care since her days as a doctoral student. For starters, patient rooms are often too small to accommodate a stand-alone robot, and current robotics are not yet robust enough to perceive, let alone assist within, the flurry of activity amid emergency situations. Furthermore, beyond the robot’s technical abilities, there remain critical questions concerning its impact on team dynamics, Taylor said.

But the potential for robotics in medicine is huge, particularly in relieving workloads for health care workers, and the team’s research is a solid step in understanding how robotics can help, Taylor said.

The team developed a robotic version of a crash cart, which is a rolling storage cabinet stocked with medical supplies that health care workers use when making their rounds. The robot is equipped with a camera, automated drawers, and – continuing Cornell Bowers CIS researchers’ practice of “garbatrage” – a repurposed hoverboard for maneuvering around.

Through a collaborative design process, researchers worked with 10 health care workers and learned that a robot could benefit teams during bedside care by providing guidance on medical procedures, offering feedback, and tracking tasks, and by managing medications, equipment, and medical supplies. Participants favored a robot with “shared control,” wherein health care workers maintain their autonomy regarding decision-making, while the robot serves as a kind of safeguard and monitors for any possible mistakes in procedures, researchers found.

“Sometimes, fully autonomous robots aren’t necessary,” said Taylor, who directs the Artificial Intelligence and Robotics Lab (AIRLab) at Cornell Tech. “They can cause more harm than good.”

As with similar human-robot studies she has conducted, Taylor said participants expressed concern over job displacement. But she doesn’t foresee it happening.

“Health care workers are highly skilled,” she said. “These environments can be chaotic, and there are too many technical challenges to consider.”

Paper coauthors are Tauhid Tanjim, a doctoral student in the field of information science at Cornell, and Huajie Cao and Hee Rin Lee, both of Michigan State University.

Louis DiPietro is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.


Intimate partner violence is notoriously underreported and correctly diagnosed at hospitals only around a quarter of the time, but a new method provides a more realistic picture of which groups of women are most affected, even when their cases go unrecorded.

PURPLE, an algorithm developed by researchers at Cornell and the Massachusetts Institute of Technology, estimates how often underreported health conditions occur in different demographic groups. Using hospital data, the researchers showed that PURPLE can better quantify which groups of women are most likely to experience intimate partner violence compared with methods that do not correct for underreporting.

The new method was developed by Divya Shanmugam, formerly a doctoral student at MIT who will join Cornell Tech as a postdoctoral researcher this fall, and Emma Pierson, the Andrew H. and Ann R. Tisch Assistant Professor of computer science at the Jacobs Technion-Cornell Institute at Cornell Tech and in the Cornell Ann S. Bowers College of Computing and Information Science. They describe their approach in “Quantifying Disparities in Intimate Partner Violence: a Machine Learning Method to Correct for Underreporting,” published May 15 in the journal npj Women’s Health.

“Often we care about how commonly a disease occurs in one population versus another, because it can help us target resources to the groups who need it most,” Pierson said. “The challenge is, many diseases are underdiagnosed. Underreporting is intimately bound up with societal inequality, because often it tends to affect groups more if they have worse access to health services.”

Shanmugam became interested in intimate partner violence after Pierson recommended the book “No Visible Bruises: What We Don’t Know About Domestic Violence Can Kill Us” by Rachel Louise Snyder. She realized that the pervasive issue of underreporting was something statistical methods could help address. The result was PURPLE (Positive Unlabeled Relative PrevaLence Estimator), a machine learning technique that estimates the relative prevalence of a condition when the true numbers of affected people in different groups are unknown.

The researchers applied PURPLE to two real-life datasets, one that included 293,297 emergency department visits to a hospital in the Boston area, and a second with 33.1 million emergency department visits to hospitals nationwide. PURPLE used demographic data along with actual diagnoses of intimate partner violence and associated symptoms, like a broken wrist or bruising, which could indicate the condition even when the patient was not actually diagnosed.
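
The general positive-unlabeled idea behind this kind of estimator can be sketched as follows. This is a simplified illustration, not PURPLE’s actual implementation: it assumes a true case is equally likely to receive a recorded diagnosis in every group, so the unknown reporting rate cancels when two groups are compared.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def relative_prevalence(X, diagnosed, group_a, group_b):
        # X: symptom features for each visit; diagnosed: 1 if a diagnosis was
        # recorded, 0 otherwise (an unrecorded visit may still be a true case).
        # group_a, group_b: boolean masks selecting the two groups to compare.
        clf = LogisticRegression(max_iter=1000).fit(X, diagnosed)
        scores = clf.predict_proba(X)[:, 1]  # estimated p(recorded diagnosis | symptoms)
        # Under the equal-reporting assumption, the per-case probability of a
        # true case being recorded cancels in the ratio, leaving an estimate of
        # relative prevalence between the two groups.
        return scores[group_a].mean() / scores[group_b].mean()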

“These broad datasets, describing millions of emergency department visits, can produce relative prevalences that are misleading using only the observed diagnoses,” Shanmugam said. “PURPLE’s adjustments can bring us closer to the truth.”

PURPLE indicated that patients who are nonwhite, not legally married, on Medicaid or who live in lower-income or metropolitan areas are all more likely to experience intimate partner violence. These results match up with previous findings in the literature, demonstrating the plausibility of PURPLE’s results.

The results also show that correcting for underreporting is important to produce accurate estimates. Without this correction, the hospital datasets do not show a straightforward relationship between income level and rates of victimization. But PURPLE clearly shows that rates of violence are higher for women in lower income brackets, a finding that agrees with the literature.

Next, the researchers hope to see PURPLE applied to other often-underreported women’s health issues, such as endometriosis or polycystic ovarian syndrome.

“There’s still a lot more work to be done to measure the extent to which these outcomes are underdiagnosed, and I think PURPLE could be one tool to help answer that question,” Shanmugam said.

The new technique also has potential applications beyond health conditions. PURPLE could be used to reveal the relative prevalence of underreported police misconduct across precincts or the amounts of hate speech directed at different demographic groups.

Kaihua Hou, a doctoral student at the University of California, Berkeley, contributed to the study. Pierson also has an appointment with Weill Cornell Medicine.

Patricia Waldron is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.