
Cornell Tech alumni startup Bowtie was recently acquired by MINDBODY, a leading technology platform for the wellness industry.

Bowtie is an AI-driven solution for businesses that can automatically book appointments, answer questions and conduct live chats via SMS or the web.

According to a release on MINDBODY’s website, “[Bowtie’s] AI features include, instant text backs for clients who call but are unable to connect with the business, AI messaging over SMS and Webchat for automated bookings and a communication portal to message and track customers across channels.”

The company was founded in 2016 during Startup Studio by Ron Fisher, Johnson Cornell Tech MBA ’16, and Mike Wang and Vivek Sudarsan, both Masters of Computer Science ’16.

“The acquisition makes sense on a lot of different levels, but the defining thread from the beginning was to bring our AI technology to small and medium-sized businesses, the very same market that has made MINDBODY the dominant software solution in the wellness space,” said Ron Fisher, Bowtie CEO and co-founder, in a release on MINDBODY’s website.


Blockchains have been hailed as fair and open, constructed so a single user can’t falsify or alter records because they’re all part of a transparent network.

The reality is not so simple, according to new Cornell Tech research.

Like high-frequency traders on Wall Street, a growing army of bots exploits inefficiencies in decentralized exchanges – places where users buy, sell or trade cryptocurrency independent of a central authority – the study found. The researchers also found that high fees paid to prioritize certain transactions pose a security threat to the entire blockchain.

These practices allow predatory users to anticipate and profit from everyday trades, siphoning millions or possibly billions of dollars a year in cryptocurrency.

“In a traditional system you have a broker or someone you’re trading through, and you trust them, or they’re legally required to do the right thing,” said Philip Daian, Cornell Tech doctoral student in computer science and first author of “Flash Boys 2.0: Frontrunning, Transaction Reordering and Consensus Instability in Decentralized Exchanges,” which was presented at the Cornell Blockchain Conference April 13 at Cornell Tech.

“In these systems, the broker is replaced by the blockchain, which seems like a trusted third party, but in reality there are a lot of different moving parts in the blockchain that can be manipulated,” he said. “So you have to be very careful about what the blockchain is actually giving you.”

To conduct the study, an eight-person team led by Ari Juels, professor of computer science at the Jacobs Technion-Cornell Institute at Cornell Tech and senior author of the paper, spent 18 months tracking trades on six decentralized exchanges. They then recorded when each transaction was reported, and by whom.

The information revealed how bots were exploiting time delays in the system to make trades far faster than human users could, allowing them to use tactics such as frontrunning – making deals based on advance information, which is illegal in many markets. The bots could also change the sequences of their own transactions to make them more profitable, or take advantage of human error.

Blockchains function like a constantly updated database distributed among a network of computers. Smart contracts use blockchain technology to automatically determine the flow of money among parties. Transactions on the blockchain are verified by “miners,” users who solve a series of problems in exchange for payment.

The miners determine the order of transactions on the blockchain, and the researchers found that this authority can also lead to corruption. Miners may accept higher fees to prioritize certain trades, making the entire system vulnerable, or they may even rewrite blockchain history to steal funds already allocated by smart contracts, the study found.
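The fee-based ordering described above is simple to illustrate. Below is a minimal, hypothetical sketch in Python – not code from the paper or from any real blockchain client – showing how a miner that simply sorts pending transactions by fee lets a frontrunning bot jump ahead of an ordinary trader just by bidding a higher fee:

```python
# Hypothetical illustration: a fee-maximizing miner orders pending
# transactions highest-fee-first, so whoever pays more trades first.

def order_by_fee(pending):
    """Return transactions sorted highest-fee-first, as a fee-maximizing miner would."""
    return sorted(pending, key=lambda tx: tx["fee"], reverse=True)

pending = [
    {"sender": "trader", "action": "buy TOKEN", "fee": 10},
    {"sender": "bot",    "action": "buy TOKEN", "fee": 50},  # copies the trade, bids a higher fee
]

block = order_by_fee(pending)
# The bot's copy of the trade executes before the original trader's,
# letting the bot profit from the price movement the trader causes.
print([tx["sender"] for tx in block])  # ['bot', 'trader']
```

In a real system the bot would also have to observe the trader's pending transaction in the public mempool before it is mined, but the ordering mechanism itself is just this: whoever pays the miner more goes first.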

“The miners have a tremendous amount of power,” Daian said. “The blockchain doesn’t get rid of the middleman. It just turns one middleman into 100 middlemen, who you hope are not all being bribed or working against you for their own reasons. In some systems that could be good, but it doesn’t guarantee that your trades are going to be fair.”

Though the researchers studied only decentralized exchanges, which comprise a small but growing share of cryptocurrency trading, they said it’s likely these tactics are also used on centralized exchanges – potentially a billion-dollar issue.

That’s the bad news. But the good news is that many of these practices could be halted by increased security and better design, Daian said.

“If you use a cheap bank vault to store your expensive pile of gold, it will be more attractive for someone to break into it,” he said. “A lot of users are trading on these exchanges and having experiences that are not as good as they could be if the exchanges were designed better.”

Also contributing to the paper: Steven Goldfeder, a postdoctoral researcher at Cornell Tech; Tyler Kell, a research engineer at the Initiative for Cryptocurrencies and Contracts (IC3); Iddo Bentov, a research scientist at Cornell Tech; and researchers from the University of Illinois, Urbana-Champaign; Carnegie Mellon University; and ETH Zurich.

The research was supported in part by the National Science Foundation and IC3.


In an online marketplace like Airbnb, host profiles can mean the difference between a booked room and a vacant one. Too peppy, too long, too many exclamation points? Language is critical in a user’s search for trust and authenticity, crucial factors in any online exchange.

With so much at stake, should Airbnb hosts rely on an algorithm to write their profiles for them?

That depends, according to new research from Cornell and Stanford University. If everyone uses algorithmically generated profiles, users trust them. However, if only some hosts choose to delegate writing responsibilities to artificial intelligence, they are likely to be distrusted. Researchers dubbed this the “replicant effect” – a nod to the movie “Blade Runner.”

“Participants were looking for cues that felt mechanical versus language that felt more human and emotional,” said Maurice Jakesch, a doctoral student in information science at Cornell Tech and lead author of “AI-Mediated Communication: How the Perception that Profile Text was Written by AI Affects Trustworthiness,” which will be presented at the ACM Conference on Human Factors in Computing Systems, May 4-9 in Glasgow, Scotland.

“They had their own theories of what an AI-generated profile would look like,” Jakesch said. “If there were serious spelling mistakes in a profile, they would say it’s more human, whereas if someone’s profile looked disjointed or senseless, they assumed it was AI.”

AI stands to revolutionize natural language technologies and how humans interact with each other. People experience some AI-mediated communication already: Gmail scans the content of our arriving emails and generates suggested, one-click “smart replies,” and a new generation of writing aids not only fixes our spelling errors but polishes our writing style.

“We’re beginning to see the first instances of artificial intelligence operating as a mediator between humans, but it’s a question of: ‘Do people want that?’” Jakesch said. “We might run into a situation where AI-mediated communication is so widespread that it becomes part of how people evaluate what they see online.”

In the study, researchers sought to explore whether users trust algorithmically optimized or generated representations, particularly in online marketplaces. The team conducted three experiments, enlisting hundreds of participants on Amazon Mechanical Turk to evaluate real, human-generated Airbnb profiles. In some cases, participants were led to believe that some or all of the profiles were generated through an automated AI system. Participants were then asked to give each profile a trustworthiness score.

When told they were viewing either all human-generated or all AI-generated profiles, participants didn’t seem to trust one more than the other. They rated the human- and AI-generated profiles about the same.

But that changed when participants were informed they were viewing a mixed set of profiles. Left to decide whether the profiles they were reading were written by a human or an algorithm, users distrusted the ones they believed to be machine-generated.

“The more participants believed a profile was AI-generated, the less they tended to trust the host, even though the profiles they rated were written by the actual hosts,” the authors wrote.

What’s so bad about using AI-mediated communication? As one study participant put it, AI-generated profiles “can be handy but also a bit lazy. Which makes me question what else they’ll be lazy about.”

As AI becomes more commonplace and powerful, foundational guidelines, ethics and practice become vital, the researchers said. Their findings suggest there are ways to design AI communication tools that improve trust for human users. For starters, Jakesch said, companies could add an emblem on all text produced by AI, as some media outlets already use on algorithm-created content. Design and policy guidelines and norms for using AI-mediated communication are worth exploring now, he added.

“The value of these technologies will depend on how they are designed, how they are communicated to people and whether we’re able to establish a healthy ecosystem around them,” he said. “It comes with a range of new ethical questions: Is it acceptable to have my algorithmic assistant generate a more eloquent version of me? Is it okay for an algorithm to tell my kids goodnight on my behalf? These are questions that need to be discussed.”

The paper was co-authored with Mor Naaman, associate professor of information science at the Jacobs Technion-Cornell Institute at Cornell Tech; Xiao Ma, a doctoral student in information science at Cornell Tech; and Megan French and Jeffrey T. Hancock of Stanford.


Spanning the departments of Computer Science, Information Science, and Communication, and bridging campuses in Ithaca, N.Y., and New York City, scholars and innovators at Cornell University are guiding the field of human-computer interaction. Each year, our rundown of Cornell papers at the annual CHI Conference on Human Factors in Computing Systems illustrates Cornell’s influence, and 2019 is no different.

Cornell Tech and Jacobs Technion-Cornell Institute faculty and researchers contributed nine out of nineteen papers accepted to the conference from Cornell:

AI-Mediated Communication: How the Perception that Profile Text was Written by AI Affects Trustworthiness
Authors: Maurice Jakesch (Cornell Tech), Megan French (Stanford), Xiao Ma (Cornell Tech), Jeff Hancock (Stanford), and Mor Naaman (Cornell Tech/Jacobs Institute)

Cultivating Care Through Ambiguity: Lessons from a Service Learning Course
Authors: Samar Sabie and Tapan Parikh (both from Cornell Tech)

Designing Interactive 3D Printed Models with Teachers of the Visually Impaired
Authors: Lei Shi (Cornell Tech), Holly Lawson (Portland State), Zhuohao Zhang (Zhejiang University), and Shiri Azenkot (Cornell Tech/Jacobs Institute)

Engaging High School Students in Cameroon with Exam Practice Quizzes via SMS and WhatsApp
Authors: Anthony Poon, François Guimbretière, Sarah Giroux, Nicola Dell, Parfait Eloundou-Enyegue (all of Cornell University and Cornell Tech)

“Everyone Has Some Personal Stuff”: Designing to Support Digital Privacy with Shared Mobile Phone Use in Bangladesh 
Authors: Syed Ishtiaque Ahmed (Info Sci PhD ’17, now at the University of Toronto), Md. Romael Haque (Marquette), Irtaza Haider (Georgia Tech), Jay Chen (NYU), and Nicola Dell (Cornell Tech/Jacobs Institute)

Face and Ecological Validity in Simulations: Lessons from Search-and-Rescue HRI
Authors: Lorin Dole (Google) and Wendy Ju (Cornell Tech/Jacobs Institute)

Is Now A Good Time? An Empirical Study of Vehicle-Driver Communication Timing
Authors: Robert Semmens (NYU), Nikolas Martelaro (Accenture), Pushyami Kaveti (Northeastern), Simon Stent (Cambridge), and Wendy Ju (Cornell Tech/Jacobs Institute)

Unintended Consonances: Methods to Understand Robot Motor Sound Perception
Authors: Dylan Moore (Stanford), Paula Varela (Nofima), Tobias Dahl (SINTEF), Wendy Ju (Cornell Tech/Jacobs Institute), Tormod Naes (Nofima), and Ingunn Berget (Nofima)

When Do People Trust Their Social Groups? 
Authors: Xiao Ma (Cornell Tech/Facebook), Justin Cheng (Facebook), Shankar Iyer (Facebook), and Mor Naaman (Cornell Tech/Jacobs Institute)

Read the full list of Cornell University papers at CHI.


A new fellowship funded by Don Follett ’52 and Mibs Follett ’51 aims to encourage Cornell Engineering graduates to pursue master’s degrees at Cornell Tech, boosting the pipeline of students and cementing connections between the two campuses.

The Follett Family Fellowship will help Cornell Tech recruit a talented, diverse and high-performing pool of students. In turn, it will aid Cornell in retaining top students and, through Cornell Tech, fostering a robust tech ecosystem in New York City.

“In today’s world, technology is just absolutely exploding, and using technology properly is going to be a huge difference-maker in the world and in our country, so educating these students is very important,” said Follett, who retired as CEO of Follett Corp. in 1994. “To be in New York City, which is a very thriving place to be, and to have the education from Ithaca is just a win-win.”

The $2.5 million endowment will provide fellowships to Cornell Engineering undergraduate alumni enrolled in Master of Engineering programs at Cornell Tech, beginning in 2020. Once fully operational, it will support three to five students a year.

Follett, who received a bachelor’s degree in mechanical engineering, attended Cornell with the help of the GI Bill, a New York state scholarship for veterans and teaching work. “I got a fabulous education, and it didn’t cost me anything,” he said.

Now, his family’s gift will help defray costs for a new generation of engineers and entrepreneurs.

Encouraging top students who are already familiar with Cornell’s culture to attend Cornell Tech will also help advance President Martha E. Pollack’s vision for “One Cornell,” in which the two campuses complement and strengthen each other.

“The Follett Family Fellowship brings the campuses of Cornell closer together by supporting students who pursue graduate studies in engineering and computer science at Cornell Tech after their undergraduate studies on the main Ithaca campus,” said Dan Huttenlocher, the Jack and Rilla Neafsey Dean and Vice Provost of Cornell Tech.

Cornell Tech’s admissions team will promote the program at informational sessions targeting Cornell Engineering juniors considering master’s degrees. Cornell Tech will also partner with departments including Operations Research and Information Engineering; Computer Science; and Electrical and Computer Engineering to help identify potential Follett Family fellows, though Cornell Engineering graduates from any department would qualify.

“This generous gift from Don and Mibs will allow students to take full advantage of everything Cornell has to offer,” said Lance Collins, the Joseph Silbert Dean of Engineering. “They can get a world-class undergraduate engineering education in Ithaca and then head to Roosevelt Island, where their master’s studies at Cornell Tech will immerse them in New York City’s thriving tech environment.”


Good Code is a weekly podcast about ethics in our digital world. We look at ways in which our increasingly digital societies could go terribly wrong, and speak with those trying to prevent that. Each week, host Chine Labbé engages with a different expert on the ethical dilemmas raised by our ever-more pervasive digital technologies. Good Code is a dynamic collaboration between the Digital Life Initiative at Cornell Tech and journalist Chine Labbé.

Follow @goodcodepodcast on Twitter, Facebook, and Instagram.

On this episode:

Estrin has long studied ways to use our digital traces to improve our health. Now, she is also exploring ways to build systems with a finer granularity of control, in which our data are used for one specific goal only. Sounds pretty normal, right? Well, it isn’t!

Estrin explains why small data can be very helpful to complement clinical care for pain, depression, and chronic diseases.
She tells us that digital technologies could help us curb our very dependence on them, and she says that she no longer has “the arrogance of optimism” when it comes to our technological future.

You can listen to this episode on iTunes, Spotify, SoundCloud, Stitcher, Google Play, TuneIn, YouTube, and on all of your favorite podcast platforms.

We talked about:

  • In this episode, Deborah Estrin mentions her work as director of the Small Data Lab at Cornell Tech. Read about their projects here.
  • For Estrin, the biggest digital divide we face nowadays in the US is age. On this page, you can read all articles by the Pew Research Center on the digital divide throughout the world.
  • Cellphones are pretty pervasive in the US, Estrin says. “Roughly three-quarters of Americans (77%) now own a smartphone,” according to this January 2017 Pew Research Center survey.
  • In this episode, we also talk about our dependence on tech. “Has dopamine got us hooked on tech?” Read about it in The Guardian.
  • Can tech help us curb our dependence on digital technology? Estrin believes so! Some researchers in her lab have designed digital nudges to help combat digital overuse. Read their academic paper.
  • We also talk about data obfuscation and wonder if people who resort to such techniques could be left out in a world where health is greatly aided by digital traces. Data obfuscation can be used as a way to protest online surveillance: instead of trying to control your digital traces, you produce more data in order to confuse the trackers as to what your real interests, fears and feelings are. Philosopher Helen Nissenbaum, our guest in episode 2, wrote a book about it with Finn Brunton.
  • Estrin was named a MacArthur Foundation fellow in 2018, along with 24 other “extraordinary creative people”. See the whole class of 2018. And read why the Foundation does not like the term “genius grant” often used by the press. “We avoid using the term ‘genius’ to describe MacArthur Fellows because it connotes a singular characteristic of intellectual prowess”, they explain.

Read more:

  • Here is a simple overview of a new field called “digital phenotyping.”
  • Since 2017, Facebook has been using AI to work on suicide prevention. Their tool identifies suicidal posts before they are flagged by a human. In September 2018, the company said they had worked with first responders on “over 1,000 wellness checks.” Read about it in TechCrunch.
  • This surgeon has created expression-sensing glasses to help patients with facial paralysis. The glasses are now being used in a study of people with Parkinson’s disease, and he hopes to build a “digital phenotype” of Parkinson’s patients. Could these glasses help monitor mental health in the future?


On this episode:

From small projects within existing cities to entire neighborhoods built from the Internet up, our urban lives are getting smarter. Smart cities are attractive – to us, but also to global capital. Should we be skeptical, and resist the smart city movement?

We ask Ellen Goodman about the texture of life in the smart city of the future, and whether or not serendipity will be a thing of the past.
She tells us why she fears smart cities could lead to a loss of democratic accountability, and a concentration of power. And she warns that we should be vigilant about the theory of the “good” that is driving our urban innovations.

You can listen to this episode on iTunes, Spotify, SoundCloud, Stitcher, Google Play, TuneIn, YouTube, and on all of your favorite podcast platforms.



During a unique, one-day sprint held in Tel Aviv earlier this year, a group of Cornell Tech master’s students built an app that the Tel Aviv Municipality wants to develop and implement for use throughout the city.

The new augmented reality app, AiR, will incentivize Tel Avivians to adopt environmentally friendly habits by performing specific tasks to accumulate rewards – such as free coffee and other similar perks – that can be redeemed at local businesses.

The ideation sprint, sponsored by MindState, was part of Cornell Tech’s annual iTrek program and brought together top-tier Israeli creative talent from companies including ironSource, Wix, and Waze. Company representatives worked alongside the students from Cornell Tech – home of the Jacobs Technion-Cornell Institute.

iTrek is a 10-day trip during the January academic recess that gives students of varying backgrounds the opportunity to learn about Israel’s growing innovation economy, with a focus on its established startup ecosystem.

At the event, the 60 multi-disciplinary students and accompanying creative professionals were divided into groups to explore the social and cultural fabric of Tel Aviv’s urban environment using data, technology, design, and business strategy. Each team tackled one theme of a five-pronged brief: Talk with the City, Move Around the City, Chill in the City, Breathe in the City, and Die in the City.

The hard work and collaborative efforts of those involved in the event resulted in the creation of the three winning apps: Flock, AiR, and Capsule.

The first-place winner was Flock, one group’s response to the prompt “Chill in the City.” Flock is an app that creates custom, real-time experiences personalized to the user’s preferences and attributes. The application curates events at a location of the user’s choice and clusters others with matching preferences, ensuring no one ever misses out on a social experience in their city.

“Throughout the process I was constantly surprised by the multi-disciplinary team’s ambition to surpass our initial goals and resilience against challenges in building a full-fledged prototype and strong business plan,” said Flock team member Ishan Virk, Technion-Cornell Dual Master’s Degrees in Health Tech ’21.

Though it placed second, AiR was the only creation from the MindState challenge to be picked up by the Tel Aviv Municipality. Its group created the app from the prompt “Breathe in the City” and comprised Alana Lipson, Shaine Leibowitz, Mikaela Brown, Mark O’Looney, Miakel Williams, Danit Acker, Yossi Abodi, Yamit Haddad, and Samuel Regev.

“I am excited about developing the AiR app because it can make a positive impact on the city of Tel Aviv and improve the lives of our citizens,” said Gidi Schmerling, Tel Aviv Municipality’s Director of Communications. “Tel Avivians are forward thinking people and I am confident they will enjoy trying out this new app.”

Capsule, the third-place winner, was created in response to the prompt “Die in the City.” This application allows families to store and share memories – including location-based memories, such as the local Tel Aviv bench where grandparents had their first kiss – from one generation to the next. Using GPS, Capsule notifies the user when he/she is near a meaningful place.

MindState staged the event to explore how changing mindsets can lead to global change. Its methodology was to bring together creative students from different cultural backgrounds and disciplines to collaborate in short sprints – solving challenges, developing thoughts and ideas, and ultimately bringing about change.


The Milstein Program in Technology and Humanity, which offers selected undergraduates in the College of Arts and Sciences a specialized curriculum to prepare them as leaders in an increasingly digital world, was celebrated April 12 at a ribbon-cutting at Cornell Tech.

The event marked the realization of a distinctive vision: to combine Cornell’s renowned undergraduate liberal arts education with opportunities at Cornell Tech. The program seeks to create and nurture a new generation of innovators uniquely conversant in the ways technology is transforming the world, as well as in the social and ethical implications of this change.

“There are virtually no problems that are purely technical,” said Cornell President Martha E. Pollack. “They are nearly all socio-technical, and indeed, humano-socio-technical. That’s what’s so exciting about the Milstein program: it prepares students to understand both the technical and the human aspects of new technologies. We’re so grateful to the Milstein family for their generosity in making the program possible, and delighted that they’ve chosen to support such a groundbreaking opportunity for our undergraduates.”

Bridging the Ithaca and New York City campuses, the Milstein program will provide 100 highly qualified students each year with access to a robust network of thinkers and leaders in business, technology and law. The program was created in 2017 with a $20 million gift from Howard Milstein ’73, Abby Milstein and Michael Milstein ’11.

The celebration took place as the inaugural cohort of 14 first-year students in the College of Arts and Sciences made its first trip to Cornell Tech. There, the undergraduates observed Studio Sprint, a monthly 24-hour period in which Cornell Tech students have no classes so they can focus on their product and business development work – a fitting introduction to Cornell Tech’s entrepreneurial spirit.

Students can apply to enter the Milstein program as first-year students or begin in their sophomore year, with a total of 25 students per class eventually enrolled. They will pursue traditional majors in the College of Arts and Sciences in addition to a multidisciplinary curriculum providing them with proficiency in computer science. All students will also spend two summers at Cornell Tech, taking an array of technical courses and participating in experiential, industry-focused opportunities.

“Students in the Milstein program are at the forefront of studying the interplay between digital technology and the human condition, the understanding of which is becoming so important to modern society,” said Dan Huttenlocher, the Jack and Rilla Neafsey Dean and Vice Provost of Cornell Tech.

“We are already seeing how the Milstein program is impacting our students and faculty more broadly here in Ithaca,” said Ray Jayawardhana, the Harold Tanner Dean of Arts and Sciences. “The program is attracting not only extraordinary students, but also remarkable collaborators, like Oskar Eustis, a leading light of the New York theater scene, and Mitchell Baker, executive chairwoman of Mozilla, who is working at the nexus of ethics and technology.”

The Milsteins partnered in helping to create the program’s vision in collaboration with Huttenlocher and Gretchen Ritter, professor of government and former dean of the College of Arts and Sciences. Amy Villarejo, the Frederic J. Whiton Professor of Humanities, is faculty director of the Milstein program in Ithaca. Tapan Parikh, associate professor of information science at Cornell Tech, is the faculty director at Cornell Tech.

In addition to the students’ visit, Cornell Tech hosted the first meeting of the new Milstein Program Advisory Council on April 12. Chaired by Michael Milstein, who is partner at New York Private Bank & Trust and Milstein Properties, CEO of Boylan Bottling Co. and co-founder and chairman of Grand Central Tech, the group’s members include: Chad Dickerson, former CEO of Etsy, current CEO coach at Reboot and a Cornell Tech fellow; Ulfar Erlingsson, Ph.D. ’04, head of security research at Google Brain; Oskar Eustis, artistic director of the Public Theater in New York City; Peggy Koenig ’78, chair of Abry Partners and a Cornell trustee and Cornell Tech overseer; and Josh Wolfe ’99, co-founder of Lux Capital.

“I think being part of the Milstein program is recognizing the power and potential of technology, and alongside that being able to create space for yourself with your own personal interests,” said Catie Rencricca ’22. “Everything I wanted out of the program, especially being able to find a community of people interested in similar things to me, has definitely exceeded my expectations.”



On this episode:

There is already a lot of autonomy in modern wars, but a human is still always involved in determining what the targets are. How would war change if humans were taken out of the loop altogether?

We ask Peter Asaro about the humanitarian and moral questions raised by autonomous weapons, and we talk about the potential for technical glitches.

He also tells us about the ongoing talks at the UN on these weapons, and whether he’s optimistic about the possibility of reaching an international ban.

You can listen to this episode on iTunes, Spotify, SoundCloud, Stitcher, Google Play, TuneIn, YouTube, and on all of your favorite podcast platforms.
