
Professor of Media Arts and Sciences Deb Roy SM ’95, PhD ’99 will give the keynote, “Constructive Dialogue and Technology,” at the 2022 MIT Alumni Leadership Conference (ALC). Scheduled to take place on campus September 16–17, ALC brings MIT’s powerful cohort of volunteers together to build connections, share insights, and learn more about leadership development. Register now.


Deb Roy SM ’95, PhD ’99 has always loved robots. He grew up building them in his basement in Canada, studied computer engineering at the University of Waterloo, and then focused his MIT dissertation on machine-learning models of human language learning. To this end, he recorded 240,000 hours of audio and video documenting his own infant son’s acquisition of language, work that led to Birth of a Word, a TED Talk with more than 3 million views online.

A faculty member at MIT since 2000, Roy is also cofounder and former CEO of Bluefin Labs, a media analytics company acquired by Twitter. He served as Twitter’s chief media scientist from 2013 to 2017 and was executive director of the MIT Media Lab from 2019 to 2021.

Over time, Roy has dived ever deeper into the ways people use communications technology. Today Roy directs the MIT Center for Constructive Communication (CCC), which develops methods for understanding current social and mass media ecosystems and designs tools for listening and bridging societal divides. He is also cofounder and chair of Cortico, a nonprofit social technology company that partners with CCC to translate research into scaled deployment of communications tools and methods. 

In advance of his talk at ALC, Slice of MIT asked Roy to share some thoughts about his research, constructive communication, and how technology might be employed for social good. 
 
What do social media platforms need to learn from the real world of social interaction?

Our social interactions as humans are deeply shaped by our physical environment and by age-old social practices.

When we come together in person, our perception of each other’s facial expressions and body language creates important feedback loops that regulate how we treat each other—feedback loops that are often missing from social media. In physical spaces, our expectations of privacy shift as we move—for example, from a crowded party or cafeteria to a private living room or office. Social media platforms lack these obvious boundaries, and for commercial reasons often push for more exposure than most of us want. 

Equally important, humans have over time developed tools and methods, shaped by age-old wisdom, for facilitating dialogue, debate, and learning. Today, while people still share stories and experiences, it often seems that we have lost the skills needed to engage in meaningful and constructive dialogue. Siloed in our social media feeds, we fail to experience what it means to listen to others, to be heard, and to learn from each other.

Social media platforms are simply not designed to help people lean into difficult conversations, or to create space for quiet voices to be equally heard. However, technology can be leveraged to help experienced facilitators foster nuanced and experience-based conversations.

What has your work in data analysis and visualization shown about who people are listening to and communicating with online?

Motivated by the rumors surrounding the Boston Marathon bombings, researchers in our lab systematically investigated the spread of false news through social media. Across a large number of fact-checked news stories, we found that false news spread faster and farther than true news on Twitter, perhaps because those stories more often stir emotions of surprise and disgust.

In 2016 we turned our attention to the chatter on Twitter related to the US presidential election. We helped news organizations use social media as a listening channel to complement findings from surveys and focus groups. After the election, we created a visual representation of the social networks of people who were actively talking about election issues. We found an alarming level of fragmentation. Even more alarming, the fragmentation included journalists—the people we rely on to help us understand those outside our own networks. 

Overall, it was clear that the loudest and most polarizing voices dominate social media platforms—often driving toxic polarization and division and leaving little room for less strident voices or nuance.

Now that we know that many people are operating in online echo chambers, what can technologists do to promote more constructive dialogue?

The impact of social media—both positive and negative—demonstrates the great power of the underlying technologies. Technologists can design communication systems that foster and scale age-old practices of constructive dialogue and deep listening. We can design smart tools that provide scaffolding so that anyone can easily learn to facilitate a conversation. Stunning advances in machine learning and natural language processing can be marshaled to create scalable forms of deep listening. The creative explosion in media formats ushered in by modern social media platforms—often shaped by our youth—can be leveraged to share the stories of others across divides. 

The current social media networks have been greatly influenced by the media industry—rooted in a broadcaster-audience model often paired with advertising and designed to spread virally. I envision social dialogue networks grounded in human practices of constructive dialogue and sharing of stories, powered by state-of-the-art AI and digital design. And I believe the best way to proceed is in collaboration with communities, because it is people themselves who must ultimately have agency in how platforms are designed and used.