By Madeline Swanson | Art by Nate Bynum in collaboration with DALL·E 3 and Adobe Firefly
Do you ever scroll the internet or social media and wonder whether that Facebook post you read or that Instagram story you just watched is a bit fishy?
In a world where the digital and real-life realms are increasingly intertwined, ensuring that our online engagements are ethical and constructive matters more than ever. At the University of Michigan, the Center for Social Media Responsibility (CSMR) is using data and research to develop tools that help social media companies operate responsibly.
“Those public responsibilities include giving people a way to connect without harassment, and this includes a responsibility to help the public be informed while also creating avenues for freedom of expression,” said Paul Resnick (BS ’85), director of the CSMR and Michael D. Cohen Collegiate Professor of Information at the School of Information.
Founded in 2018, the CSMR addresses growing concerns over the influence of social media on public discourse and democracy. Taking a multidisciplinary approach, the center draws on expertise from information science, law, public policy, and data analytics to tackle some of the most pressing dangers plaguing the digital landscape today.
At a moment when misinformation spreads quickly and trust in public institutions, from higher education to government, is eroding, citizens’ confidence in the credibility of the media is crucial to making informed decisions. Resnick pointed out that accurate information allows people to ground their opinions and actions in reality rather than in material that exacerbates division and hinders productive dialogue. Without trustworthy media, he said, people risk consuming content that undermines the cohesion a functioning democracy requires.
“Social media platforms are basically the conveners of our public conversations,” Resnick said. “They shape how we talk to each other, how informed we are. We need to hold the platforms accountable for shaping a productive and ethical discourse, and help them to do so.”
From developing tools that help users identify and counter misleading material, to conducting in-depth research on how digital platforms shape public opinion, to educational outreach through its Moderation Matters videos and efforts to reduce polarization, the CSMR is helping hold social media companies accountable and boosting public confidence in sources of vital information.
One tool, the Iffy Quotient, tracks the flow of misinformation on social media by measuring how much attention “iffy” news sites are getting, while the H.O.T. Speech Metric estimates the percentage of “hateful,” “offensive,” and “toxic” comments on various social platforms.
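In rough terms, the Iffy Quotient boils down to a simple share: of the most widely engaged-with URLs on a platform, what fraction point to sites flagged as unreliable? The minimal sketch below illustrates that idea only; the flagged-domain list, the sample links, and the function name are hypothetical stand-ins, not the center’s actual data sources or methodology.

```python
# Sketch of an "Iffy Quotient"-style share: of the most popular URLs,
# what fraction come from domains flagged as unreliable?
# The domain list and sample URLs below are hypothetical.
from urllib.parse import urlparse

IFFY_DOMAINS = {"example-rumors.com", "totally-true-news.net"}  # hypothetical flag list

def iffy_quotient(popular_urls: list[str]) -> float:
    """Return the fraction of popular URLs hosted on flagged domains."""
    if not popular_urls:
        return 0.0
    flagged = sum(
        1 for url in popular_urls
        if urlparse(url).netloc.lower() in IFFY_DOMAINS
    )
    return flagged / len(popular_urls)

# Example: one of four widely shared links comes from a flagged site.
sample = [
    "https://example-rumors.com/shocking-claim",
    "https://www.reuters.com/world/some-report",
    "https://apnews.com/article/abc123",
    "https://example.org/local-news",
]
print(iffy_quotient(sample))  # 0.25
```

The H.O.T. Speech Metric can be pictured the same way, as the share of sampled comments that a classifier labels hateful, offensive, or toxic, though the classification itself is the hard part.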
“You can tell that there’s something going on in the world when one of them goes up. With certain events, like the 2020 election, you get spikes indicating the temperature of the online environment is rising,” Resnick said.
Recently, the center launched GuesSync, a two-player cooperative “empathy” game designed to address political polarization. During the game, players try to guess what fraction of the population supports certain policy positions.
Partnerships are also central to CSMR’s strategy. The center collaborates with major technology companies, government agencies, and nonprofits to influence policy and promote best practices. Its work has gained national recognition for collaborations with Meta (which has financially supported work at CSMR), X (formerly Twitter), and various news organizations to improve transparent sourcing and the reliability of information shared online.
“We have the opportunity to be an innovator in presenting data so that it’s useful to the public,” Resnick said. “We can serve as a trusted source for monitoring the platforms and explaining how these things work.”