Nearly a decade ago, Jocelyn Brewer began thinking about social media and other digital technology as a kind of food. Brewer, a psychologist based in Australia, describes the amount of time people spend on websites and social media as "digital calories."
“It’s really about the principles of intuitive eating,” says Brewer, whose special interest is in the psychology of technology and staying human in the digital age. “You’re not addicted to your smartphone. You’re addicted to the connection you’re getting from what you’re doing online. We all are dependent on connection, community and a sense of belonging.”
Brewer used the metaphor to underscore the dangers of digital media consumption during a DGHI Think Global event titled "Social Media: A Double-Edged Sword." She and other panelists discussed the potential for social media to communicate useful, and sometimes lifesaving, information to billions of users, but noted the significant role some platforms have played in spreading misinformation and disinformation.
Jonathan Quick, M.D., an adjunct professor of global health who moderated the panel, emphasized that harnessing the benefits of social media will depend on action from governments, companies and users themselves.
But Philip Napoli, a Duke professor of public policy who studies media regulation, said the mindset around social media governance needs to shift. “If we can figure out a way for some notion of public interest obligations to be baked into how these platforms operate, and not accept the knee-jerk reaction that (any regulation) automatically represents some kind of violation of the First Amendment, there is space to navigate there,” he said.
Watch the full discussion below, or scroll for highlights.
ABOUT THE SPEAKERS
Jonathan Quick (moderator) teaches global health policy, serves on foundation grant advisory boards and mentors students. His current research focuses on market-driven epidemics, from tobacco to opioids to social media. He’s written more than 100 books, chapters and articles during his career.
Zain Jafar is a third-year undergraduate student at Duke University completing the pre-medical track and studying a self-designed major titled Health Equity and US Healthcare Reform. His undergraduate research examines areas such as health disparities and the intersections of technology and public health.
Philip M. Napoli, Ph.D., is the James R. Shepley Professor of Public Policy in the Sanford School of Public Policy, where he is also the director of the DeWitt Wallace Center for Media and Democracy. He has provided testimony, consultation and research to government bodies such as the U.S. Senate and the Australian Communications and Media Authority.
Verner Venegas-Vera, M.D., is a nephrologist at the Mexican Institute of Social Security and Christus Muguerza Faro del Mayab in Mérida, Yucatán. He also serves as an advisor to the Nephrology Social Media Collective and is an active investigator with publications in the field.
Jocelyn Brewer is part of the Cyberpsychology Research Group at the University of Sydney and is a council member for the Centre for Digital Wellbeing. She is also a speaker, educator and media commentator on cyberpsychology, digital wellbeing and mental health.
The benefits and harms of social media
“We’re having benefits such as interactions with peers outside of your network. Twitter is the number one science platform… And if you want to be a blood donor, you can find that on there. We have to teach the next generation how to navigate and read more with objectivity and if we don’t have that, we can have cognitive overload. When you have a lot of information, you can start being aggressive, feel anxiety and depression.”
“Another thing associated with [social media] algorithms is this strong component of addiction. Addiction is the hallmark of a lot of these social media companies’ business models to keep users coming back… You have a for-profit company that operates in a manner to generate the most profit and make the most lucrative decisions. That often means sticking to algorithms and maintaining the addictive nature of these apps.”
“Not one country has ever required these platforms to demonstrate that they can content moderate at an unprecedented scale with content being distributed and circulated. Every government fell victim to the notion that this is technology and technology policy, not media policy, and it was seldom applied in this context.”
The potential to have a global convention on how to use social media
“It’s tricky because how countries can approach free speech can be so different. There’s so many variables across legal regimes and cultural norms that can go into it. It’s couched in how they do not impede the benefits of this technology from reaching everybody… It’s amazing how much our policymakers are susceptible to that line of reasoning.”
“I think there’s a clear line when you look at the human history of this kind of behavior from a small group of people who stand to benefit.”
“If you look at the fathers of the free market, they weren’t envisioning a free-for-all, a winner takes all. It was supposed to benefit everybody, and now it’s only benefitting the rich.”
How can social media companies better regulate content
“During the pandemic, [artificial intelligence] already existed to flag types of content for content warnings or pull it down. If you have the technology to put a content warning on it, then you have the technology to block it at its source. Some of these companies may not be using the full extent of their abilities in order to prevent this misinformation.”
“It comes [back] to this idea of prebunking (NPR described the strategy as providing people with correct information before false information is shared). That has been demonstrated with positive benefits, but it comes with the question of is that an overreach? Being aware of the benefits that we’ve seen from prebunking, I think it’s important we move in that direction and see how that action plays out in the future development of these apps.”
How can people use social media more responsibly
“We have to remember we are the final product for these companies. We have to take care of kids and our communities because they are seeing us as products of attention. One suggestion for the future is we can push... and help the platforms to take down all the misinformation.”
“It’s going to take a lot of education, understanding and promoting on this part of technology and teaching things like media literacy so we know how to seek good quality information from recognized and trusted sources and have that level of healthy critical thinking.”
“From a regulation and policy standpoint, if we can figure out a way for some notion of public interest obligations to be baked into how these platforms operate, and not accept the knee-jerk reaction that (any regulation) automatically represents some kind of violation of the First Amendment, there is space to navigate there.”