A Responsible Tech Future | February 28, 2023

A Deep Dive into Responsible Social Media Models

From in-app regulation to boundaries users can set themselves, there is more than one avenue for enforcing social media safety.

With billions of users worldwide, it is critical to examine social media’s impact on individuals and on society as a whole. Despite making us more connected than ever, it has also given rise to several issues with heavy societal and political implications.

Disinformation, hate speech, data privacy violations, social media addiction, and harm to mental health are just a few of the issues it has exacerbated. Tackling them and enforcing sustainable regulation has become a major challenge because the problem is not only the content: the platforms’ own algorithms foster these issues.

In a global 2022 survey, 84% of respondents said social media has made it easier to manipulate people with false information. Around 70% said the mass spread of misinformation on social media is a major threat, ranking it second only to climate change.

Source: Pew Research Center


A massive reason for the spread of false news on social media is the reward system that platform algorithms incidentally create. Posting inflammatory or sensational news gets engagement, and that automatic payoff subconsciously trains users to share information without pausing to consider its validity.
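To make that incentive concrete, here is a minimal, hypothetical sketch of an engagement-weighted ranker. The post data, weights, and scoring function are invented for illustration; real platform ranking systems are far more complex and not public. The point is simply that when ordering depends only on engagement, sensational content rises regardless of accuracy.

```python
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    shares: int
    comments: int
    likes: int


def engagement_score(post: Post) -> float:
    """Score a post purely by engagement volume; note that nothing here
    checks whether the post is accurate."""
    return 3.0 * post.shares + 2.0 * post.comments + 1.0 * post.likes


posts = [
    Post("Careful, sourced explainer", shares=40, comments=25, likes=300),
    Post("Inflammatory unverified claim", shares=900, comments=600, likes=1200),
]

# The sensational post ranks first because engagement, not accuracy,
# drives the ordering: the "reward system" described above.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>8.1f}  {post.text}")
```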

A major issue is its toll on users’ mental health, particularly in teens and young adults, for reasons ranging from social media addiction to exposure to unrealistic standards of lifestyle and beauty. In 2021, it was reported that Instagram made 32% of teenage girls feel worse about their body image. In 2022, another study found that young users are the most likely to be negatively affected by social media, particularly around puberty and, later, around age 19.

Queer and trans users have had a similar experience: 84% of LGBTQIA+ adults say there are inadequate protections on social media to prevent discrimination, harassment, or disinformation, as reported by GLAAD.

Existing Regulation and Responsible Social Media Practices

The ever-evolving nature of social media and the very structure of the platforms have made it difficult to regulate social media use successfully, on both the user and platform sides. That has created a vast gray area in legal regulation, as the onus currently falls on the platforms to self-govern. But policies such as the General Data Protection Regulation, the Digital Markets Act, and the Data Governance Act, alongside the longer-standing Communications Decency Act, have begun developing a legal framework to guide the future of legislative regulation.

The present focus, however, has shifted toward industry-wide regulation. Social media companies have started developing features within the platforms to incentivize users to build better and more responsible social media habits.

In 2020, Twitter introduced a “read before you retweet” prompt to halt the knee-jerk reaction of retweeting an article without reading it. And in 2022, it introduced its Community Notes feature, which lets users add context to tweets that don’t yet give the complete picture.
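The logic behind a prompt like this can be sketched in a few lines. Everything below is a hypothetical reconstruction, not Twitter’s implementation: we assume the app tracks which article links a user has opened and checks that set before completing a reshare.

```python
def should_prompt_before_reshare(opened_urls: set[str], shared_url: str) -> bool:
    """Return True when the user is resharing an article they never opened."""
    return shared_url not in opened_urls


# Hypothetical session state: links this user has actually opened.
opened = {"https://example.com/article-they-read"}

if should_prompt_before_reshare(opened, "https://example.com/unread-article"):
    print("Headlines don't tell the full story. Want to read the article first?")
```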

In 2019, Pinterest developed its compassionate search prompts for users searching for mental health resources. In early February 2023, TikTok announced it would add a new account enforcement system for users who repeatedly violate its safety policies.

More notably, in 2021, Instagram rolled out a plan to better support users who might be affected by negative body image. One main feature suggests expert help resources to those searching for or engaging with eating disorder-related content. The resources include contact information for local hotlines and even a nudge to DM a friend in that moment.

Instagram users searching for posts on sensitive topics are immediately redirected to help resources, ranging from helplines to self-help guides.
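A rough sketch of how such a redirect might work is below. The keyword list, resource text, and function names are placeholders of our own, not Instagram’s actual terms or code; the idea is simply that sensitive queries are intercepted and support is surfaced before any results.

```python
# Illustrative only: a real system would use far more robust matching.
SENSITIVE_KEYWORDS = {"eating disorder", "self harm"}

HELP_RESOURCE = (
    "You're not alone. Talk to a volunteer or see tips from experts.\n"
    "Local helpline: <hotline for the user's region>"
)


def handle_search(query: str) -> str:
    """Intercept searches on sensitive topics and surface support first."""
    normalized = query.lower()
    if any(keyword in normalized for keyword in SENSITIVE_KEYWORDS):
        return HELP_RESOURCE
    return f"Showing results for: {query}"


print(handle_search("eating disorder recovery"))
```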


Independent Regulation and Resources for Users and Platforms

More companies are adopting features that let users limit or hide specific words in their feeds or on their personal pages and restrict harmful behavior from certain accounts. Platforms also surface how much time a user spends in the app and continue to update their community guidelines with recommendations backed by expert knowledge.
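Word muting, in particular, reduces to a simple filter over the feed. The sketch below is a simplified illustration under our own assumptions (plain-string posts, case-insensitive substring matching), not any platform’s actual feature.

```python
def filter_feed(posts: list[str], muted_words: set[str]) -> list[str]:
    """Hide any post that mentions a muted word (case-insensitive)."""
    return [
        post for post in posts
        if not any(word in post.lower() for word in muted_words)
    ]


feed = ["Election hot take!!!", "Photos from my hike", "Spoilers for the finale"]
print(filter_feed(feed, muted_words={"election", "spoilers"}))
# -> ['Photos from my hike']
```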

But groups independent of the platforms have also created resources to galvanize social media safety.

One resource is GLAAD’s Social Media Safety Index (SMSI), which reports on how well platforms’ current guidelines protect LGBTQIA+ users’ digital safety from harms like deadnaming, misgendering, and bigoted content in their feeds.

In 2022, GLAAD introduced the Platform Scorecard, which uses 12 indicators drawn from industry best practices to measure how well platforms’ community guidelines safeguard the digital safety of LGBTQIA+ users. It covers a broad range of issues, from whether users can share their pronouns on their profiles to the regulation of harmful advertising.
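As a hedged illustration of how twelve indicators could roll up into a single score, here is a toy aggregation. The indicator names, the partial-credit values, and the equal weighting are all our assumptions; GLAAD’s published methodology defines the real criteria and weights.

```python
# Three of a hypothetical twelve indicators, scored 0.0-1.0 each.
indicators = {
    "pronoun field on profiles": 1.0,   # fully met
    "policy against deadnaming": 0.5,   # partially met
    "controls on harmful ads": 0.0,     # not met
    # ...the remaining indicators would follow the same pattern.
}


def scorecard_percentage(scores: dict[str, float]) -> float:
    """Average equally weighted indicator scores into a 0-100 result."""
    return 100.0 * sum(scores.values()) / len(scores)


print(f"Platform score: {scorecard_percentage(indicators):.0f}/100")
```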

Another is the youth advocacy group Log Off Movement, founded in 2020 to spotlight the real harms of social media for young users. Through its website, podcast, blog, and social media accounts, Log Off explores the impact of social media, fosters a supportive community, and pushes for setting healthy boundaries with the medium.

While the organization focuses on youth activism and advocates for humane design features that structurally encourage responsible social media use, it also offers resources like its Digital Challenge, which guides users through cutting down their screen time.

“Young people are in this stage where exploration is necessary… but the tools at the heart of social media, like auto scroll, are not conducive for that. It gives you what it wants you to have.”
— Emma Lembke, Log Off Movement

Future of Responsible Social Media

Ultimately, responsible social media models are nuanced and expansive in their application. As the medium continues to assume new definitions and functions, the route to fostering responsible social media practices is proving to be labyrinthine.

To explore this maze and create a roadmap out, we partnered with Omidyar Network to produce the 2022 Webby Trend Report, It’s Up to Us: A Responsible Tech Future. Through a survey of more than 300 digital industry experts, we dove into the guiding principles shaping current policies and industry standards. We found that the responsibility to create a standard of ethical use of technology is shared across the sector and that a sustained, collaborative effort from different parts of the industry is needed to create system-wide checks and balances.

We came across a wide range of solutions from every corner, including from younger Internet users. Log Off Movement’s push to create safer social media models for young people is one, with its founder Emma Lembke citing specific platform features like auto-scroll as factors that heighten users’ negative relationships with social media.

Some experts suggest limiting the amount of information users encounter to prevent information overload; others suggest integrating AI into in-app regulatory systems to sort through harmful content efficiently. Additional recommendations include fostering digital literacy skills, so users know what kind of data to share (or not share) online, and pushing platforms to publicize their transparency reports to create a greater culture of accountability.
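The AI-assisted moderation idea is often described as a triage pipeline: a model scores content for harm, clear violations are actioned automatically, and the gray area goes to human reviewers. The sketch below uses a stub classifier and made-up thresholds purely to show the shape of that pipeline, not any platform’s real system.

```python
def classify_harm(text: str) -> float:
    """Stand-in for a trained model; returns a harm probability in [0, 1]."""
    return 0.9 if "slur" in text.lower() else 0.1


def triage(text: str, remove_above: float = 0.8, review_above: float = 0.4) -> str:
    """Route content based on the model's harm score."""
    score = classify_harm(text)
    if score >= remove_above:
        return "remove"          # clearly violating: act automatically
    if score >= review_above:
        return "human review"    # uncertain: escalate to a person
    return "allow"


print(triage("post containing a slur"))   # -> remove
print(triage("photos from my weekend"))   # -> allow
```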
