Controlling Kids’ Consumption
Social media apps such as Facebook and Instagram, both owned by the conglomerate Meta, allow people to connect with friends and share entertaining content. However, these apps can also expose users to triggering and violent images and videos. It is fairly common for users’ feeds to be full of alarming footage, including women being harassed and assaulted, bombings in Gaza and hate crimes committed against vulnerable groups. Shockingly, 60% of teenagers have seen acts of real-world violence, such as war crimes and the abuse of firearms, through social media (The Youth Endowment Fund). This horrific content is especially present on Reels, Instagram’s video-sharing feature, as many creators are enticed by the possibility of gaining views and consequently earning money through Instagram’s Creator Fund. Many psychologists believe that one reason so much violent content thrived on social media before Meta’s censorship is that teenagers grew desensitized to alarming topics after seeing them all over the news during the COVID-19 pandemic (National Institutes of Health). Sophomore Aniyah Crumble believes that witnessing violence on social media can worsen teenagers’ mental health.
“With so much violent and dangerous content spread on social media, it only makes sense that teenagers would feel discouraged by the state of the world,” Crumble said. “On social media it is so easy to get trapped in a bubble of bad news and horrific events happening throughout the world. [Therefore], it is good that some social media networks are working to limit this content.”
In an effort to reduce the spread of such ghastly videos, Meta announced on Jan. 9 that it will begin blocking minors from seeing triggering posts. Meta was pushed to implement this censorship after 33 states, including California, sued the company for endangering children. Continuing this effort, a bipartisan group of 42 attorneys general from across the country stated that Meta’s products violate SB 287 and are key contributors to mental health problems among teenagers. SB 287 states that platforms cannot purposefully design algorithms so that children witness content about controlled substances, firearms or violence.
However, the restricted posts will not involve only explicit violence; Meta aims to censor a slew of topics, including substance abuse, self-harm and content that promotes eating disorders, such as the calorie-counting videos that frequently appear on Instagram (CNBC). Although eating disorders have existed for centuries, a spike has recently occurred among teenagers, especially young women, driven by social media content that promotes disordered behaviors, extreme diets and unsafe supplements (Healthline). As a result of this growth, many suggest that uploads on these topics be hidden so users do not fall into the dangerous pattern of disordered eating.
“Instagram is full of filtered and photoshopped photos [of conventionally attractive people],” Roxas said. “When users see these unrealistic and oftentimes fake bodies and faces, they form insecurities, which of course lead to a decline in teens’ mental health. Posts that directly promote eating disorders and unhealthy weight loss add to this as well.”
Even though Meta’s initiative can shield young people from triggering posts and false information and prevent them from being negatively influenced, it can also put them at risk (TIME). If a platform is required by law to filter out harmful posts and media, help may not reach those who need it most. In particular, adolescents may feel isolated if they are unable to communicate or relate to others facing the same struggles, as online communities are a major source of mental health support. Furthermore, if a person resorts to social media to broadcast their cries for help, their posts may be flagged as inappropriate and subsequently censored, which undermines Meta’s original intention of keeping its users safe.
Another aspect of Meta’s initiative is directing users to resources when they search for topics like eating disorders or self-harm. Links to websites and phone numbers will automatically appear as the only search results. These resources include the National Suicide Prevention Lifeline and contact information for mental health professionals. Meta also allows users to report accounts if they feel that someone they follow is struggling mentally. The reported user will then receive a notification with mental health resources. Sophomore Hannah Morgan feels that these filters will improve the quality of knowledge young people gain from social media.
“People under 13 years old should not be on social media, but they should be aware of what happens online,” Morgan said. “Social media can teach people about world events, important causes and differences in life. However, it is still very harmful. When I have kids, I would only let them get social media in high school, when they are more mature and have a better understanding of topics both educational and upsetting from a source other than the internet.”