Time for Big Tech to Grow Up

8 minute read | First published: October 13, 2021
An altered version of the Instagram logo with angry eyebrows and sharp teeth

Illustration by: Sk1iddy, age 13

One of the defining moments of the Facebook whistleblower congressional hearing came when Senator Richard Blumenthal read out a text message from a constituent in Connecticut. Blumenthal was visibly emotional as he shared the story of a father whose 14-year-old daughter had gone into a negative spiral on Instagram and developed anorexia. The father’s text message ended “I fear she will never be the same”. The hearing fell silent.

Three years ago, another 14-year-old girl in the UK, Molly Russell, killed herself. Her father claimed her death was ‘helped by Instagram’ and the way the platform’s algorithms pushed graphic self-harm and suicidal material. In the aftermath of his daughter’s suicide, he appealed to the British Government and to social media companies to recognise and respond to the negative influence of social media on young people.

“The physical isolation of pandemic-related lockdowns has only increased young people’s dependency on social media, amplifying problems of mental health and online harms”

In the years between these two fathers telling the stories of their daughters, the physical isolation of pandemic-related lockdowns has only increased young people’s dependency on social media, amplifying problems of mental health and online harms. At the same time, recent internal leaks from Facebook have revealed that 13.5% of teen girls say Instagram makes thoughts of suicide worse, and 17% of teen girls say it makes eating disorders worse.

Parents, siblings, grandparents and educators look on as the ‘tweens’ and teens in their lives struggle through a complex world of harmful content while also trying to figure out who they are and find their way in the world. Striking that balance, protecting young people from harm whilst giving them the space and freedom to grow independently, is the stuff of many parents’ nightmares.

The reason why former Facebook employee Frances Haugen’s whistleblowing could be game-changing is that it proves Facebook knows that its algorithms are disproportionately harmful, not only to young people but also to society and democracy, yet in Haugen’s words, ‘over and over again, [Facebook] has shown it chooses profit over safety. It is subsidizing, it is paying for its profits with our safety.’ What makes the problem unique for younger generations is that Facebook has an agenda to grow as a business, which means staying popular in a competitive market. Future generations are one of their safest market bets, if (and it’s a big if) they can continue attracting young users. When Facebook acquired Instagram in 2012 (for a relatively modest $1 billion), this solved some of their growth problems, but there are always new threats, such as the increasingly popular video-sharing platform TikTok, or the chat app Discord.

“Unfortunately for young users, the content that drives high engagement is often harmful, provoking and tailored to their deepest vulnerabilities”

The solution to market dominance is simple: make Instagram into a place where it’s hard to look away and it’s easy to stay. Unfortunately for young users, the content that drives high engagement is often harmful, provoking and tailored to their deepest vulnerabilities. Haugen showed evidence that confirmed what many had suspected: the algorithm Facebook uses to serve content turns ‘engagement’ (what you look at, for how long and how often) into profits through advertising, even if this engagement is harmful. It is this ‘engagement-driven logic’ that amplifies negative spirals and creates content ‘rabbit warrens’ (the more you look, the deeper you go). As she explained in a recent interview: ‘What’s super tragic is Facebook’s own research says as these young women begin to consume this eating disorder content, they get more and more depressed and it actually makes them use the app more and so they end up in this feedback cycle where they hate their bodies more and more.’ Haugen showed not only that the company knows this through its own research, but that it creates a conflict of interest the company has denied: if Facebook fixes the problem, it will make significantly less money. It has chosen not to, a decision made by a trillion-dollar company that Haugen frames as ‘disastrous for children and for democracy’.

Digital technologies, from social media to computer games, have become central to the way young people learn, connect, grow and explore their identities. Indeed, these technologies also have benefits: they can help some young people avoid isolation, seek support with mental health challenges or escape unhealthy home environments. But the idea that these benefits outweigh the harms, or that we can leave it up to young people to find a different path through a universe of media algorithmically trained to seek them out and pull them in, ignores the insidious nature of the problem. An overly protective response is wrong: taking technology away from young people is not going to make the problems vanish. Instead, we need to find ways to preserve and grow the digital environment that young people treasure while making it safe, inclusive and nurturing. Recent infrastructure failures, such as the blackout that took Facebook and its other products, including Instagram and Messenger, offline for over five hours, also raise important questions about what it means to have such centralised power, knowledge and data.

As societies, we have to start talking about technologies as both problem-solving and problem-making. There is a desperate need to hold some of the wealthiest companies in the world to account for an environment that no amount of educated teachers, attentive parents or even the most disciplined and savvy ‘tweens’ can fix. In standing up to Big Tech, there is no need to start from scratch. Whilst the response to the death of Molly Russell has been slow, it has given researchers and advocates in the UK, such as Baroness Beeban Kidron of 5Rights and Sonia Livingstone of LSE, the chance to push for changes in the form of the Age Appropriate Design Code and the long-awaited Online Safety Bill, on which Haugen will advise in the coming weeks. The mood is changing elsewhere too: TikTok has announced a new ‘bedtime’ feature for 16- and 17-year-olds and is making changes to its direct messaging features for younger users, while in China, Tencent has recently moved to curb computer game addiction amongst its younger users, restricting play for under-18s to the weekends, albeit facing criticism for using facial recognition technology to enforce the age limit. With all eyes on Facebook, it has paused the development of ‘Instagram Kids’ for 10–12-year-olds, in what it claims is an effort to listen more to concerned policymakers and parents in the wake of the recent revelations.

As a non-profit working internationally on these issues with young people, we at Tactical Tech have seen these issues play out first-hand. Our new youth project aims precisely to find out ‘What the Future Wants’, while resources such as Data Detox x Youth, an interactive toolkit for 11- to 16-year-olds, help put young people in control of technologies including social media. We see that young people depend on technology, yet are frustrated with the lack of care that technology companies take for their well-being and mental health. They are not passive users but instead increasingly aware of the problems these technologies create for them, their friends and their younger siblings. The next generation is technology-dependent but also technology-critical. They have led some of the first demonstrations around the world against carelessly devised algorithms. Listening to them is important — but being accountable to them is also essential.

“The next generation is technology-dependent but also technology-critical”

Facebook has abused its position of trust. It has been dancing around the problems it has created in the information and democracy space since Brexit and Trump’s election. The misinformation it has allowed to propagate around the pandemic has turned up the heat on its practices. Despite the seemingly disparate nature of these topics (youth, elections and healthcare), their root cause is the same: the attention-based, amplifying nature of the platform, as outlined in the documentary The Social Dilemma, and the unethical and astronomical profits this logic produces. The evidence Haugen presented, that these practices harm young people’s lives and that the company knows it, creates not only the potential for a breakthrough but also a gateway issue that unites users and regulators across political divides.

Non-profits, educators and parents spend time and effort trying to teach young people how to navigate the digital world. Now it is time to acknowledge: it’s Big Tech that needs to grow up. The moment of tech-enthusiasm has passed and Mark Zuckerberg’s apologies have been stretched over too many acts of neglect. Haugen has given regulators the evidence they need for real change. At the very least, we owe it to the youngest members of society to act.

Stephanie Hankey is the Executive Director and co-founder of the Berlin-based non-profit Tactical Tech, the co-curator of The Glass Room and a Loeb Fellow at the Graduate School of Design, Harvard University. Thanks to Michael Uwemedimo, Daisy Kidd and Sasha Ockenden for comments and additions.