Algorithmic Accountability

Free will — the freedom to decide for ourselves the actions we take based on our understanding of right and wrong and the possible consequences — is a value sacred to humanity. But that freedom is quietly being shaped by systems most of us never see.

Algorithmic accountability applies to many domains: an algorithm deciding who gets a loan, who gets flagged by law enforcement, who gets hired. These matter. But the most intimate algorithmic system in most people's lives is simpler and more pervasive than any of those. It's the feed — the endless, personalized stream of content that greets you every time you open your phone or browser.

That feed is not neutral. It is not a window onto the world. It is an engineered environment, optimized at enormous scale, specifically to capture and hold attention. Doomscrolling isn't a willpower problem. It's an engineering problem, and nobody is currently required to answer for what it's doing to our society.

Breaking down the issue

How do algorithms work?

Most people think of a social media feed the way they think of a newspaper — a selection of things happening in the world, maybe tailored to their interests. But a feed works very differently.

A feed is a prediction engine. Every time you open the app, it asks one question: what is this specific person most likely to keep watching? Then it answers that question, serves up the content, watches what you do, and immediately updates its model of you. The world it shows you isn't THE world. It's a mirror that keeps adjusting until you can't look away.
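
That loop can be sketched in a few lines of Python. This is an illustration under strong simplifying assumptions: the toy linear `score` stands in for the large machine-learning models real platforms use, and every name here (the topic profile, `seconds_watched`, the learning rate) is hypothetical.

```python
# Predict -> serve -> observe -> update: the shape of a feed loop.
# All names and the linear scoring model are illustrative assumptions.

def score(post, profile):
    """Predict how long this user will keep watching this post (toy model)."""
    return sum(profile.get(topic, 0.0) for topic in post["topics"])

def next_post(candidate_posts, profile):
    """Serve whatever the model predicts will hold attention longest."""
    return max(candidate_posts, key=lambda p: score(p, profile))

def update_profile(profile, post, seconds_watched, lr=0.1):
    """Feed the observed behavior straight back into the model of the user."""
    for topic in post["topics"]:
        profile[topic] = profile.get(topic, 0.0) + lr * seconds_watched
    return profile
```

Notice what the loop optimizes: not relevance, not accuracy, just predicted watch time. Every second watched makes the next prediction more confident in the same direction.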

What makes this different from anything that came before it is the combination of three things working together:

  • Personal targeting
    Unlike television, radio, or print — which broadcast the same content to everyone — these systems build a unique profile for each user and serve content tailored specifically to keep you watching. Not the average person. You.

  • Real-time behavioral feedback
    Every pause, replay, skip, and scroll is recorded and fed back into the system instantly. The algorithm learns what holds your attention as it holds it, updating its model of you in real time, at no additional cost.

  • Billion-user scale
    Previous persuasion systems — advertising, propaganda, even early internet ads — had real costs per person reached, which limited their reach and their refinement. These systems scale to billions of users essentially for free, which means they can be tested, tuned, and optimized to a degree no prior technology has approached.

The result is the most efficient attention-capture machine ever built. The emotions that keep people watching longest are not calm or neutral ones. They are outrage, fear, anxiety, and tribal solidarity — the feeling that your people are under threat and you need to stay informed.

What’s the impact?

Over 5 billion people worldwide use social media — roughly 62% of the global population.[¹] The average person spends around 2 hours and 21 minutes per day inside these systems.[²] But averages obscure what's actually happening at the extremes. Surveys consistently find a significant portion of users — across age groups and geographies — reporting 6, 8, or more than 10 hours of daily screen time. For those people, the feed is not a feature of their day. It is their day.

The platforms know this, and they have little incentive to change. Meta alone generated $160.6 billion in advertising revenue in 2024 — virtually all of it from feed-based ads.[³] TikTok, YouTube, and others add hundreds of billions more to a global attention economy now valued in the trillions.

The human cost of that model is increasingly well documented. Research consistently links heavy social media use to elevated rates of anxiety, depression, and loneliness — in adults as well as adolescents.[⁴] In 2021, internal Facebook documents leaked to the Wall Street Journal revealed that the company's own researchers had found Instagram was making body image issues worse for one in three teen girls.[⁵]

Researchers have even documented social media’s role in real-world violence in communities across the world, including Myanmar, Brazil, and the United States.[]

This is not an accident of the technology; it’s a consequence of the incentive structure. The same algorithm that keeps everyone watching is the one making the platforms rich. Nobody in that equation is optimizing for human well-being — and until recently, nobody outside it was either.

What’s been attempted

The European Union's Digital Services Act, which entered into force for the largest platforms in August 2023 and became fully applicable in February 2024, represents the most serious regulatory attempt yet to impose algorithmic accountability on these companies.[⁶] It is a beginning. In most of the world, including the United States, the regulatory landscape remains largely untouched.

What we propose

The algorithms behind these documented harms are the product of design choices, and different design choices can produce healthier systems. We call for the following design changes in our tech platforms:

  • The reset button

    Every platform must give users the ability to wipe their algorithmic history and start fresh. One tap. No buried settings. This is the most basic form of user control imaginable, and it costs platforms nothing except the potency of the profile they've built on you without asking.

  • Consensual signals only

    Feed algorithms must be trained exclusively on explicit user actions — likes, saves, shares, follows. Passive behavioral harvesting — how long you hovered, whether you rewatched, how fast you scrolled — must not be used to train recommendation systems. You keep personalization. You lose the exploitation.

  • Transparent standards

    Platforms must be audited against these criteria by independent third parties, with results made public. We are building toward a certification standard — a clear, tiered rating of how much user control each platform genuinely offers, so that users, advertisers, and regulators can see exactly where each platform stands.
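
The first two proposals are simple enough to sketch as code. The event names, the set of explicit actions, and the profile shape below are all assumptions made for illustration; the point is the policy, not any particular platform's API.

```python
# Sketch of "consensual signals only" and "the reset button".
# EXPLICIT_ACTIONS and the event dictionaries are hypothetical names.

EXPLICIT_ACTIONS = {"like", "save", "share", "follow"}

def consented_training_events(events):
    """Drop passive signals (hover time, rewatches, scroll speed) so only
    deliberate user actions ever reach the training pipeline."""
    return [e for e in events if e["action"] in EXPLICIT_ACTIONS]

def reset_profile(profile):
    """The one-tap reset: wipe the learned history, keep the account."""
    profile.clear()
    return profile
```

The asymmetry is the point: a like is something you chose to tell the platform; a 4.2-second hover is something the platform took.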

Together, these changes point toward a feed that works for you — where you can hand-pick an algorithm tuned to morning news, or cricket, or calisthenics, or cooking, and switch between them like playlists. Where your attention is something you choose to give, not something that gets taken.

That world is technically possible right now. The only thing standing between you and it is the current incentive structure — and that is exactly what we are here to change.

References

[¹] DataReportal, Digital 2024: Global Overview Report (January 2024). https://datareportal.com/reports/digital-2024-global-overview-report

[²] Backlinko, Social Media Users & Growth Statistics (citing GWI Q3 2024 data). https://backlinko.com/social-media-users

[³] Meta Platforms Q4 & Full Year 2024 Earnings Report (January 2025). https://mediaincanada.com/2025/01/30/metas-revenues-increased-by-22-in-2024/

[⁴] Hunt, M.G. et al., "No More FOMO: Limiting Social Media Decreases Loneliness and Depression," Journal of Social and Clinical Psychology (2018). https://penntoday.upenn.edu/news/social-media-use-increases-depression-and-loneliness

[⁵] Wells, G., Horwitz, J. & Seetharaman, D., "Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show," The Wall Street Journal (September 14, 2021). https://www.cnbc.com/2021/09/14/facebook-documents-show-how-toxic-instagram-is-for-teens-wsj.html

[⁶] European Commission, Digital Services Act: enforcement timeline (August 2023 / February 2024). https://www.eu-digital-services-act.com/