Like many people, when I first heard about the lawsuits against major social media companies over algorithmic addiction and youth mental health, my initial reaction was complicated. Part of me thought about parenting, supervision, personal responsibility and mental health. Those things matter. They always will.
But the more I’ve learned, the more I’ve realized these cases are not as simple as “bad parenting,” “weak kids” or “mental illness.”
And it’s not just about content either. It’s about systems.
These platforms didn’t just create apps — they created algorithmic systems that decide what people see, how often they see it, how long they see it and what comes next. They didn’t just host content — they built the machinery that selects, amplifies and repeatedly delivers it to users through recommendation engines, infinite scroll, notifications and engagement loops.
That matters. Because when a system is designed to pull specific content out of millions of posts and place it directly in front of a user over and over again, that’s not neutral. That’s not passive hosting. That is active distribution.
And when that system is deployed on children, it becomes something much more serious.
Minors don’t have the cognitive development, impulse control or critical filtering capacity to understand what’s happening behind the screen. Most adults don’t either. The average user believes they’re “just scrolling,” not realizing that complex systems are constantly learning their behavior, predicting their reactions and feeding them content designed to keep them engaged.
This isn’t science fiction. It’s how modern platforms are built.
We’re talking about algorithms created by people, optimized by people and designed with specific goals in mind: engagement, retention, time on platform and profit. These systems don’t think, but they do execute human priorities at a massive scale.
And here’s the part that feels important to say clearly: A child cannot meaningfully consent to a system they cannot understand.
Even adults struggle to understand algorithmic recommendation systems. So placing the full burden of responsibility on parents and children alone ignores the reality of the power imbalance between massive tech companies and individual families.
Yes, parents should use controls. Yes, supervision matters. Yes, personal responsibility matters. But it’s not honest to pretend that families are operating on equal footing with corporations that employ behavioral scientists, engineers, psychologists and data scientists to design systems that maximize engagement and habit formation.
These platforms were not designed accidentally. The feedback loops were not accidental. The reinforcement cycles were not accidental. The personalization systems were not accidental. They were built intentionally — to keep users engaged and coming back. So when people dismiss cases like this as simply “parental failure” or “mental weakness,” they miss the larger issue: system design matters.
These cases aren’t about demonizing technology. Social media has benefits — connection, creativity, education and community. But systems engineered for psychological engagement also carry responsibility for the environments they create, especially when those environments shape developing minds.
What these cases really ask is simple: If companies design systems that actively shape attention, behavior and exposure — especially for minors — do they also have a duty of care?
Not just disclaimers.
Not just fine print.
Not just policies written after lawsuits.
But real responsibility for how their systems function in the real world.
Accountability doesn’t come from lawsuits alone. It comes from understanding. When people see how these systems work, how content is pulled, prioritized, reinforced and repeated, the conversation shifts from blaming individuals to examining structures.
This isn’t about hate, fear or outrage. It’s about awareness, responsibility and protection.
We regulate cars for safety. We regulate toys for safety. We regulate food, medicine and consumer products for safety.
Because when a product can cause harm, responsibility doesn’t stop at the user — it includes the designer.
Digital systems should not be the exception, especially when they are embedded in the daily lives of children.
These cases may or may not succeed fully in court. But the conversation they force matters deeply. Because it’s not just about one family — it’s about how we protect kids in a world where invisible systems increasingly shape their reality.
In Pittsburgh and across the country, families are navigating these technologies every day. This legal moment isn’t just about courtroom outcomes. It’s about helping our community understand how these systems work, what their impacts can be and how we protect the well-being of our children and neighbors.
This isn’t a political issue. It’s not about blame. It’s about awareness, responsibility and ensuring that our digital future supports healthy development — not just endless engagement.
Christina Jones is a mother and grandmother from Whitehall.