Protecting our kids online


The data is clear: the more time kids spend online and on social media, the more their mental health suffers.

In many ways, this is by design. The algorithms tech companies create for their platforms are meant to keep you glued to your feed, conveying visual information that can severely damage self-esteem and increase levels of anxiety and depression.

These findings, confirmed by multiple studies from the National Institutes of Health, show that the harms are worse for young people. Children are especially susceptible to falling into algorithmic rabbit holes because they are more easily swayed by prominent social media influencers looking to sell a certain product or lifestyle.

Earlier this year, we held a roundtable with state Attorney General Letitia James in Floral Park to discuss the impacts of social media use on young people and to call for new legislation to regulate how tech companies present information through their feeds. Social media feeds are designed to be infinite, making it easy for any user to sit and scroll for hours on end through a constant stream of content and information. Parents spoke about the negative effects that extensive social media use has on their children, and pediatricians reported more instances of mental health struggles among their young patients.

One of the bills we are advocating for in the Assembly is the Child Data Protection Act, which would ban the collection and sale of personal data online for users under age 13 without parental consent, and for users ages 13 to 18 without their own consent. The mass collection of personal data has become a lucrative business online, and companies are willing to pay to learn more about their customers and potential customers. This helps them target content and advertising to maximize viewership, which feeds into the addictive nature of endless scrolling that is such a big problem for young people using social media. By placing these restrictions on access to children’s data, we can stop online platforms from precision-targeting their content to maximize the time kids spend online.

Another key piece of legislation is the Stop Addictive Feeds Exploitation for Kids Act, or SAFE. If enacted, the bill would limit algorithms’ capacity to introduce new content from accounts a user isn’t following; users under 18 would see content only from accounts they follow. In addition, children wouldn’t receive notifications from social media platforms between midnight and 6 a.m., and parents would be allowed to block their kids’ access to social media during those hours. Only parental consent would allow these restrictions to be lifted.

The bill would also empower the attorney general to enforce these provisions by imposing a $5,000 fine on companies for each violation. And parents and guardians would be able to sue for damages.

These algorithms should not influence what children see online. It should be up to parents and their children to decide what kind of content they want to see.

These bills provide reasonable protections for kids using social media, without regulating speech or expression. The mental health crisis impacting young people across the country requires thoughtful public-policy solutions, and that includes holding online platforms accountable for pushing addictive content onto young people’s social media feeds at a pivotal time in their emotional and social development.

Parents, doctors and psychologists all agree that there are negative consequences associated with excessive social media use. We should be encouraging responsible use, and working collaboratively with parents and the tech industry to chart a new way of approaching how kids use online platforms. New York can take a leadership role in protecting kids online, which could go a long way toward alleviating the mental health crisis that has been worsening since the pandemic.

Michaelle Solages represents the 22nd Assembly District.