After son’s suicide, Lincoln Park couple push measure for greater scrutiny of social media use

Rose and Rob Bronstein were blindsided by their 15-year-old son Nate’s suicide in early 2022.

The Bronsteins say Nate was a funny, athletic and well-liked kid. What they didn’t know, they said, is that in the weeks leading up to his death, Nate was being harassed by other Latin School of Chicago students on the social media platform Snapchat.

They believe a bill under consideration this spring by Illinois legislators could have saved his life.

The Let Parents Choose Protection Act is also referred to as Sammy’s Law after 16-year-old Sammy Chapman, who died from a fentanyl overdose in his California home last year after taking drugs he found advertised on Snapchat. If passed, the bill would prohibit popular social media platforms such as Snapchat from blocking outside safety software that detects and notifies parents of potential threats, including substance abuse and suicide.

Rose Bronstein said in a recent interview that if she had known about such safety software when giving her kids their first phones, “that’s the first thing I would have done.”

“I did not monitor anything,” she said. “He (Nate) liked looking at sports statistics and followed all the fantasy football leagues and stuff, so he was always like looking at sports things, but intuitively he was such a good kid, like I didn’t worry. But it’s just the fact that he got caught up in this and he was being attacked so harshly.”

The Bronsteins are suing the Latin School and current and former board members and staff for wrongful death, alleging Nate notified the school that he felt the messages about him constituted bullying. According to the lawsuit, students sent messages saying “kill yourself” and spread a “death threat involving smoking Nate’s ashes.” A Latin School representative has said the school acted responsibly and that the allegations in the lawsuit are “incomplete and misleading.”

Third-party safety software such as Bark, which charges users $14 per month to track activity on apps including YouTube, X and Reddit, is programmed to send parents email and text notifications of potential threats their children encounter on some social media platforms. Instagram and Facebook, both owned by parent company Meta, permit the use of third-party safety software for data aside from a user’s direct messages. Other apps popular with young people, including Snapchat, Discord and TikTok, currently block all third-party software access, something that wouldn’t be allowed under Sammy’s Law.

“This would address so many harms,” Rose Bronstein said. “It would be an immediate, life-saving impact.”

Children’s use of social media has been a high-profile issue nationwide in recent years, though deciding how to regulate platforms has proven tricky for lawmakers as they grapple with such issues as data security, free speech and privacy. For those reasons and others, Sammy’s Law has prompted opposition from technology groups as well as the American Civil Liberties Union.

In January, the U.S. Senate Judiciary Committee pressed leaders of popular social media platforms on issues faced by young users of their products, including addiction, eating disorders, bullying and sextortion.

When asked to address affected parents from across the country, Meta CEO Mark Zuckerberg turned to the parents in the gallery, including the Bronsteins, who were silently holding up photos of their dead children, and said, “I’m sorry for everything you have all been through.” He said his company continues to work on “industrywide efforts” to protect children.

Rep. Jennifer Gong-Gershowitz, a Glenview Democrat, is the author of the Sammy’s Law bill in Illinois and said she hopes its passage would pressure the federal government to take similar action.

“What I have seen is states being more successful in getting bills passed, and I think there is a standard that’s being set at the state level that will hopefully be an impetus for more federal action moving forward,” she said in a recent interview.

Versions of Sammy’s Law have also been introduced in the California legislature and in Congress, though neither bill has moved out of committee. Illinois’ version, which currently has 19 sponsors, was unanimously passed by the House Consumer Protection Committee on March 20.

State Attorney General Kwame Raoul, whose office would be responsible for holding platforms accountable under Sammy’s Law, is also backing the bill. In a recent interview, Raoul said “a lot of these platforms have been unable to police themselves,” warranting such government intervention.

Among those opposing the bill are progressive social groups who express concerns that expanded use of third-party safety software could expose LGBTQ+ children, or children seeking reproductive health care, who live in unsupportive households. On April 4, a coalition made up of Equality Illinois, Planned Parenthood Illinois Action, Advocates of Planned Parenthood of the St. Louis Region and Southwest Missouri, and the ACLU of Illinois sent Gong-Gershowitz a letter laying out the issues they have with Illinois’ version of Sammy’s Law.

“We know that connecting via social media is one of the primary ways that young people have access to information, especially when other limitations like book bans are happening across the country,” Equality Illinois Deputy Director Mony Ruiz-Velasco said in a recent interview. “So we are concerned about the chilling effect it’ll have on people, and particularly young people, having access to resources and communities.”

Proponents have called these worries unfounded, saying the bill’s language limits the information third-party safety apps can release to parents to the context of specific threats.

“Parents don’t get access to everything — they’re not reading every text message,” Gong-Gershowitz said in a recent interview. “It’s sort of like sending up a signal flare, right when there is a serious danger so that a parent could intervene at a critical moment.”

The dangers listed in the bill include everything from suicide, substance abuse and harassment to anxiety, depression, academic dishonesty and sharing one’s home address or phone number. Third-party safety software often uses contextual artificial intelligence to decide what content to flag, Gong-Gershowitz said.

To address concerns about protecting sensitive, non-threatening information, Gong-Gershowitz filed an amendment to the bill Wednesday that specifies that “user data solely limited to resources, support or information related to reproductive health, sexual orientation or gender identity shall not constitute a harm” that would trigger a notification from third-party safety software.

The progressive coalition remains opposed to the legislation, saying the amendment fails to address the underlying concern that young users will feel unsafe communicating with others about such topics online.

“How do you account for the fact that somebody’s expressing anxiety, but maybe the reason for the anxiety is because their parents aren’t accepting that they’re LGBTQ?” said Ed Yohnka of the ACLU of Illinois. “Maybe it’s because they’re having trouble with, you know, other friends in school who aren’t, and the parents aren’t supportive. It doesn’t change that harm at the end of the day.”

Technology advocacy groups including the national trade organization Chamber of Progress also stand against the proposed legislation. Policy analyst Hope Ledford testified during a House Consumer Protection Committee hearing last month that such measures could compromise child users’ privacy rights.

“While it’s important to encourage parental involvement to ensure minor safety online, they are not always best suited to control how their child uses an online platform,” Ledford said. “LGBTQ+ youth use online platforms to seek emotional support, search for information about their identities and find entities that accept them when their parents do not.”

Ledford also touted steps social media companies have taken to make their services safer, such as Snapchat’s Family Center, which parents can opt into to receive limited information about a teen’s activity and restrict viewing of some sensitive content.

Snapchat parent company Snap Inc. and TikTok declined to answer questions for this story on their reasoning for prohibiting parents from using third-party safety software.

While waiting to see whether the bill moves forward in the General Assembly, the Bronsteins said they will continue working through the nonprofit they founded after Nate’s death, Buckets Over Bullying, to educate kids and other parents about the dangers of growing up with unfettered access to social media.

“When you see your peer in trouble, you’ve got to get help,” Rose Bronstein said. “You’re actually getting a child out of trouble and not in trouble. … That’s the message that I’m trying to impress, because if somebody would have said ‘people are being mean, things are circulating about Nate Bronstein,’ things could have changed so much. We wouldn’t be here today.”

For 24/7 help, call the free and confidential National Suicide Prevention Lifeline at 1-800-273-8255.

ostevens@chicagotribune.com