On Tuesday, the Senate overwhelmingly approved the Kids Online Safety Act with a strong bipartisan vote of 91-3.

This important legislation aims to protect children from the potential harms of social media, gaming, and other online platforms. In addition, the Senate also passed the Children and Teens’ Online Privacy Protection Act, known as COPPA 2.0, reported Fox 4 KDFW.

“Too many kids see suicide or substance abuse material promoted online. We’ve heard from families who have lost their kids. It’s heartbreaking. We must act to protect kids online. Today, the Senate has passed KOSA and COPPA 2.0 to help make sure this doesn’t happen again,” said Senate Majority Leader Chuck Schumer after the vote.

Here is more of what Fox 4 reported on this win for lawmakers and parents:


What is the KOSA bill?

The Kids Online Safety Act, also known as KOSA, is a bill that sets out requirements to protect minors from online harms.

With its passage, KOSA creates a “duty of care” — a legal term that requires companies to take reasonable steps to prevent harm — for online platforms minors will likely use.

The requirements apply to covered platforms, which are applications or social networks that connect to the internet and are likely to be used by minors. However, the bill exempts internet service providers, email services and educational institutions from the requirements.

Covered platforms must take reasonable measures in the design and operation of products or services used by minors to prevent and mitigate certain harms, such as sexual exploitation, online bullying, the promotion of suicide, eating disorders, substance abuse, and advertisements for illegal products such as narcotics, tobacco, or alcohol.

Additionally, covered platforms must provide minors with certain safeguards, such as settings that restrict access to a minor's personal data. This would limit other users from communicating with children and limit features that "increase, sustain, or extend the use" of the platform — such as autoplay for videos or platform rewards. In general, online platforms would have to default to the safest settings possible for accounts they believe belong to minors.

Also, the platforms must provide parents or guardians with tools to help supervise a minor's use of a platform, such as control of privacy and account settings.

“So many of the harms that young people experience online and on social media are the result of deliberate design choices that these companies make,” said Josh Golin, executive director of Fairplay, a nonprofit working to insulate children from commercialization, marketing and harms from Big Tech.