What Is KOSA and How Would It Affect Data Privacy?
The Kids Online Safety Act, or KOSA, is a bill meant to work in conjunction with COPPA. While COPPA deals mostly with data privacy for minors, KOSA targets the design of the platforms that collect and use that data.
“KOSA legislation is largely focused on platform design,” says Irene Ly, policy counsel for Common Sense Media. “It imposes responsibility on technology companies to design their products and platforms with teen and child safety in mind. It also provides tools for parents to use to keep their children safe.”
Under KOSA, if an online platform’s design or its algorithms harm minors, developers would need to take steps to address that harm.
“KOSA is trying to infuse more responsibility into the design process,” says Haley Hinkle, policy counsel for Fairplay. “It has measures on increased transparency, opening platforms up to research opportunities and adding more parental control tools.” This moves the responsibility for keeping minors safe to the platforms themselves rather than relying on the users to take those steps.
LEARN MORE: How is third-party risk impacting K–12 education?
“Companies are developing features using A/B testing and neuroscience research to drive greater engagement, but minors are especially vulnerable to many of these features being used,” Hinkle says. “KOSA is trying to address this imbalance of power between platforms and minors.”
What Is COPPA and What Is Its Role in Children’s Data Privacy?
The Children’s Online Privacy Protection Act, or COPPA, is a federal law intended to protect the privacy of internet users under the age of 13.
When the original COPPA bill was passed in 1998, the internet was a new technology. Research and studies from that era showed oversight was needed to protect children from predatory data collection and advertising practices. Three important results of this bill are that it:
- Authorized the FTC to issue penalties for noncompliance
- Required organizations providing online services to issue a privacy policy
- Required verifiable parental consent before the collection, use or disclosure of children’s data
COPPA has been revised in the past. In 2013, it broadened the definition of data types covered to add photos, video, audio recordings, cookies and geolocation information. In 2017, its scope expanded to include any service, application or device that connects to the internet.
Though legislators have made efforts to keep COPPA up to date, technology continues to evolve quickly while the legislative process struggles to keep up.
What Are Lawmakers Trying to Amend in COPPA?
One of the biggest problems with COPPA that legislators hope to amend is that it only covers children, not teens.
“The guidelines we have currently protect kids under 13,” says Hinkle. “COPPA does not address teens. We need protection for all minors up to 18. We also want to see strong provisions around targeted advertising.”
Another COPPA guideline that should be updated concerns parental consent.
“Companies quickly discovered easy ways to address compliance needs,” says Ly. “Many simply add an age gate to their websites in the form of a verification step.”
EXPLORE: See how K–12 schools use technology as a guardrail for digital citizenship.
Frequently, this verification step only requires the user to click a button to confirm that they are of age and consent to data collection.
“Child or adult, anyone can click the button, but this meets current compliance needs,” Ly adds. “While we need to extend this opt-in consent requirement on online platforms to teens, we also need to require companies to minimize the amount of data they collect and share on users in the first place.”
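To make the weakness concrete, here is a minimal sketch of what such a click-through age gate might look like. This is a hypothetical illustration only; the function and field names are assumptions for the example and are not drawn from any real platform.

```typescript
// Hypothetical sketch of a click-through age gate of the kind described above.
// Nothing here verifies the user's actual age; it only records a self-attested
// confirmation, which is why this style of compliance is considered weak.

interface AgeGateRecord {
  confirmedOver13: boolean;        // self-attested by a single button click
  consentToDataCollection: boolean;
  recordedAt: string;              // ISO 8601 timestamp
}

// Called when the user clicks the "I am 13 or older" button.
function recordAgeGateClick(confirmed: boolean): AgeGateRecord {
  return {
    confirmedOver13: confirmed,
    consentToDataCollection: confirmed,
    recordedAt: new Date().toISOString(),
  };
}

// Any user, child or adult, can produce a "compliant" record with one click.
const record = recordAgeGateClick(true);
console.log(record);
```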
Concern About Social Media Drives Efforts on COPPA and KOSA
Legislators sharpened their focus on better addressing online safety for minors following the dramatic testimony of Facebook (now Meta) whistleblower Frances Haugen back in October 2021.
“Some of it touched on the company’s knowledge of the harmful effects of its products on children,” Ly says. “This testimony revealed how much they knew, which really drove increased interest in the regulation of social media platforms. KOSA is a direct product of those hearings.”
Then, in December 2021, U.S. Surgeon General Vivek Murthy released a report titled “Protecting Youth Mental Health,” which touched on how social media negatively impacts the mental health of young people.
All of this was going on against the backdrop of COVID-19. “The experience of students learning remotely at home during COVID raised awareness of these issues and made them an even more pressing concern for parents,” says Hinkle.
“Designing programs for minors requires more than following the laws; it requires thinking ethically about minors’ data,” says Linnette Attai, founder of PlayWell. “It is very different to be in the business of creating products for minors compared with creating for adults, so the government needs to set guidelines.”
MORE ON EDTECH: How are schools using the metaverse for education?
How Can Schools Prepare for COPPA and KOSA Legislation?
Though these bills did not pass in 2022, they will likely remain on the near-term agenda for legislators. There is strong bipartisan support for addressing the ongoing problems of securing the safety and privacy of minors online.
In the meantime, here are some things K–12 administrators and IT staff should be thinking about now so they are well positioned when these bills pass.
“COPPA’s updates may change how schools approach consent and how they handle data collection practices,” Hinkle says. “Some schools may need to revisit how they secure parental consent.”