Children’s Online Safety Laws in 2026: COPPA 2.0, KOSA, and What Parents Should Know
The legal framework protecting children online is undergoing its most significant overhaul since the original COPPA was signed in 1998. In 2026, a wave of new federal and state legislation is closing long-standing loopholes, extending protections to teenagers, and forcing tech companies to fundamentally redesign how their platforms interact with young users.
For parents, these laws create new tools and new rights. But they also create new complexity. This guide explains the key legislation, what it means for families, and what parents should do now.
The Original COPPA: Why It Was Not Enough
The Children’s Online Privacy Protection Act, passed in 1998, required websites to obtain parental consent before collecting data from children under 13. It was groundbreaking for its time, but it had two fundamental problems:
- The age 13 cutoff. COPPA treated 13-year-olds like adults, leaving teenagers completely unprotected. This meant that platforms like Instagram, TikTok, and YouTube could freely collect data from and market to 13-, 14-, 15-, and 16-year-olds.
- Self-reported age. COPPA relied on users truthfully entering their birthdate. Every parent knows how that worked out — a generation of 9-year-olds claiming to be 13 to create social media accounts.
COPPA 2.0: Protecting Teens
COPPA 2.0 extends the original law’s protections to cover children and teenagers up to age 16. According to Mayer Brown’s legislative tracker, the key provisions include:
- Age coverage extended to 16: All data collection and privacy protections that previously applied only to under-13s now apply to 13-16 year olds as well.
- Ban on targeted advertising to minors: Companies cannot serve behaviorally targeted ads to anyone under 17. This eliminates the business model that incentivized platforms to keep teens scrolling.
- Dedicated FTC enforcement division: A new division within the Federal Trade Commission focused specifically on children’s online privacy, with expanded investigative and penalty authority.
- Data minimization: Companies must limit the personal data they collect from minors to what is strictly necessary for the service to function.
ITIF’s analysis argues that while the goals are laudable, the implementation needs refinement to avoid unintended consequences, such as restricting teens’ access to beneficial educational content.
KOSA: The Kids Online Safety Act
Running parallel to COPPA 2.0, the Kids Online Safety Act (KOSA) takes a different approach by regulating platform design rather than just data collection:
- Mandatory risk assessments: Companies must evaluate how their platforms affect minors’ mental health and well-being.
- Restricted default settings: Minors’ accounts must have the strictest privacy and safety settings enabled by default. Parents and teens can loosen them, but the starting point must be protective.
- Algorithm disclosure: Companies must explain how their recommendation algorithms work and give parents tools to adjust or disable algorithmic content feeds.
- Parental oversight tools: Platforms must provide parents with meaningful controls over their children’s accounts, including activity monitoring and content filtering.
For practical tools you can implement today, see our online safety for kids guide and cyberbullying prevention resources.
Age Verification: The Core Challenge
All of these laws depend on knowing whether a user is a minor. In February 2026, the FTC issued a policy statement announcing it will not penalize companies that collect personal information solely for the purpose of age verification — effectively giving tech companies a green light to implement age-checking systems.
The App Store Accountability Act goes further, requiring age verification at the account level, parental consent for each app a minor downloads, and a link between each child’s device and a parent or guardian’s account.
State-level action is also accelerating. According to Mayer Brown, four states have enacted age-appropriate design codes — California, Maryland, Nebraska, and Vermont — and in February 2026, Alabama became the fourth state to enact a social media age verification law, joining Utah, Louisiana, and Texas.
What This Means for Parents
New Rights You Have
- Right to know: Companies must disclose what data they collect from your child and how it is used.
- Right to delete: You can request deletion of your child’s data.
- Right to restrict: You can limit how your child’s data is shared with third parties.
- Default protections: Your child’s accounts will start with the strictest settings enabled.
- Algorithm control: You can adjust or disable recommendation algorithms on your child’s accounts.
What Parents Should Do Now
- Review your child’s accounts. Check privacy settings on every platform your child uses. Enable the strictest available options.
- Enable parental controls. Use built-in parental control features on devices and apps. See our video game parenting guide for gaming-specific controls.
- Talk to your kids. Explain why these protections exist and what data collection means. Our digital citizenship guide provides conversation frameworks.
- Monitor without surveilling. The goal is to keep children safe while building trust. Open conversations are more effective than secret monitoring.
- Check age-appropriate alternatives. For younger children, start with platforms designed for kids. See our screen time rules by age for age-appropriate platform recommendations.
What This Means for Kids and Teens
These laws aim to create a safer online environment, but they also affect how kids use the internet:
- Some content may become harder to access. Age gates and parental consent requirements may add friction to accessing platforms and content.
- Recommendations will change. With algorithm restrictions, the “infinite scroll” experience that keeps teens engaged for hours will be curtailed.
- Privacy improves. Less data collection means fewer targeted ads and less personal information floating around the internet.
For older kids interested in understanding the technology behind these systems, see our AI for kids parent’s guide and teaching kids to code guide.
The Bottom Line
2026 marks a watershed moment for children’s online safety. COPPA 2.0 closes the teen protection gap, KOSA forces platforms to prioritize safety by design, and state laws are adding additional layers of protection. For parents, these laws create powerful new tools — but they work best when combined with ongoing conversation, education, and supervision at home.
Sources
- Little Users, Big Rules: Tracking Children’s Privacy Legislation — Mayer Brown — accessed March 26, 2026
- Social Media Companies Scramble to Verify Minors Online — Fortune — accessed March 26, 2026
- FTC Issues COPPA Policy Statement on Age Verification Technologies — FTC — accessed March 26, 2026
- COPPA 2.0 and KIDS Act Need Fixes — ITIF — accessed March 26, 2026