Kids and Online Privacy: What Parents Should Actually Know

Your child's data is worth more than you think. Not in the abstract sense where privacy advocates warn about surveillance capitalism, but in the concrete sense where an educational app sells reading level data to a data broker, who packages it with behavioral health information scraped from a forum, combines it with location patterns from a game, and sells the profile to an advertiser targeting anxious parents of struggling readers. That transaction happens without your knowledge, often without your consent, and almost always without meaningful oversight.
The mechanism is simple. Apps request permissions. You tap "Allow" because the app won't work otherwise. The app collects everything it can. Some of that data is necessary for functionality. Most of it is not. The excess gets aggregated, anonymized (poorly), and sold. The anonymization fails because combining a few data points re-identifies most people. Your child's unique combination of school, age, interests, and location is a fingerprint.
I have two cats, Gandalf and Radagast. I don't have kids. But I've read enough contracts between schools and edtech vendors to understand how little protection exists in practice. This article explains how children's data moves through the system, what the law actually requires, where the gaps are, and what you can do about it.
How Children's Data Gets Collected
Children interact with technology in three main contexts: apps and games, school platforms, and social media. Each context has different rules, different incentives, and different risks.
Apps and games collect device identifiers (IDFA on iOS, Advertising ID on Android), which allow tracking across apps and websites. They collect location data, often continuously. They request access to contacts, photos, microphone, and camera. Many games aimed at children include in-app purchases and use behavioral psychology techniques to maximize engagement and spending. Free-to-play games are particularly aggressive because the business model depends on converting a small percentage of users into high spenders.
Educational apps collect academic performance data: grades, test scores, reading levels, areas of difficulty. Some collect information about learning disabilities, behavioral issues, and family circumstances. This data is often shared with the school district, but it may also be retained by the app vendor and used for product analytics or sold to third parties under vague terms in the privacy policy.
School platforms include learning management systems, gradebooks, communication tools, and online testing platforms. Schools sign contracts with vendors that govern data handling, but parents rarely see these contracts. FERPA, the federal law governing educational records, allows schools to share student data with contractors providing educational services without parental consent. The definition of "educational services" is broad. Once data leaves the school's control, FERPA's protections weaken.
Some school districts use surveillance tools that monitor student activity on school-issued devices, including web browsing, app usage, and even keystrokes. The stated purpose is usually safety (preventing self-harm, detecting threats), but the tools also generate data about students' interests, mental health, and behavior that persists in databases controlled by private companies.
Social media platforms have age restrictions (usually 13+), but enforcement is minimal. Children lie about their age. Parents create accounts for younger children. Once a child has an account, the platform collects the same data it collects from adults: interactions, interests, location, device information, and behavioral patterns used to build an advertising profile.
TikTok, Instagram, Snapchat, and YouTube all use recommendation algorithms that optimize for engagement. For children, this often means content that is emotionally intense, algorithmically sticky, and sometimes harmful. The platforms know this. Internal research from multiple companies has documented the mental health effects of their products on teenagers. The research rarely changes the product because the engagement metrics drive revenue.
What COPPA Actually Does (and Doesn't Do)
The Children's Online Privacy Protection Act (COPPA) is the primary federal law governing online data collection from children under 13. COPPA requires websites and apps directed at children to obtain verifiable parental consent before collecting personal information.
Personal information under COPPA includes name, address, email, phone number, Social Security number, geolocation data, photos, videos, audio recordings, and persistent identifiers like device IDs that allow tracking across sites and apps.
The law requires operators to:
- Post a privacy policy describing data collection practices
- Obtain verifiable parental consent before collecting data
- Allow parents to review and delete their child's information
- Maintain reasonable security for collected data
- Retain data only as long as necessary
COPPA applies to operators of websites and apps directed at children under 13, and to general-audience sites that have actual knowledge they are collecting data from children under 13.
Here's where it breaks down.
Age verification is minimal. Most apps ask for a birthdate. Children lie. There is no requirement to verify the birthdate against external records. If a child enters a birthdate indicating they are 13 or older, the app treats them as an adult and collects data without restriction.
"Directed at children" is a narrow definition. An app can avoid COPPA by claiming it is directed at a general audience, even if children are a significant portion of the user base. Apps use age gates (asking the user's age at signup) to create plausible deniability. If the user claims to be 13+, the app is not collecting from children under COPPA, even if the claim is false.
Enforcement is limited. The FTC enforces COPPA, but the agency has limited resources and thousands of apps to monitor. Enforcement actions happen, but they are infrequent and usually target high-profile violations. Most violations go undetected or unaddressed.
COPPA does not cover teenagers. Children 13 and older have no special privacy protections under federal law. They are treated as adults for data collection purposes. This is a significant gap because teenagers are heavy users of social media and mobile apps, and their data is extensively collected and monetized.
Some states have passed laws extending protections to teenagers. California's Age-Appropriate Design Code Act requires platforms likely to be accessed by children under 18 to assess and mitigate privacy risks, default to high-privacy settings, and avoid using dark patterns to encourage data sharing. The law has faced legal challenges and its implementation has been delayed, but it represents a model that other states may follow.
The School Data Problem
Schools collect extensive data on students. Grades, attendance, disciplinary records, health information, special education status, free lunch eligibility, and standardized test scores are all part of the educational record governed by FERPA.
FERPA gives parents the right to access their child's educational records, request corrections, and control disclosure to third parties. But FERPA has a broad exception for "school officials with legitimate educational interests," which includes contractors providing services to the school.
When a school signs a contract with an edtech vendor, that vendor typically becomes a "school official" under FERPA, which allows the vendor to access student data without additional parental consent. The contract is supposed to restrict how the vendor uses the data, but parents rarely see these contracts, and enforcement depends on the school district's willingness to audit vendors and terminate contracts for violations.
Some edtech vendors use student data for purposes beyond providing the contracted service. Product analytics, algorithm training, and aggregated data sales are common. The vendor's privacy policy may allow these uses, and the school's contract may not prohibit them. Parents have little visibility into what happens to their child's data once it leaves the school's direct control.
A 2021 investigation by consumer protection researchers found that many popular educational apps shared data with advertising and analytics companies, even when the apps claimed to comply with COPPA and student privacy laws. The data sharing happened through embedded software development kits (SDKs) provided by ad networks, which collected device identifiers, location data, and usage patterns without clear disclosure.
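To make the SDK pathway concrete, here is a hypothetical sketch of the kind of event payload an embedded analytics SDK might assemble and transmit. Every field name is invented for illustration and real SDKs differ, but the categories (persistent device identifier, location, usage patterns) match what such investigations have found being transmitted.

```python
import json
import time
import uuid

# Hypothetical analytics payload; all field names are invented.
# The point: none of this is needed for the app's educational
# function, yet it rides along inside the app's normal traffic.
event = {
    "advertising_id": str(uuid.uuid4()),  # persistent device identifier
    "timestamp": int(time.time()),
    "app": "example-reading-app",
    "event": "session_end",
    "session_seconds": 847,                             # usage pattern
    "coarse_location": {"lat": 33.75, "lon": -84.39},   # location data
    "screen_views": ["lesson_3", "quiz_3", "rewards"],  # behavior
}
payload = json.dumps(event)
print(payload)  # this is what leaves the device
```

Because the SDK is compiled into the app, this collection happens under the app developer's name, whether or not the developer examined what the SDK sends.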
The edtech data pipeline is messier than most parents expect. A reading-level assessment from a third-grade app can be sold to a data broker, combined with social media activity and location data, and used to target ads for SAT prep courses before the child reaches high school. The institutions don't guard the data because they don't control it. The vendors do.
What Data Brokers Know About Your Child
Data brokers are companies that collect, aggregate, and sell personal information. They operate largely outside public view, buying data from apps, websites, retailers, and other sources, then combining it into profiles sold to advertisers, marketers, insurers, employers, and others.
Children's data enters the data broker ecosystem through apps that sell user data, through websites that allow third-party trackers, and through data breaches. Once in the ecosystem, the data is difficult to remove because it gets copied, resold, and merged with data from other sources.
A typical profile might include:
- Name, age, address, school
- Device identifiers and IP addresses
- App usage and screen time patterns
- Location history
- Interests inferred from browsing and app activity
- Social connections
- Purchase history from in-app transactions
- Academic performance data from educational apps
Data brokers claim to anonymize this data by removing direct identifiers like names and addresses, but researchers have repeatedly demonstrated that anonymized data can be re-identified by combining a few data points. Your child's age, school, and zip code are often enough to uniquely identify them. Add interests and location patterns, and re-identification becomes trivial.
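The re-identification claim is easy to demonstrate. Here is a minimal Python sketch over a six-record invented dataset: even with names stripped, most records are unique on just age, zip code, and school (in privacy-research terms, they have k-anonymity of k = 1).

```python
from collections import Counter

# Tiny "anonymized" dataset: names removed, all values invented.
records = [
    {"age": 9,  "zip": "30301", "school": "Oak Elementary"},
    {"age": 9,  "zip": "30301", "school": "Oak Elementary"},
    {"age": 10, "zip": "30301", "school": "Oak Elementary"},
    {"age": 9,  "zip": "30302", "school": "Pine Elementary"},
    {"age": 11, "zip": "30302", "school": "Pine Elementary"},
    {"age": 12, "zip": "30303", "school": "Elm Middle"},
]

def quasi_id(r):
    """Combine quasi-identifiers into a single fingerprint."""
    return (r["age"], r["zip"], r["school"])

counts = Counter(quasi_id(r) for r in records)
# A record is re-identifiable when its quasi-identifier combination
# is unique in the dataset (k-anonymity with k = 1).
unique = [r for r in records if counts[quasi_id(r)] == 1]
unique_fraction = len(unique) / len(records)
print(f"{len(unique)} of {len(records)} records uniquely identified "
      f"({unique_fraction:.0%})")
```

In this toy dataset, four of the six records are already unique on three fields. Real broker profiles hold dozens of fields, so uniqueness, and therefore re-identifiability, only gets easier.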
Some data brokers specialize in marketing to parents based on their children's characteristics. These brokers sell lists of "parents of children with ADHD," "parents of struggling readers," "parents of gifted children," and similar categories. The data comes from educational apps, parenting forums, and behavioral patterns observed across apps and websites.
This targeting is legal. COPPA restricts data collection from children under 13, but it does not restrict data collection about children from other sources, and it does not restrict the sale of that data once collected. Parents have limited ability to see what data brokers have on their children or to request deletion.
What You Can Actually Do
The privacy landscape for children is broken at the structural level. Laws are weak, enforcement is minimal, and the business model of free apps depends on data extraction. You cannot opt out of the system entirely without disconnecting your child from school, social life, and most forms of modern communication. But you can reduce exposure.
Review app permissions. Before installing an app, check what permissions it requests. Deny location, contacts, microphone, and camera access unless the app's core functionality requires it. Most apps request far more permissions than they need. A game does not need access to your contacts. A drawing app does not need your location.
On iOS, go to Settings > Privacy & Security to review and revoke permissions. On Android, go to Settings > Privacy > Permission Manager (the exact path varies by Android version and manufacturer). Spend ten minutes going through installed apps and turning off unnecessary permissions. This is the single most effective action you can take.
Read privacy policies. I know. They are long and written in legal language designed to obscure rather than inform. But skim for these sections: what data is collected, how it is used, whether it is shared with third parties, and how to delete your child's data. If the policy says data is shared with "partners" or "service providers" without naming them, assume the worst.
If an app's privacy policy is not easily accessible or does not clearly explain data practices, do not install the app. Legitimate apps have clear policies. Apps that hide their data practices are hiding them for a reason.
Use family sharing and parental controls. iOS Screen Time and Google Family Link allow you to manage app installations, set screen time limits, and review app usage. These tools are not perfect, but they give you visibility into what your child is doing on their device.
Set up restrictions so your child cannot install apps without approval. This forces a conversation about each new app and gives you an opportunity to review permissions and privacy policies before the app is installed.
Talk to your child's school. Ask what edtech platforms the school uses. Request copies of contracts between the school and vendors. Ask how student data is protected and whether it is shared with third parties. Most schools will not volunteer this information, but many will provide it if you ask.
If your school uses surveillance tools on student devices, ask what data is collected, how long it is retained, who has access, and what happens to students flagged by the system. Some schools have found that these tools generate more false positives than genuine threats, and that the surveillance chills student expression and research on sensitive topics.
Limit social media access. This is contentious. Many parents feel that prohibiting social media isolates their child socially. That may be true, depending on your child's age and social context. But social media platforms are designed to maximize engagement through algorithmic recommendation systems that often promote emotionally intense content. The mental health effects on teenagers are well-documented.
If your child uses social media, set the account to private, disable location sharing, review privacy settings together, and have ongoing conversations about what they share and who can see it. Explain that everything posted online is potentially permanent and public, even if the platform claims otherwise.
Use privacy-focused alternatives. Some apps and platforms prioritize privacy. Signal for messaging. DuckDuckGo for search. Firefox with tracking protection for browsing. These tools are not perfect, but they collect less data than mainstream alternatives.
For younger children, consider using a dedicated device (an old phone or tablet) with only approved apps installed, rather than giving them unrestricted access to a device connected to your accounts and data.
Request data deletion. Under COPPA, parents have the right to request deletion of their child's data from apps and websites directed at children under 13. Under California law (CCPA/CPRA), California residents have the right to request deletion of their data regardless of age. Some other states have similar laws.
Data deletion requests are not always honored, and deleted data is not always fully removed from backups and third-party systems, but making the request creates a record and may result in some data being removed.
Monitor for breaches. Use Have I Been Pwned to check if your child's email address has appeared in known data breaches. If it has, change passwords on affected accounts and enable two-factor authentication where possible.
Many breaches involve children's data. In 2021, a breach at an edtech company exposed personal information of millions of students, including names, birthdates, and school information. The company did not directly notify parents. Most parents learned about the breach from news coverage months later.
The Bigger Picture: What Needs to Change
Individual actions help, but they do not solve the structural problem. The business model of free apps depends on data extraction. Schools depend on edtech vendors that prioritize features over privacy. Social media platforms optimize for engagement over well-being. These are not problems you can solve by reading privacy policies.
Real solutions require:
- Stronger laws. COPPA should be updated to cover teenagers, require meaningful age verification, and impose stricter limits on data collection and sharing. State laws like California's Age-Appropriate Design Code Act should be adopted more broadly.
- Better enforcement. The FTC needs more resources to enforce existing laws and investigate violations. Penalties for COPPA violations should be large enough to change corporate behavior.
- Transparency in school contracts. Parents should have access to contracts between schools and edtech vendors, and schools should be required to conduct privacy impact assessments before adopting new platforms.
- Restrictions on data brokers. Data brokers should be required to allow parents to see what data they hold on children and to delete it on request. The sale of children's data should be prohibited without explicit parental consent.
- Platform accountability. Social media platforms should be required to assess the impact of their algorithms on children and to mitigate harms. Internal research on these effects should be made public.
None of this is happening quickly. In the meantime, you are left managing the problem at the individual level, which is exhausting and insufficient but better than doing nothing.
In Dungeons & Dragons, when your party faces an enemy far above your level, you don't stand and fight. You use the terrain, set traps, and limit exposure. You cannot defeat the surveillance apparatus, but you can make it work harder to track your child. Deny permissions. Read policies. Ask questions. Delete data. It is not a solution, but it is a start.
How This Plays Out in Practice
A concrete example. Your 10-year-old wants to play a popular mobile game. The game is free. It requests permission to access location, contacts, and camera. The privacy policy says data is shared with "advertising partners" and "analytics providers."
You install the game and deny all permissions except storage (required for the game to save progress). The game works. Your child plays for a few weeks. You check the app's data usage in your device settings and see it has sent 200MB of data over the network, far more than the game's content would require. That data is telemetry: gameplay patterns, session length, in-app behavior.
You search for the game's name plus "privacy" and find a news article from six months ago reporting that the developer was fined by a European regulator for collecting data from children without proper consent. The fine was €50,000. The game has 10 million downloads and generates millions in revenue from in-app purchases. The fine is a rounding error.
You delete the game. Your child is upset. You explain why. They do not fully understand, but they accept it. You find an alternative game with a better privacy policy and no third-party data sharing. It is not as polished, but it is good enough.
This is the pattern. You make small decisions that reduce exposure incrementally. You cannot eliminate the risk, but you can manage it.
What About Teenagers?
Teenagers have no special privacy protections under federal law. They are treated as adults for data collection purposes, despite having less experience evaluating privacy risks and more vulnerability to social pressure and algorithmic manipulation.
Teenagers use social media extensively. They share personal information, location data, and images. They participate in online communities where behavioral data is collected and monetized. They are targeted by advertisers using sophisticated psychological techniques.
The same principles apply: review permissions, use privacy-focused tools, have ongoing conversations about what they share online. But teenagers have more autonomy and less willingness to accept parental restrictions. The conversation shifts from control to education.
Explain how data collection works. Show them how to review privacy settings. Walk through examples of how their data is used by advertisers and platforms. Explain the permanence of online content and the risks of oversharing.
Teenagers often understand these concepts intellectually but underestimate their personal risk. They believe they are savvy enough to avoid problems. Sometimes they are. Often they are not. The best you can do is provide information and maintain an open dialogue.
The Role of Schools in This Mess
Schools are in a difficult position. They need technology to deliver education, especially after the shift to remote learning during the pandemic. Edtech vendors offer platforms that are feature-rich, easy to deploy, and often free or low-cost. The cost is data.
Some schools carefully vet vendors and negotiate contracts that limit data use. Many do not. Budget constraints, lack of technical expertise, and pressure to adopt new tools quickly lead to contracts that prioritize functionality over privacy.
Parents can push back. Attend school board meetings. Ask questions about data practices. Request transparency in vendor contracts. Form coalitions with other parents to create pressure for better policies.
Schools respond to organized parent advocacy more than individual complaints. If one parent raises concerns, the school may dismiss them as overblown. If twenty parents raise the same concerns, the school pays attention.
Where This Is Headed
The trajectory is toward more data collection, not less. AI systems require training data. Personalized learning platforms require behavioral data. Advertising models require tracking. The incentives all point in the same direction.
At the same time, awareness of privacy risks is growing. High-profile breaches, investigative journalism, and advocacy by organizations like the EFF have made privacy a mainstream concern. Some companies are responding by offering privacy-focused alternatives. Apple has positioned privacy as a competitive advantage. Signal and DuckDuckGo have built user bases by prioritizing privacy over features.
Regulation is slowly catching up. California, Virginia, Colorado, and other states have passed comprehensive privacy laws. The EU's GDPR provides a model for stronger protections. Federal legislation is possible, though progress has been slow.
The outcome depends on whether public pressure for privacy protections outpaces the economic incentives for data collection. Right now, the incentives are winning.
Final Thoughts
Your child's data is already out there. In app databases, data broker files, advertising profiles, school records, and third-party analytics platforms. You cannot undo that. But you can limit future exposure.
Start with app permissions. That is the lowest-effort, highest-impact action. Then move to reviewing privacy policies, talking to your child's school, and setting up parental controls. Build from there.
This is not a problem you solve once. It is an ongoing process of evaluation, adjustment, and conversation. Technology changes. Your child's needs change. The privacy landscape changes. You adapt.
The goal is not perfect privacy. That is not achievable. The goal is informed decision-making and incremental reduction of risk. You make choices that align with your values and your child's needs, knowing that every choice involves trade-offs.
Most parents underestimate how much data is collected about their children and overestimate the protections provided by law. Closing that gap requires understanding the mechanisms, reading the fine print, and asking uncomfortable questions. It is work. But it is work that matters.



