Not everything on the internet is appropriate for all ages, and there are valid reasons to protect children from certain environments and activities. Among these are inappropriate content, including violence, sexual situations, extremism and other material that may be potentially harmful to children. Another reason is exposure to certain types of advertising and, of course, children’s privacy. The federal Children’s Online Privacy Protection Act (COPPA) restricts information that companies can collect from children younger than 13, which is the primary reason why most social networking services don’t allow access to anyone who they know to be under 13.
In addition to banning younger children, social media companies typically have protections for teens, and these are sometimes different depending on whether it’s a younger teen (perhaps under 16) or someone who’s within a couple of years of being an adult.
Although these policies help protect teens on social media and keep younger children off the platforms, they only work if the teen or child is honest about their age. Underage users might be kicked off a platform if the company finds out, but a 2022 study by UK regulator Ofcom found that one in three British children lied about their age to social media companies, and I have no reason to believe the number is any lower in the United States.
Age assurance
Because of this, many regulators and even some technology companies are calling for “age assurance,” a technological or document-based way of knowing how old someone is, or at least what age range they’re in. This is harder than it might seem because of privacy laws and policies, and because children lack the credit history and other records often used to verify an adult’s identity and age.
Differences of opinion and state-level bills
There is a difference of opinion about who should be responsible for age assurance. Meta has called upon Apple and Google to take responsibility for age verification at the app store level, since these companies already know the ages of children whose parents have set up a child account for their device, and of teens who set up their own accounts and were truthful about their age. If a child enters a birthdate indicating they’re under 13, they’re referred to a web page where a parent can set up the account for them. Anyone 13 or older has the option to set up their account independently.
Both Apple and Google already offer optional ways for parents to manage or block their children’s app downloads, and Meta argues that a federal law should require them to verify users’ age range and require parents to approve downloads by teens and children.
There are both federal and state bills pending on this subject. Utah just passed such a law, supported by Meta, Snap and X. South Dakota may be next with Senate Bill 180, which requires app stores to “request age information from the individual and verify the individual’s age using methods of age verification that are reasonably designed to ensure accuracy.” The bill further states that if the age verification process determines the individual is a minor, the app store must “require the account to be affiliated with a parent account; and obtain verifiable parental consent” before the minor can download or purchase a covered app or make in-app purchases.
A bar versus a food court
Both Apple and Google have opposed these laws, arguing they could violate privacy by collecting and disseminating personal information to all apps. “Some apps,” Apple wrote in a white paper, “may find it appropriate or even legally required to use age verification … often through collecting a user’s sensitive personal information (like a government issued ID) to keep kids away from inappropriate content. But most apps don’t.” Apple likened it to asking merchants who sell alcohol in a mall to verify a buyer’s age, but not “if they just want to go to the food court.” Google says it “raises real privacy and safety risks, like the potential for bad actors to sell the data or use it for other nefarious purposes.”
Both Apple and Google have recently announced a compromise of sorts. Apple says it will “put parents in control by allowing them to share information about the age range of their kids with apps to enable developers to provide only age-appropriate content, all without needing to share their birthdate or other sensitive information.” This is an optional feature, not a mandate as some of the proposed laws would require.
Google has taken a similar position. On Wednesday, the company issued a statement saying, “In our proposal, only developers who create apps that may be risky for minors would request industry standard age signals from app stores, and the information is only shared with permission from a user (or their parent).” Google is also proposing “a centralized dashboard for parents to manage their children’s online activities across different apps in one place and for developers to easily integrate with.”
Step in the right direction
In an upcoming release of iOS (18.4), Apple will ask for the “age range” of the device’s user. The options are 12 or younger, 13 to 17, or 18+. Apple will then set up parental controls and safety features based on this information, letting parents share their kids’ age range with apps without disclosing a birthdate or other sensitive details.
Apple’s and Google’s compromises may not go far enough to satisfy Meta and some lawmakers, but they are a step in the right direction. A Meta spokesperson reportedly called the Apple announcement “a positive first step” but added that “developers can only apply these age-appropriate protections with a teen’s approval.”
Parental controls vs. youth autonomy
My own take is that parental controls need to be balanced with youth autonomy. I can see cases, for example among LGBTQ+ youth, where a young person may have good reason to be reluctant to share their app and web use with a parent, or where a parent would deny permission because of their attitudes toward such things as sexual orientation or political or religious views. I know I’m going against the grain of “parental rights,” but I also feel a need to protect the free speech rights of teens. The last time I read the First Amendment, there weren’t any clauses saying it only applies to people over 18.
Disclosure: Larry Magid is CEO of ConnectSafely, a nonprofit internet safety organization that has received support from companies on both sides of this issue.