Complaint Lodged with FTC Over Google Play Store Allowing “Inappropriate” Kids Apps

It is increasingly hard to steer clear of apps and online services that collect our data, but it shouldn’t be hard to steer clear of those that collect children’s data, since that violates the Children’s Online Privacy Protection Act (COPPA).

Now a group of twenty-two consumer advocates has lodged a formal complaint against Google, asking the Federal Trade Commission to investigate whether parents were misled by “inappropriate” kids apps in the Google Play Store that violated both COPPA and Google’s own policies.

FTC Receives Complaint Against Google

If you’re a parent looking for safe apps for your child in the Google Play Store, you expect that what you find there will be safe and that the apps will obey the law. Consumer advocates are saying that’s not the case.

“The business model for the Play Store’s Family section benefits advertisers, developers, and Google at the expense of children and parents,” said Campaign for a Commercial-Free Childhood Executive Director Josh Golin in a statement.

“Google puts its seal of approval on apps that break the law, manipulate kids into watching ads and making purchases.”

The official complaint lodged with the FTC includes several examples. “Preschool Education Center” and “Top 28 Nursery Rhymes and Song” are cited because they access the device’s location. “Baby Panda’s Carnival” and “Design It Girl – Fashion Salon” are in another group of apps that sent device identification data to advertising technology companies, allowing profiles of the young users to be built.


Along with the apps that break the rules, the complaint names several apps that may not be age appropriate. “Dentist Game for Kids” lets users give virtual patients shots in the back of the throat, while “Doctor X & the Urban Heroes” has users cutting clothing off virtual patients.

The complaint also cites apps whose parent reviews allege excessive in-app purchases, though no specific examples were given.

Google’s Response

Google takes “these issues very seriously and continues to work hard to remove any content that is inappropriately aimed at children from our platform,” said a spokesperson from the company.

“Parents want their children to be safe online, and we work hard to protect them,” continued the spokesperson in a statement. “Apps in our Designed for Families program have to comply with strict policies on content, privacy, and advertising, and we take action on any policy violations that we find.”

Apps that Google deems suitable for children are marked with a star and a recommended age group. The company says it removed thousands of apps from the family program in 2018 after finding violations, and that one-third of apps submitted to the Play Store are rejected outright.


This isn’t new territory for Google. Earlier this year, an analysis of 6,000 free children’s Android apps found that more than half shared details with outside companies in possible violation of COPPA.


A University of Michigan study found that ninety-five percent of 135 apps marketed by Google and aimed at children under the age of 5 contained advertising. More than half showed pop-up ads that were difficult for a young child to close.

In September, New Mexico’s attorney general named Google in a lawsuit accusing app developer Tiny Lab Productions of sending children’s location data to other companies.

The FTC is known to take action against app developers who violate COPPA. One company was fined $300,000 in 2014 and ordered to delete data collected from users under the age of 13. The agency settled a case against another developer for $950,000 in 2016 for tracking the locations of its young users without first getting parental consent.

Perhaps most disturbingly, “Blaze and the Monster Machines” was removed from the Play Store earlier this year after a voice recording threatening kids with a knife went viral, leading to complaints from U.K. parents.

Clearly this is a problem for Google. Despite repeated legal action, the company still doesn’t seem to have an effective way to vet apps in the Play Store to keep children safe, which should alarm anyone whose child is allowed to use apps on a phone unsupervised.

What do you think can be done about this problem? If laws and legal action can’t keep children’s locations from being tracked and their information from being collected, how else can kids who use Play Store apps be kept safe? Let us know your thoughts on “inappropriate” kids apps in the comments section below.
