Pictured: Several individuals sitting in a row, holding their mobile phones
Photo by ROBIN WORRALL on Unsplash

By Kiara Ortiz

In 2018, TikTok, formerly known as Musical.ly, was one of the most downloaded mobile applications (apps) in the world.[1] This exciting app gave social media users a creative platform to publicly share short lip-syncing videos on their profiles. However, this fun and creative space quickly turned into a platform that illegally collected personal information from children under thirteen years old.[2] Although the Children’s Online Privacy Protection Act (COPPA) was created to protect children under thirteen, its rules do not adequately protect children from online companies collecting and sharing their personal information.

On Wednesday, February 27, 2019, TikTok agreed to pay $5.7 million to settle the Federal Trade Commission’s (FTC) allegations that the mobile app violated COPPA.[3] COPPA provides a set of rules that apply to operators of websites and online services that are 1) directed toward children and collect personal information from them or 2) directed toward a general audience but have actual knowledge that they are collecting personal information from children.[4] The FTC alleged that TikTok violated COPPA by collecting personal information from children under thirteen years old without prior parental consent.[5]

To register, TikTok requires users to create a username and password and to provide their first and last name, a short biography, an email address, a phone number, and a profile picture.[6] Although there is an option to change a profile from public to private, the user’s profile picture, username, and biography are still publicly available for other users to see.[7] The app also allows users to directly message each other regardless of their ages.[8]

The FTC stated in its complaint that TikTok violated the COPPA rules by failing to provide direct notice to parents, failing to obtain parental consent prior to collecting personal information, and failing to notify parents about the app’s collection and use of their children’s personal information.[9] In short, nearly any personal information that the company collects from a child under thirteen without parental consent can violate the COPPA rules.[10]

For years prior to the settlement, TikTok received thousands of complaints from concerned parents regarding the public nature of their young children’s profiles.[11] However, TikTok claimed that because its app did not intentionally attract children under thirteen years old, the COPPA rules did not apply.[12] Nonetheless, the FTC stated that the app fell under the COPPA rules because it targeted these children through its visual and audio musical content, as well as the presence on the app of child celebrities and celebrities who appeal to kids.[13] Additionally, the FTC noted that TikTok was aware that a significant portion of its users were younger than thirteen, yet it still failed to seek parental consent.[14] The FTC also pointed to data on the number of young users in TikTok’s audience composition.[15]

In addition to paying the $5.7 million civil penalty, TikTok has agreed to alter its platform to ensure that it is in compliance with the COPPA rules by creating an additional app experience that allows TikTok to split users into age-appropriate virtual environments.[16] However, this dispute continues to raise a number of concerns about the adequacy of the COPPA rules and enforcement in actually protecting children under thirteen years old from the collection of their personal information. 

As the company takes steps to comply with COPPA, there is still some concern that the app will attract children under thirteen. Most apps and online services prompt users with an age verification question during registration as a way to eliminate the need for parental consent.[17] However, even with age verification technology, it can be difficult to ensure that children under thirteen are not creating profiles under different age groups to gain access to general audience websites.[18] Although COPPA lays out strict rules to protect children under thirteen years old from the collection of their personal information, the federal law loses its effectiveness if children under thirteen create profiles claiming that they are thirteen or older. For online companies to be sure that they are fully compliant with the COPPA rules, they should consider creating separate platforms for children under the age of thirteen. Alternatively, COPPA could be amended to set strict limits on the amount of personal information that general audience sites collect from any user, ensuring that those sites do not inadvertently collect children’s information unlawfully.

Another issue this lawsuit has raised is parents’ lack of education in this area. Nowadays, more than half of children ages eleven to twelve have social media profiles even though most platforms’ minimum age is thirteen.[19] In some of these cases, parents allow their children to register for these apps either because they are unaware of the age restrictions or because they do not fully understand that their children’s personal information may be collected and shared with third-party advertisers.[20] Currently, the Code of Federal Regulations states that a company must “provide notice on the Web site or online service of what information it collects from children, how it uses such information, and its disclosure practices for such information.”[21] However, the regulation does not specifically provide any guidance as to how companies should display this information.[22] Although the FTC has provided steps to avoid liability, the regulation’s vagueness gives online platforms too much discretion as to how they display their privacy policies.[23] Essentially, companies can bury important parts of the policy in language that the average parent may not easily understand.[24] To ensure that parents are fully educated as to how their children’s, and quite possibly their own, personal information is being collected and shared, the COPPA rules must provide more guidance as to how online platforms can clearly and concisely present their privacy policies.

As children become more tech-savvy and gain access to mobile devices at younger ages, the current COPPA rules do not adequately protect them from online companies collecting and sharing their personal information. Age falsification and the lack of general education about underage social media usage are roadblocks that the FTC must consider to effectively balance children’s privacy interests against the interests of online social media platforms when enforcing the COPPA rules. The FTC must pay closer attention to how online companies and mobile applications are registering their users, displaying their privacy policies, and, overall, complying with COPPA. At the same time, COPPA should provide specific guidelines for companies to create clear and easy-to-understand privacy policies so that users are fully aware of the age restrictions and where their data is going. With the FTC’s increased scrutiny and an additional set of privacy policy guidelines outlined in COPPA, companies would have less trouble creating compliant platforms.


[1] Nick Bastone, A Viral Video App You’ve Probably Never Heard of Had More Downloads in September than Facebook, YouTube, or Snapchat, Business Insider, https://www.businessinsider.com/tiktok-most-downloaded-app-2018-11 (last updated Nov. 2, 2018, 6:37 PM).

[2] See Video Social Networking App Musical.ly Agrees to Settle FTC Allegations That it Violated Children’s Privacy Law, Fed. Trade Comm’n (Feb. 27, 2019), https://www.ftc.gov/news-events/press-releases/2019/02/video-social-networking-app-musically-agrees-settle-ftc.

[3] Id. (making this settlement the largest civil penalty ever obtained by the Commission in a children’s privacy case).

[4] 16 C.F.R. § 312.3 (2018).

[5] See generally Complaint, United States v. Musical.ly, Inc., No. 2:19-cv-1439 (C.D. Cal. filed Feb. 27, 2019).

[6] See Complaint at 5-6.

[7] Id. at 5 (stating that usernames, profile pictures, and bios remain public and searchable even when the profile is private).

[8] Id. at 6.

[9] See Complaint at 10.

[10] See 16 C.F.R. § 312.2. Under the COPPA rules, personal information is defined as individually identifiable information collected online from an individual, such as a first and last name, a home address, online contact information, a screen or user name, a telephone number, a Social Security number, a persistent identifier (an IP address, cookie, serial number, etc.), a photograph, video, or audio file, geolocation information, and information concerning the child or the parents of that child. 16 C.F.R. § 312.2 (1)-(10).

[11] Complaint at 6 (showing that in a two-week period, TikTok received more than 300 complaints from parents asking to have their child’s account closed).

[12] See Video Social Networking App Musical.ly Agrees to Settle FTC Allegations That it Violated Children’s Privacy Law, Fed. Trade Comm’n (Feb. 27, 2019), https://www.ftc.gov/news-events/press-releases/2019/02/video-social-networking-app-musically-agrees-settle-ftc.

[13] See 16 C.F.R. § 312.2 (listing factors that assist in determining whether a website is directed toward children).

[14] Complaint at 6.

[15] In February 2017, TikTok knowingly sent messages to forty-six users who appeared to be under thirteen, directing them to edit their profile descriptions to indicate that their accounts were being run by a parent or adult. However, TikTok did not take steps to verify these accounts to ensure that the users complied. See, e.g., id. at 7.

[16] See Lesley Fair, Largest FTC COPPA Settlement Requires Musical.ly to Change its Tune, Fed. Trade Comm’n (Feb. 27, 2019, 12:57 PM), https://www.ftc.gov/news-events/blogs/business-blog/2019/02/largest-ftc-coppa-settlement-requires-musically-change-its.

[17] See, e.g., Social Media Sites Should Find New Ways to Verify Children’s Ages, TheJournal.ie (Mar. 29, 2018, 11:40 AM), https://www.thejournal.ie/cyber-safety-children-social-media-3930331-Mar2018/.

[18] Nicole Perlroth, Verifying Ages Online Is a Daunting Task, Even for Experts, N.Y. Times (June 17, 2012), https://www.nytimes.com/2012/06/18/technology/verifying-ages-online-is-a-daunting-task-even-for-experts.html.

[19] Jacqueline Howard, What’s the Average Age When Kids Get a Social Media Account?, CNN, https://www.cnn.com/2018/06/22/health/social-media-for-kids-parent-curve/index.html (last updated June 22, 2018, 10:22 AM).

[20] Id. (stating that, most of the time, parents do not understand what they are agreeing to).

[21] 16 C.F.R. § 312.3(a).

[22] 16 C.F.R. § 312.3.

[23] See Children’s Online Privacy Protection Rule: A Six-Step Compliance Plan for Your Business, Fed. Trade Comm’n, https://www.ftc.gov/tips-advice/business-center/guidance/childrens-online-privacy-protection-rule-six-step-compliance (last visited Mar. 7, 2019).

[24] See id.
