The Federal Trade Commission (FTC) has declined to approve a new method for obtaining parental consent under the Children’s Online Privacy Protection Act (COPPA) that would involve analyzing facial geometry to confirm that the person providing consent is an adult.

In a letter to the Entertainment Software Rating Board (ESRB), Yoti (a digital identity company), and SuperAwesome (a company that provides technology to help businesses comply with parental verification requirements), the FTC denied the June 2023 application for the “Privacy-Protective Facial Age Estimation” software as a new means of obtaining parental consent under COPPA. However, the FTC made this determination “without prejudice to the applicants filing in the future” because the FTC anticipates receiving additional information and research about age verification technologies and their applications. The FTC said that “this insight is expected to be provided in a report that the National Institute of Standards and Technology (NIST) is slated to soon release about Yoti’s facial age estimation model.”

COPPA requires operators of websites and online services directed at children under the age of 13 to obtain verifiable parental consent before collecting or using personal information from children. COPPA explicitly outlines several methods of obtaining such parental consent and also includes a provision that allows entities to submit new methods of obtaining verifiable parental consent to the FTC for approval. In the application to the FTC, ESRB said that the “Privacy-Protective Facial Age Estimation provides an accurate, reliable, accessible, fast, simple and privacy-preserving mechanism for ensuring that the person providing consent is the child’s parent.” The application further states that the technology “can be implemented in a way that is consistent with COPPA’s data minimization, confidentiality, security, and integrity, and retention and deletion provisions, as well as the [FTC’s] concerns about potential bias and discrimination.”

However, the FTC stated in its decision that, in response to its call for public comments on the application, commenters who opposed the method raised “concerns about privacy protections, accuracy, and deepfakes.”

Because the denial was made without prejudice, the applicants may resubmit the application for this technology after the NIST report is released.