The UK’s Information Commissioner’s Office (ICO) has taken preliminary enforcement action against Snap Inc., the American camera and social media company, over concerns related to its “My AI” chatbot’s data processing practices. The ICO, responsible for enforcing data privacy regulations including GDPR, expressed particular concern about how “My AI” handles the data of children aged 13 to 17.
The ICO has issued a preliminary enforcement notice outlining actions it may require, including ordering Snap to halt data processing associated with “My AI” in the UK. Snap, which had over 21 million monthly active UK users as of May, could face restrictions on offering the product in the UK if a final enforcement notice is issued.
Snap responded that it is closely reviewing the ICO’s preliminary decision and emphasized its commitment to user privacy, noting that “My AI” underwent a rigorous legal and privacy review before its public release. The chatbot, which is powered by OpenAI’s GPT technology, was initially offered to UK Snapchat+ subscribers in February and rolled out to the broader Snapchat user base in April.
The ICO’s findings are provisional at this stage, and no definitive conclusion has been reached regarding any data protection breaches or the issuance of an enforcement notice. The ICO has indicated that it will carefully evaluate Snap’s response before deciding whether to issue a final enforcement notice.
John Edwards, the UK Information Commissioner, expressed concern that Snap appeared not to have adequately assessed the privacy risks posed by “My AI,” particularly to children, before launching the feature. The case serves as a reminder to organizations deploying generative AI to consider their data protection obligations from the outset of development.