Britain's data watchdog, the Information Commissioner's Office (ICO), is actively investigating Snapchat to determine whether the popular U.S. instant messaging app is taking adequate measures to remove underage users from its platform.
According to sources familiar with the matter, the ICO is taking this step after concerns were raised about the significant number of children under the age of 13 using the app. Under UK data protection law, social media platforms are required to obtain parental consent before processing the data of children under 13.
Despite setting a minimum user age of 13, various social media companies have struggled to effectively prevent young children from accessing their platforms.
Snapchat's Response and ICO Actions:
Snap Inc., the parent company of Snapchat, has been cautious about sharing specific details of the steps it has taken to address the issue of underage users. A Snap spokesperson said the company shares the ICO's objective of ensuring digital platforms are age-appropriate and aligned with the Children's Code.
Constructive discussions between Snap and the ICO are ongoing as they work towards achieving this goal. The ICO, as part of its preliminary assessment, has engaged with users and other regulatory bodies to evaluate any potential breaches by Snap.
Growing Concerns and Prior Incidents:
Concern over underage users on social media platforms has been escalating. Last year, a study by Ofcom found that approximately 60% of children aged eight to eleven held at least one social media account, often created using a false birthdate.
Notably, Snapchat emerged as the favored platform among underage social media users. Following a Reuters report, the ICO received several complaints from the public regarding Snap's handling of children's data, especially its perceived lack of effort in preventing young children from accessing the app.
Potential Consequences and Industry Pressure:
Should the ICO conclude that Snap has violated its regulations, the company might face a substantial fine, potentially up to 4% of its annual global turnover, which could amount to approximately $184 million based on recent financial results. This case is part of a wider global trend where social media companies are under increasing pressure to enhance content moderation on their platforms.
The National Society for the Prevention of Cruelty to Children (NSPCC) has highlighted Snapchat's role in instances of indecent image distribution involving children.
NSPCC Concerns and Industry Comparison:
Richard Collard, associate head of child safety online at the NSPCC, expressed deep concern over the use of Snapchat by children under 13. Collard stressed that children as young as 11 and 12 were engaging in inappropriate activities on the platform, including sharing explicit images and interacting with adults.
In a related case, TikTok, a rival platform, was fined £12.7 million ($16.2 million) by the ICO earlier this year for misusing children's data. That decision underscored the need for stricter action to ensure the safety of underage users.
Measures Taken and Snap's Approach:
Snapchat does implement some measures to restrict underage users, preventing anyone from registering with a birthdate that indicates they are under 13.
However, other platforms, such as TikTok, take more proactive steps, continuing to block underage users even when they resubmit a false birthdate. That approach reflects a concerted effort to keep young users from accessing inappropriate content.
The ICO's investigation into Snapchat's handling of underage users underscores the growing concern surrounding the presence of young children on social media platforms. As the industry faces mounting pressure to ensure the safety of its users, the outcomes of these investigations will likely have far-reaching implications for the digital landscape.