The EU has opened an investigation into TikTok over possible violations of the Digital Services Act (DSA), including concerns about minor safety. The formal proceedings will focus on the “rabbit hole effect,” age verification, default privacy settings, and addictive algorithm design. According to a press release, the European Commission is also examining advertising transparency and researchers’ access to data.
The investigation centers on children’s safety and privacy. The Commission will examine TikTok’s design and algorithms for potentially harmful features, such as addictive behavioral loops and “rabbit hole effects” that can steer users toward inappropriate content. According to the European Commission, the assessment seeks to “counter potential risks for the exercise of the fundamental right to the person’s physical and mental well-being [and] the respect of the rights of the child.”
That process includes examining TikTok’s age verification tools, which are meant to keep children from accessing inappropriate content. At the same time, the Commission will require the platform to ensure high standards of privacy, safety, and security for minors in its default privacy settings, similar to what happened with Meta’s Instagram and Facebook.
Europe is also investigating TikTok’s compliance with the DSA’s requirement to “provide a searchable and reliable repository for advertisements.” In addition, it is looking into possible shortcomings in researcher access to TikTok’s publicly available data, as mandated by the DSA.
Once the proceedings begin, the Commission will continue to gather evidence. Under the procedure, it can take further enforcement steps, such as interim measures and non-compliance decisions.
TikTok (and parent company ByteDance) was already compelled to make significant changes for EU users to comply with the DSA, including letting them opt out of algorithm-driven recommendations on the For You Page (FYP). It also ended personalized advertising for EU users between the ages of 13 and 17 and added new ways to report problematic content.
The EU is already examining how TikTok and Meta have worked to curb the spread of disinformation and illegal content related to the ongoing violence in the Middle East. Meta was fined $414 million in 2022 for forcing users to accept personalized ads. TikTok is reportedly developing a similar plan, offering a subscription tier that would let users avoid targeted ads. Civil rights organizations are urging the EU to reject these “pay for privacy” schemes.