Digital Services Act, TikTok and the Romanian Elections: Where Does Free Speech End and Platform Responsibility Begin?

The article examines DSA obligations for very large platforms—risk assessments, content moderation, political ads—and how they play out in an electoral setting. It also reflects on the limits of free speech, disinformation control and what candidates, influencers and voters should realistically expect online.

In just a few years, debates in Romania about fake news, algorithmic amplification and foreign interference have moved from theory to a very concrete stress test: the annulment of the 2024 presidential election results because of massive disinformation and suspected Russian influence, much of it channelled through TikTok and other online platforms. According to public statements in the European Parliament and several international analyses, Romania became the first EU Member State whose national election was invalidated due to foreign interference and disinformation campaigns largely run via TikTok and similar services.

At the same time, the European Union has put in place a new legal backbone for the digital environment: Regulation (EU) 2022/2065 on a Single Market for Digital Services (Digital Services Act – DSA). The DSA introduces obligations for all intermediary service providers and, in particular, for very large online platforms such as TikTok. These obligations include risk assessments, mitigation measures for systemic risks such as disinformation and election interference, transparency for advertising and access to data for researchers.

Romania has implemented the national enforcement framework for the DSA through Law no. 50/2024, which designates the National Authority for Management and Regulation in Communications (ANCOM) as the Digital Services Coordinator responsible for supervising intermediary services and enforcing the DSA domestically.

Against this background, the TikTok–Romanian elections story is a perfect case study for lawyers, regulators and platform users: where does legitimate content moderation end and censorship begin? Who is liable for disinformation and covert political campaigns: the platform, the content creators, the agencies behind the scenes, or the foreign state that orchestrates the operation? And, crucially for candidates and users, what legal tools exist to challenge abuses and to protect fundamental rights?

1. What is the Digital Services Act and why does it matter for elections?

1.1. The DSA as a “constitution” for online platforms

The Digital Services Act (DSA) is an EU regulation that modernises the legal framework for online intermediaries. It replaces and updates core elements of the old e-Commerce Directive and creates harmonised rules for internet access providers, caching and hosting services, online platforms and marketplaces. The regulation is directly applicable in all Member States and is complemented by national enforcement laws such as Romania’s Law no. 50/2024.

The DSA pursues several key objectives:

  • to ensure a safer digital environment where illegal content, goods and services are tackled effectively;
  • to protect fundamental rights, including freedom of expression and information;
  • to prevent and mitigate systemic risks linked to very large platforms, such as disinformation, threats to electoral processes, or risks to minors;
  • to promote transparency and accountability for online platforms, especially regarding advertising and recommender systems.

Importantly, the DSA maintains the classic principle that platforms are not subject to a general monitoring obligation over all content. They are not supposed to become the “truth police”. Instead, they must put in place structured procedures to act when they become aware of illegal content or systemic risks.

1.2. Graduated obligations: from all intermediaries to very large platforms

The DSA uses a layered approach. All intermediary services have some basic duties, but the more powerful the service, the more stringent the obligations.

  • All intermediary services (including internet access providers) must provide clear information in their terms of service, establish contact points for authorities and users, and publish yearly transparency reports about content moderation measures.
  • Hosting services and online platforms must implement effective notice-and-action mechanisms for users to report illegal content. They must inform the user whose content was removed, explain the reasons and offer complaint mechanisms (a simplified illustration of such a “statement of reasons” appears at the end of this subsection). They also have to cooperate with trusted flaggers designated by the national Digital Services Coordinators.
  • Very large online platforms (VLOPs) and very large online search engines (VLOSEs) – i.e., services reaching more than 45 million monthly active users in the EU – have to perform yearly assessments of systemic risks and adopt mitigation measures. They are subject to independent audits, must provide more detailed transparency reports and must give researchers access to data.

TikTok has been formally designated as a VLOP and therefore falls under this top tier of obligations, including those relating to election integrity, disinformation and political advertising.
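
To make the notice-and-action and “statement of reasons” duties mentioned in the list above more tangible, the sketch below models, in deliberately simplified Python, the kind of information a user should receive when their content is restricted. The field names are hypothetical paraphrases of the DSA's transparency requirements, not an official schema or any platform's real data format.

```python
# Illustrative only: a simplified model of a DSA-style "statement of reasons"
# sent to a user whose content was removed or restricted. Field names are
# hypothetical; they paraphrase the kind of information the DSA requires,
# not any platform's actual notification format.

from dataclasses import dataclass, field
from typing import Optional


@dataclass
class StatementOfReasons:
    content_id: str
    restriction_type: str            # e.g. "removal", "demotion", "account suspension"
    facts_and_circumstances: str     # what the decision was based on
    automated_detection: bool        # was the content detected by automated means?
    automated_decision: bool         # was the decision itself taken automatically?
    legal_ground: Optional[str]      # cited law, if the content is considered illegal
    terms_of_service_ground: Optional[str]  # cited clause, if based on the platform's terms
    redress_options: list = field(default_factory=lambda: [
        "internal complaint-handling system",
        "out-of-court dispute settlement body",
        "judicial redress",
    ])


# A lawyer reviewing such a notice would check, among other things, that the
# grounds are specific enough to be challenged and that redress routes are listed.
example = StatementOfReasons(
    content_id="video-123",
    restriction_type="demotion",
    facts_and_circumstances="Flagged as part of a coordinated inauthentic campaign.",
    automated_detection=True,
    automated_decision=False,
    legal_ground=None,
    terms_of_service_ground="Community Guidelines – integrity and authenticity",
)
print(example.redress_options)
```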

1.3. Disinformation and electoral integrity as “systemic risks”

The DSA does not criminalise “fake news” as such and does not create a new offence of disinformation. Instead, it expressly refers to disinformation and manipulation of electoral processes as systemic risks that very large platforms must assess and mitigate.

In practice, this means that TikTok and similar platforms are expected to:

  • analyse how their recommender systems and design choices can facilitate disinformation and polarisation;
  • adopt concrete mitigation measures (for example, downranking or limiting the amplification of coordinated harmful campaigns, cooperating with fact-checkers, demonetising certain content, increasing friction for content sharing, or limiting certain features around elections), as illustrated in the sketch after this list;
  • allow independent auditors, regulators and researchers to scrutinise their systems, based on structured data access obligations.
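
To make the idea of “downranking” more concrete, the short sketch below shows, in deliberately simplified Python, how a platform might reduce the reach of content flagged as part of a coordinated campaign without removing it. Every name and number here is hypothetical; this does not describe TikTok's actual recommender system.

```python
# Purely illustrative sketch: how a platform might reduce the reach of content
# flagged as part of a coordinated campaign, without deleting it outright.
# All names and numbers are hypothetical assumptions, not a real system.

from dataclasses import dataclass


@dataclass
class VideoCandidate:
    video_id: str
    relevance_score: float           # score produced by the recommender model
    coordinated_campaign_flag: bool  # set by integrity / fact-checking pipelines
    election_sensitive: bool         # e.g. political content in a pre-election window


def adjusted_score(video: VideoCandidate,
                   campaign_penalty: float = 0.2,
                   election_friction: float = 0.7) -> float:
    """Return a downranked score: flagged content stays online but is amplified less."""
    score = video.relevance_score
    if video.coordinated_campaign_flag:
        score *= campaign_penalty    # strong downranking for coordinated manipulation
    if video.election_sensitive:
        score *= election_friction   # extra friction around elections
    return score


# Example: a flagged, election-sensitive video keeps only 14% of its original score.
video = VideoCandidate("v1", relevance_score=0.9,
                       coordinated_campaign_flag=True, election_sensitive=True)
print(round(adjusted_score(video), 3))  # 0.9 * 0.2 * 0.7 = 0.126
```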

The DSA's advertising transparency provisions require online platforms to clearly label ads and to show who paid for them and why they were targeted at a specific audience; very large platforms must, in addition, maintain a searchable repository of the advertisements they display. These obligations are complemented by Regulation (EU) 2024/900 on the transparency and targeting of political advertising, which sets specific rules for political ads, including stricter transparency and data-use requirements and minimum retention periods for information about each political ad.

2. Implementation in Romania: Law no. 50/2024 and the role of ANCOM

2.1. Law no. 50/2024 as the national enforcement framework

While the DSA is directly applicable, each Member State must establish its own institutional set-up and procedures for supervision and enforcement. In Romania, this has been done via Law no. 50/2024 on measures for the application of the DSA.

Law no. 50/2024 deals mainly with:

  • designating the Digital Services Coordinator and other competent authorities;
  • establishing procedures for investigations, information requests and inspections addressed to intermediary service providers;
  • setting out the sanctions applicable at national level, aligned with the ceilings in the DSA (which allows fines of up to 6% of the provider’s worldwide annual turnover for serious or repeated infringements);
  • clarifying cooperation mechanisms with foreign authorities and with the European Commission.

The full text of the law can be consulted on the official portal legislatie.just.ro, while practical explanations and summaries are available in ANCOM’s “Digital Services” section and in various legal and policy analyses.

2.2. ANCOM as Digital Services Coordinator

Law no. 50/2024 designates ANCOM as Romania’s Digital Services Coordinator. According to information published by the authority, ANCOM is responsible for supervising compliance with the DSA by intermediary service providers established in Romania and for cooperating with foreign regulators and with the European Commission.

In practice, ANCOM can:

  • receive and examine complaints related to possible breaches of the DSA by service providers;
  • request information from platforms, including data on content moderation, advertising and systemic risk assessments;
  • carry out inspections and investigations and, where appropriate, impose corrective measures and fines;
  • cooperate with other national authorities (data protection, electoral authorities, audiovisual regulator etc.) and with the European Commission, especially in cases involving very large online platforms.

For VLOPs like TikTok, the main enforcement authority under the DSA is the European Commission, but national coordinators such as ANCOM remain crucial for collecting evidence, understanding local contexts and supporting cross-border enforcement.

2.3. Other key Romanian authorities in the electoral context

The DSA interacts with a broader ecosystem of national laws and institutions. In the context of the Romanian presidential elections, several authorities play an important role alongside ANCOM:

  • the Permanent Electoral Authority (AEP), which regulates and supervises electoral campaigns, campaign financing and the use of advertising, including online political advertising;
  • the National Supervisory Authority for Personal Data Processing (ANSPDCP), responsible for enforcing data protection rules (GDPR) and therefore highly relevant for micro-targeting of voters and the processing of sensitive data;
  • the National Audiovisual Council (CNA), or any successor audiovisual regulator, in situations where content overlaps with audiovisual media services or where online platforms host television-like content;
  • the prosecution services and criminal courts, when disinformation campaigns intersect with crimes such as computer fraud, cyber-attacks, illicit campaign financing, foreign interference or other offences provided by the Criminal Code or special laws.

The Romanian TikTok–elections scenario shows that effective enforcement requires these institutions to cooperate in practice, not only on paper. Without genuine cooperation, the DSA risks becoming a beautifully written but weak instrument in the face of complex hybrid operations.

3. TikTok and the Romanian presidential elections

3.1. A “cyber-thriller” election

The 2024 Romanian presidential election quickly became an international case study in algorithmic influence and foreign interference. A previously marginal, far-right, pro-Russian candidate – Călin Georgescu – unexpectedly won the first round of the election, largely on the back of viral content on TikTok and other social media platforms, despite having been almost invisible in traditional media and opinion polls only weeks earlier.

Investigations by journalists, think tanks and EU institutions describe:

  • coordinated networks of accounts and influencers pushing pro-Russian narratives and anti-EU messages;
  • manipulative short-form videos that weaponised emotions, fear and frustration, especially among young voters;
  • non-transparent spending on online political advertising, often outside the official campaign structures;
  • potential use of bots and fake accounts to artificially boost certain content and narratives.

Articles such as “How Romania’s presidential election became the plot of a cyber-thriller”, analyses by the European Digital Media Observatory (EDMO) and commentaries by think tanks like Friends of Europe provide detailed accounts of how TikTok content shaped voter perceptions and amplified disinformation during this period.

3.2. Annulment of the election results and foreign interference findings

In early December 2024, the Romanian Constitutional Court annulled the presidential election results, after declassified intelligence assessments and other evidence pointed to extensive foreign interference and coordinated disinformation. International media outlets, including Le Monde, reported that the election was cancelled in the context of an investigation into TikTok-driven disinformation. Analytical pieces by organisations such as the Atlantic Council highlighted that Romania became the first EU Member State whose national election was invalidated because of such hybrid operations and foreign interference.

In 2025, as Romanians returned to the polls for a re-run of the election, news agencies like Reuters documented how voters once again turned to TikTok for guidance, and how concerns about the platform’s role in spreading disinformation remained high. The “Romanian scenario” has since been referenced in debates about other European elections, often as a cautionary tale about the vulnerabilities of democracies in the age of short-form video platforms.

3.3. EU proceedings against TikTok under the DSA

Even before the Romanian case exploded, the European Commission had already taken a close interest in TikTok. In early 2024, the Commission stepped up monitoring of the platform and opened formal proceedings over suspected DSA breaches related to the protection of minors, addictive design and transparency obligations.

In December 2024, the Commission went further and opened formal proceedings under the DSA specifically regarding TikTok’s obligations to assess and mitigate systemic risks linked to election integrity, notably in the context of the Romanian elections. The proceedings focus on whether TikTok:

  • properly assessed the risks posed by its recommender systems and political content features;
  • adopted adequate and timely mitigation measures, including around the 2024 Romanian vote;
  • complied with DSA rules on advertising transparency and access to data for researchers;
  • respected obligations linked to its very large platform status.

In October 2025, the Commission published preliminary findings indicating that TikTok (and Meta) may have breached EU digital rules on advertising transparency, including by failing to ensure a sufficiently transparent and comprehensive ads library and by not detecting certain covert campaigns and hybrid threats effectively. If confirmed, such infringements could lead to fines of up to 6% of the company’s global annual turnover and to binding orders requiring structural changes in the platform’s systems.

3.4. Data retention and transparency for political advertising

The Romanian case has also shone a spotlight on data retention and political advertising transparency. Under the DSA, platforms must keep records of content moderation decisions and of advertisements displayed on their services. Under the new Regulation (EU) 2024/900 on political advertising, they must:

  • clearly label ads as political advertising;
  • identify and disclose the sponsor (who paid for the ad) and the amount spent;
  • provide information on the targeting parameters and data used for audience targeting;
  • store this information for a defined period in a publicly accessible repository so that regulators, journalists and citizens can scrutinise it.

For lawyers handling electoral disputes or DSA-related cases, access to this data is crucial. It allows them to reconstruct how a campaign was run, who financed it, which audiences were targeted and whether the platform respected its obligations. In many instances, the question is no longer only “what content was published?”, but also “how and why did the algorithm put that content in front of specific voters?”
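
To illustrate what such record-keeping could look like in practice, here is a minimal, purely hypothetical sketch of an entry in a political ad repository. The fields loosely track the transparency items listed above (sponsor, spend, targeting, retention) and do not reproduce the actual ads-library schema of any platform or the exact requirements of Regulation (EU) 2024/900.

```python
# Hypothetical sketch of a political ad repository entry. Field names and values
# are illustrative assumptions, not any platform's real ads-library schema.

from dataclasses import dataclass
from datetime import date


@dataclass
class PoliticalAdRecord:
    ad_id: str
    is_political: bool               # must be clearly labelled as political advertising
    sponsor: str                     # who paid for the ad
    amount_spent_eur: float          # declared spend
    targeting_parameters: dict       # criteria and data categories used for targeting
    first_shown: date
    last_shown: date
    impressions: int
    retention_until: date            # how long the record must stay publicly available


record = PoliticalAdRecord(
    ad_id="ad-0001",
    is_political=True,
    sponsor="Example Campaign Association",   # hypothetical sponsor
    amount_spent_eur=12_500.0,
    targeting_parameters={"age": "18-34", "location": "RO", "interests": ["politics"]},
    first_shown=date(2024, 11, 1),
    last_shown=date(2024, 11, 22),
    impressions=1_450_000,
    retention_until=date(2031, 11, 22),
)

# A researcher or lawyer reconstructing a campaign would filter such records by
# sponsor and date range, then compare declared spend with official campaign filings.
print(record.sponsor, record.amount_spent_eur)
```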

4. Content control vs censorship: drawing the line

4.1. Why moderation is not automatically censorship

A central legal tension in this debate is the distinction between legitimate content moderation and unlawful censorship. From a human-rights perspective, states are bound by Article 10 of the European Convention on Human Rights (ECHR) and Article 11 of the EU Charter of Fundamental Rights, which protect freedom of expression, including political speech, and the DSA expressly requires platforms to apply and enforce their terms of service with due regard to these fundamental rights.

At the same time:

  • platforms have the right – and sometimes the obligation – to enforce their terms of service, remove illegal content and limit harmful campaigns;
  • states have positive obligations to protect the integrity of elections and national security, including against foreign interference;
  • neither platforms nor states can impose arbitrary or disproportionate restrictions that effectively silence legitimate opinions.

The DSA tries to reconcile these interests by regulating procedures, not “truth”:

  • platforms must inform users when their content is removed or restricted and explain the reasons;
  • users must be able to appeal these decisions, first internally and then through out-of-court dispute settlement or courts;
  • platforms must publish detailed reports and provide data to researchers and regulators, enabling external scrutiny.

4.2. The risk of “over-moderation” in an electoral context

After Romania’s experience in 2024, the pressure on platforms is stronger than ever. Under threat of high fines, public criticism and political backlash, companies might be tempted to err on the side of over-moderation.

This can lead to several problematic practices:

  • removing or downranking perfectly lawful political criticism because it is “sensitive” or controversial;
  • deploying aggressive automated filters that mistakenly classify satire, investigative journalism or opposition messages as “disinformation”;
  • introducing opaque “shadow bans” where content is technically allowed but its reach is quietly suppressed.

For candidates and activists, the impact of such measures can be dramatic. Losing visibility on TikTok or similar platforms in the decisive weeks before an election can mean losing access to key voter segments. The DSA’s procedural safeguards are therefore not a luxury but a necessity: they create a legal basis to challenge arbitrary or disproportionate moderation decisions.

4.3. Risks of state overreach

The DSA also gives states new tools to request that platforms act against certain types of content, including through orders to act against illegal content or to provide information. Used wisely, these tools can help fight clearly illegal content and foreign interference. Used poorly, they can become a channel for political pressure and censorship.

Risks include:

  • using vague notions of “disinformation” to silence critical voices or investigative reporting;
  • issuing take-down orders that are not sufficiently specific or proportionate;
  • failing to ensure transparency and effective remedies for those affected by such orders.

For Romania, the lesson is clear: civil society, journalists and lawyers must monitor not only what platforms do, but also how public authorities use – or abuse – their new powers under the DSA and national implementing laws.

5. Platform liability vs liability of content creators

5.1. Intermediary liability and safe harbours

The DSA retains the core idea that hosting providers, including social media platforms, are not automatically liable for all user-generated content they host. They benefit from “safe harbours” as long as:

  • they do not have actual knowledge of illegal content; or
  • once they have such knowledge (for example via a specific notice), they act expeditiously to remove or disable access to that content.

However, for very large platforms, liability is no longer just a question of individual pieces of content. The DSA imposes a duty of care regarding systemic risks: platforms must show that they have assessed and mitigated risks such as election manipulation, regardless of whether each individual piece of content can be easily labelled “illegal” or not.

5.2. Accountability of political actors and covert networks

Platform obligations do not replace the liability of those who design and fund disinformation campaigns. Depending on the circumstances, they may face:

  • electoral sanctions – for example, for breaching campaign finance rules or advertising regulations;
  • criminal liability – for offences such as cyber-attacks, computer-related fraud, foreign interference or illegal funding of campaigns;
  • civil liability – for defamation, invasion of privacy, or other violations of personal rights.

In practice, however, uncovering the full chain of responsibility is difficult. This is where the DSA and the political advertising regulation become critical: they oblige platforms to keep detailed records and to provide data to regulators and, indirectly, to courts. Lawyers representing candidates, political parties or NGOs can rely on these obligations to request access to logs, ad repositories and other evidence that was previously hidden behind corporate secrecy.

6. National authorities and the European Commission: who does what?

6.1. The Commission as primary enforcer for very large platforms

For very large online platforms, the DSA assigns primary enforcement powers to the European Commission. The Commission can:

  • open formal proceedings to investigate suspected DSA breaches;
  • carry out inspections and audits, request extensive information and question employees;
  • adopt decisions finding an infringement and impose fines or periodic penalty payments;
  • order remedial measures, including changes to algorithms, design choices or internal procedures.

In the TikTok–Romania case, this EU-level enforcement is essential. The disinformation campaigns were cross-border, the platform is global, and the consequences of the Commission’s decisions will go far beyond one Member State.

6.2. The role of Romanian authorities in an election crisis

Romanian authorities still play a major role, even if the Commission leads the DSA proceedings. Their tasks include:

  • conducting national security and intelligence assessments regarding foreign interference;
  • managing the electoral process, including the unprecedented decision to annul the initial election results;
  • investigating potential criminal offences and violations of campaign rules;
  • cooperating with the Commission by providing evidence, contextual information and expert input.

For lawyers and litigants, understanding this multi-layered enforcement structure is crucial: complaints or legal actions may need to be directed to different institutions, at national or EU level, depending on the issue at stake.

7. Strategic litigation opportunities

7.1. Actions by candidates and political parties

Candidates and political parties affected by disinformation or by platform decisions have several potential legal avenues:

  • electoral complaints and challenges to the validity of election results, based on evidence of foreign interference and online manipulation;
  • administrative or civil actions against national authorities, if they failed to protect the integrity of the process or misused their powers under the DSA and electoral law;
  • civil actions against the platform itself, if it can be shown that it did not comply with legal obligations and that this failure caused a concrete, demonstrable harm;
  • complaints to the Digital Services Coordinator (ANCOM) and to the European Commission, which can trigger investigations and, potentially, enforcement measures.

A well-designed litigation strategy will rarely rely on a single action. Instead, it will combine electoral procedures, DSA complaints, data protection claims and even criminal investigations, with the aim of clarifying responsibilities and obtaining remedies.

7.2. Public interest litigation and representative actions

At EU level, Directive (EU) 2020/1828 on representative actions allows qualified entities (such as consumer organisations) to bring collective actions to protect the interests of consumers. In the DSA context, this opens the door for:

  • actions against platforms that systematically fail to comply with transparency obligations;
  • litigation about systemic risks affecting large groups of users, especially minors or vulnerable communities;
  • cases challenging opaque algorithmic practices that cause widespread harm.

In Romania, NGOs, journalists and academic institutions could use such tools to challenge, for example:

  • the absence of a functional and meaningful political ads repository;
  • insufficient access for researchers to platform data relevant for studying disinformation and electoral integrity;
  • patterns of algorithmic amplification favouring extremist content or foreign-sponsored narratives.

7.3. Litigation on “overblocking” and free speech

On the other side of the spectrum, we can expect litigation concerning overblocking and alleged censorship. These cases may involve:

  • users or candidates whose content or accounts were removed, suspended or drastically downranked;
  • claims that platforms did not follow DSA procedures (no proper reasoning, no access to internal complaints or dispute mechanisms);
  • arguments based on freedom of expression under the ECHR and the EU Charter.

Over time, such cases will help define the outer limits of what platforms can do under the DSA without infringing fundamental rights. They will also clarify when states can legitimately compel platforms to act against certain content.

8. The lawyer’s role in defending users, candidates and democratic processes

8.1. For ordinary users

For ordinary users, the DSA and Law no. 50/2024 may seem abstract and technical. In reality, they create concrete, enforceable rights:

  • the right to receive an explanation when content is removed or access is restricted;
  • the right to use internal complaint mechanisms and, if necessary, out-of-court dispute settlement bodies;
  • the right to take legal action and seek compensation if platform failures cause harm;
  • the right not to be subject to generalised surveillance or indiscriminate filtering.

Lawyers can assist users by:

  • drafting focused, legally grounded complaints to platforms and regulators;
  • securing evidence (screenshots, logs, copies of communications with the platform, expert opinions);
  • assessing whether a case is suitable for litigation and what remedies are realistically available.

8.2. For candidates and political actors

For candidates, parties and campaign teams, legal advice in the digital sphere is no longer optional. The TikTok–Romania case has shown that online strategies can make or break a political campaign, and that legal risks are multifaceted.

A lawyer can help by:

  • reviewing social media strategies and contracts with agencies and influencers, to ensure compliance with DSA, political advertising rules and national electoral law;
  • designing internal procedures to document online campaigns (including spending, targeting criteria and platform interactions);
  • preparing rapid reaction plans for disinformation attacks or platform moderation incidents;
  • coordinating with experts in cybersecurity, communications and data science to build robust evidence and counter-narratives;
  • representing the client in front of national authorities, the Digital Services Coordinator, the European Commission or courts.

8.3. For journalists, NGOs and researchers

Journalists, NGOs and researchers have a special role under the DSA, particularly through the data access provisions addressed to vetted researchers and the recognition of civil society as key “watchdogs”. Legal expertise can make the difference between a rejected data access request and a successful investigation.

Lawyers can:

  • help formulate data access requests that align with DSA criteria and safeguards;
  • advise on data protection and confidentiality aspects when handling large data sets from platforms;
  • support strategic litigation aiming to open up platforms to meaningful scrutiny and to improve transparency standards.

9. Conclusions: after Romania, no more excuses

The Romanian presidential elections have transformed the DSA and related EU rules from abstract regulatory debates into concrete tools for defending democracy. TikTok’s role in amplifying disinformation and enabling foreign interference showed how vulnerable elections can be when algorithms and opaque advertising systems are left unchecked.

The DSA, Law no. 50/2024 and the new Regulation on political advertising offer a legal framework that:

  • makes platforms responsible for assessing and mitigating systemic risks;
  • provides regulators with stronger enforcement tools and higher sanctions;
  • gives users, candidates and civil society more rights and avenues for redress.

However, laws alone cannot guarantee fair elections. Their effectiveness depends on:

  • robust and independent enforcement by national authorities and the European Commission;
  • willingness of platforms to genuinely cooperate and adjust their business models;
  • active engagement by lawyers, journalists, NGOs and citizens, who must not hesitate to use the tools provided by the DSA and related regulations.

After Romania’s experience, the message is clear: Europe can no longer treat disinformation and platform opacity as secondary issues. They sit at the heart of democratic resilience. For anyone involved in law, politics or digital policy, the TikTok–Romania case will remain a reference point for years to come – a reminder that defending free speech and ensuring platform responsibility are two sides of the same constitutional coin.

FAQ – Digital Services Act, TikTok and the Romanian Elections

1. What is the Digital Services Act (DSA) in simple terms?

The DSA is an EU regulation that sets common rules for online intermediary services such as internet access providers, hosting services, social media platforms and online marketplaces. It aims to make the online environment safer, more transparent and more accountable, without imposing a general monitoring obligation on platforms.

2. How is the DSA implemented in Romania?

The DSA applies directly, but Romania adopted Law no. 50/2024 to organise supervision and enforcement. The law designates ANCOM as Digital Services Coordinator, sets procedures for investigations and sanctions, and clarifies cooperation with other national authorities and with the European Commission.

3. Why is TikTok at the centre of the Romanian elections controversy?

Because a large part of the disinformation and foreign interference campaign during the 2024 Romanian presidential election was channelled through TikTok and other social media platforms. Viral videos, covert advertising and coordinated networks of accounts promoted a far-right, pro-Russian candidate, contributing to an unexpected first-round result and, ultimately, to the annulment of the election results.

4. Is TikTok legally responsible for what happened in the Romanian elections?

The situation is complex. The decision to annul the election results was based on the overall evidence of foreign interference and disinformation, not only on TikTok’s role. Legally, TikTok is under investigation by the European Commission for possible breaches of the DSA, particularly regarding risk assessments, mitigation measures and advertising transparency. Final decisions will determine whether the platform violated EU law and, if so, what sanctions it will face.

5. What does “transparency of political advertising” mean on TikTok and other platforms?

It means that political ads must be clearly labelled as such, that users can see who paid for them and how they were targeted, and that this information is stored in a public repository for a certain period. The DSA and the new Regulation (EU) 2024/900 require platforms to provide this transparency and to give regulators and researchers access to relevant data.

6. Could the fight against disinformation lead to censorship?

Yes, there is a risk of over-moderation, especially if platforms or authorities react in a heavy-handed way. Legitimate political criticism or controversial opinions could be removed or suppressed under the broad label of “disinformation”. That is why the DSA emphasises transparency, procedural safeguards and the possibility to challenge moderation decisions, including in court.

7. What can an ordinary user do if TikTok removes or limits their content unfairly?

Under the DSA, users have the right to receive an explanation, to use internal complaint mechanisms, and to seek out-of-court or judicial redress. They can also file complaints with the Digital Services Coordinator or other competent authorities. Working with a lawyer can help structure these actions and secure the necessary evidence.

8. Why should candidates or political parties work with lawyers on their online campaigns?

Because online campaigns now involve complex interactions between electoral law, the DSA, the political advertising regulation and data protection rules. A lawyer can help design compliant strategies, avoid sanctions, react quickly to disinformation attacks and use all available legal tools – electoral complaints, DSA proceedings, civil or criminal actions – when the integrity of the campaign or of the election is at stake.