The European Union’s Digital Services Act (DSA) comes into force on Saturday 17 February 2024, with the aim of making the online environment safer by ensuring that digital services take responsibility for the content they transmit, store or distribute.
The new legislation will ensure that democratic values, such as individual rights, are better respected in the information environment.
The media giants do not consider whether their content is good for us. For example, recommendation algorithms can lead people ever deeper into conspiracy theories or into content that worsens mental health problems. The spread of misinformation has even been identified as the greatest societal threat of our time (World Economic Forum, 2024), and in Finland the fear of defamation and online harassment has also narrowed freedom of expression.
In the future, platforms must guarantee user safety and combat social risks such as the spread of misinformation and information interference. The regulation also protects the rights of individuals, especially children, on platforms. Content providers will gain rights over their content, and platforms will no longer be able to arbitrarily moderate content or user accounts.
What do these changes mean for each of us? Here are four things that will change from now on.
1. You control your algorithm for major services
From now on, you can adjust how content is recommended to you on very large social services*. This makes the platform less addictive and lets you see posts from the accounts you follow in chronological order.

On very large services, you will also have the option to turn off the recommendation algorithm altogether. The platform must respect your choice and cannot sneakily switch you back to a maximally addictive setting.
*) Very large social services include TikTok, Instagram, YouTube, Facebook, LinkedIn, Snapchat and Twitter/X.
2. Get access to your content and accounts
The change will give European content providers basic rights. Online platforms will no longer be able to delete content or freeze accounts arbitrarily. Content can only be removed if it violates the law or the terms of service, and in that case the platform must inform the content provider and give a clear justification.
The content provider, in turn, has the right to lodge a complaint free of charge and easily, and the complaint must be dealt with as quickly as possible. These complaints cannot be handled automatically, but must be dealt with under the supervision of a qualified person. If the complaint is not resolved in this way, it can be referred to an impartial dispute resolution body and, ultimately, to a district court.
How will the EU ensure that digital giants comply?
Until now, the platforms’ algorithms have been closed to outside scrutiny. The Digital Services Act is prepared for the possibility that platforms will be reluctant to accept the new rules: companies that break the law face hefty fines, up to 6% of their global annual turnover, and can ultimately be banned from operating in the EU.
3. Tackling abuses
In the future, you can be sure that if you report suspected illegal content, such as scams, the online service will have to respond quickly. You can also report anonymously. If the online service does not respond to your report, you can report it to Traficom.
4. Restrictions on advertising targeting
There will be restrictions on targeted advertising based on user profiling. Advertising targeted at children, and targeting based on sensitive personal data, will be completely prohibited.
This means that the profile of minors can no longer be used to target advertising. This is a big change for the platforms favoured by young people.
Even if a user’s profile reveals sensitive data such as health information or religious or political affiliation, that data cannot be used to target advertising. This has particular implications for election advertising.