ALBAWABA – Tech giant Meta agreed Friday to run a stress test in July ahead of the August deadline for compliance with the European Union’s (EU) new online content rules, known as the Digital Services Act (DSA), which was approved back in October 2022.
The law came into force on November 16, 2022, but Meta has yet to comply, despite repeated warnings from EU industry chief Thierry Breton.
Meta has 1,000 employees working on the new law, Breton said in a tweet, according to Reuters.
"Productive discussion with Meta CEO Mark Zuckerberg in Menlo Park on EU digital rules: DSA, DMA & AI Act," the EU official underlined.

One of the primary objectives of the law is to protect children from targeted content on digital platforms.
The DSA also bans certain types of targeted advertisements on online platforms. These include ads meant for children or ads that use special categories of personal data, such as ethnicity, political views and sexual orientation.
Reuters reached out to Meta, but the company did not immediately respond.
Notably, Meta owns Facebook, Instagram and WhatsApp.
In general, the law aims to protect the rights enshrined in the EU Charter of Fundamental Rights, including the principle of consumer protection, the European Commission's website explained.
It also aims to achieve greater transparency in the sector, adopt better procedures for handling take-down notices and complaints and for informing users, prohibit unethical practices, and give users greater control over the services.
According to the European Commission, the Digital Services Act changes the following:
Measures to counter illegal goods, services or content online. These include mechanisms for users to flag such content and cooperation with “trusted flaggers”.
New obligations to enable traceability of online business users. This helps identify sellers of illegal goods.
New safeguards for users. These include the ability to challenge platforms’ content moderation decisions.
Wide-ranging transparency measures. An example is disclosure of the algorithms used for recommendations.
Obligations for very large platforms to prevent abuse of their systems. Platforms that reach more than 10 percent of the EU’s population must take risk-based action and run independent audits of their risk management systems.
Researchers will have access to the data of key platforms in order to scrutinise how they work.
Codes of conduct and technical standards will assist platforms and other players in their compliance with the new rules.
Other codes will enhance measures taken to ensure accessibility of platforms for people with disabilities, or support further measures on advertising.
All online intermediaries offering their services in the single market, whether they are established in the EU or outside, will have to comply with the new rules.