But the most recent and sweeping move comes from the European Union. The EU’s Digital Services Act, overseen by the European Commission, came into force on August 25 and imposes new rules on platforms, including social impact assessments and measures to prevent addictive design and the targeting of vulnerable groups. The law aims to increase platform accountability and protect minors.
The new rules will also give researchers access to data on company servers. Platforms will be evaluated annually on how their design, algorithms, advertising, and terms of use affect a range of social issues. Depending on the results of each assessment, platforms must then propose and implement remedies vetted by the European Commission, researchers, and auditing firms.
The EU is also working with technology companies, industry associations, and child-focused organizations to find ways to build platforms that better protect minors. Together they aim to develop a set of guidelines, the Age-Appropriate Design Code of Conduct, by 2024. The code will set out the measures that the European Commission will require major social media companies to take under the new law.
However, as The Social Dilemma suggests, there is no one-size-fits-all solution to the complex problems social media poses. A nuanced approach is key, one that focuses on the specific people, platforms, and features most at risk. Moreover, it is reasonable and responsible to demand more data and to apply the precautionary principle where minors are concerned.
Only time will reveal the effectiveness and potential loopholes of these measures, but the law opens a new chapter in the overhaul of social media companies’ practices.
Reviewed by Batuhan Aça, Şeymanur Melayim, Muhammet Ali Oruç, Shahd Qaid, Ezgi Yaramanoğlu.
Written by Dilara Ozer