Digital Services Act: Bad Decisions Can Lead To Global Consequences

The European Union (EU) has proved to be a global trendsetter in internet legislation, but bad decisions made now can have irreversible global impacts. It’s time to set a positive example through the Digital Services Act (DSA).

To support the EU in upholding transparency and accountability, and in establishing and promoting a global standard for platform governance, civil society organisations from across the world formed the Digital Services Act Human Rights Alliance in May 2021. Today, the group is calling on the EU to focus on the protection of fundamental rights, laying the foundations for global human rights-centric lawmaking.

“Decisions around internet legislation made within the EU have global consequences — for better or for worse,” said Eliska Pirkova, Europe Policy Analyst and Global Freedom of Expression Lead at Access Now. “If we don’t get the Digital Services Act right today, we could be watching a wave of dangerous copycat legislation wash over the rest of the globe.”

The standards set by the EU’s DSA will influence platforms’ operations far beyond the EU, and the EU Parliament must prioritise addressing the risks to vulnerable groups and marginalised communities — both within its borders and beyond. This would not be the first time Europe has created a “prototype for global online censorship” — the German Network Enforcement Act (NetzDG) has already served as a model for legislation in Ethiopia, Nigeria, India, and Singapore.

Amongst other recommendations, the coalition is calling for the EU to:

  • Avoid disproportionate demands on smaller providers that would put people’s access to information in serious jeopardy;
  • Not impose legally mandated automated content moderation tools on online platforms, as this will lead to over-removals of legitimate speech; and
  • Consider mandatory human rights impact assessment as the primary mechanism for assessing and mitigating systemic risks stemming from platforms' operations.

Furthermore, mandatory human rights impact assessments must be strengthened in the current DSA proposal, in particular for identifying and assessing systemic risks stemming from the functioning and use of very large online platforms, such as Facebook and Twitter.

Read the full statement.

© Scoop Media
