How will EC plans to reboot rules for digital services impact startups? – TechCrunch


A framework for ensuring fairness in digital marketplaces and tackling abusive behavior online is brewing in Europe, fed by a smorgasbord of issues and ideas, from online safety and the spread of disinformation, to platform accountability, data portability and the fair functioning of digital markets.

European Commission lawmakers are even turning their eye to labor rights, spurred by regional concern over unfair conditions for platform workers.

On the content side, the core question is how to balance individual freedom of expression online against threats to public discourse, safety and democracy from illegal or junk content that can be deployed cheaply, anonymously and at massive scale to pollute genuine public debate.

The age-old conviction that the cure for bad speech is more speech can stumble in the face of such scale. And illegal or harmful content can be a money spinner: the fact that outrage drives engagement, and engagement drives revenue, is an economic incentive that often gets overlooked or edited out of this policy debate.

Certainly the platform giants — whose business models depend on background data-mining of internet users in order to program their content-sorting and behavioral ad-targeting (activity that, notably, remains under regulatory scrutiny in relation to EU data protection law) — prefer to frame what’s at stake as a matter of free speech, rather than bad business models.

But with EU lawmakers opening a wide-ranging consultation about the future of digital regulation, there’s a chance for broader perspectives on platform power to shape the next decades online, and much more besides.

In search of cutting-edge standards

For the past two decades, the EU’s legal framework for regulating digital services has been the e-commerce Directive — a cornerstone law that harmonizes basic principles and bakes in liability exemptions, greasing the groove of cross-border e-commerce.

In recent years, the Commission has supplemented this by applying pressure on big platforms to self-regulate certain types of content, via a voluntary Code of Conduct on illegal hate speech takedowns — and another on disinformation. However, the codes lack legal bite, and lawmakers continue to chastise platforms both for not doing enough and for not being transparent enough about what they are doing.
