REPORT with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online – A9-0177/2020

Source: European Parliament

MOTION FOR A EUROPEAN PARLIAMENT RESOLUTION

with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online

(2020/2019(INL))

The European Parliament,

 having regard to Article 225 of the Treaty on the Functioning of the European Union,

 having regard to Article 11 of the Charter of Fundamental Rights of the European Union and Article 10 of the European Convention on Human Rights,

 having regard to Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services[1],

 having regard to Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC[2],

 having regard to Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)[3] (hereinafter referred to as the “General Data Protection Regulation”),

 having regard to the Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive)[4],

 

 having regard to Directive 2008/52/EC of the European Parliament and of the Council of 21 May 2008 on certain aspects of mediation in civil and commercial matters[5],

 

 having regard to the proposal for a Regulation of the European Parliament and of the Council of 6 June 2018 establishing the Digital Europe Programme for the period 2021-2027 (COM(2018)0434),

 having regard to the Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online[6],

 having regard to the Convention on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters[7] and the Convention on the Recognition and Enforcement of Foreign Arbitral Awards, signed on 10 June 1958 in New York,

 having regard to its resolution of 3 October 2018 on distributed ledger technologies and blockchains: building trust with disintermediation[8], 

 having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 19 February 2020 on A European strategy for data (COM(2020)0066),

 having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 19 February 2020 on Shaping Europe’s digital future (COM(2020)0067),

 having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 25 May 2016 on Online Platforms and the Digital Single Market – Opportunities and Challenges for Europe (COM(2016)0288),

 having regard to Rules 47 and 54 of its Rules of Procedure,

 having regard to the opinions of the Committee on the Internal Market and Consumer Protection and of the Committee on Culture and Education,

 having regard to the report of the Committee on Legal Affairs (A9-0177/2020),

 

A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that guarantees fundamental rights and other rights of citizens while supporting development and economic progress, the digital environment and fostering trust online, taking into account the interests of users and all market participants, including SMEs and start-ups;

B. whereas, although some rules regarding online content-sharing providers and audiovisual media services have recently been updated, notably by Directive (EU) 2018/1808 and Directive (EU) 2019/790, a number of key civil and commercial law aspects have not been addressed satisfactorily in Union or national law, and whereas the importance of this issue has been accentuated by rapid and accelerating development over the last decades in the field of digital services, in particular the emergence of new business models, technologies and social realities; whereas in this context, a comprehensive updating of the essential provisions of civil and commercial law applicable to online commercial entities is required;

C. whereas some businesses offering digital services enjoy, due to strong data-driven network effects, significant market power that enables them to impose their business practices on users and makes it increasingly difficult for other players, especially start-ups and SMEs, to compete and for new businesses to even enter the market;

D. whereas ex-post competition law enforcement alone cannot effectively address the impact of the market power of certain online platforms, including on fair competition in the Digital Single Market;

E. whereas content hosting platforms have evolved from merely displaying content into sophisticated organisms and market players, in particular social networks that harvest and exploit usage data; whereas users have legitimate grounds to expect fair terms with respect to access, transparency, pricing and conflict resolution for the usage of such platforms and for the use that platforms make of the users’ data; whereas transparency can contribute to significantly increasing trust in digital services;

 

F. whereas content hosting platforms may determine what content is shown to their users, thereby profoundly influencing the way we obtain and communicate information, to the point that content hosting platforms have de facto become public spaces in the digital sphere; whereas public spaces must be managed in a manner that protects public interests and respects fundamental rights and the civil law rights of users, in particular the right to freedom of expression and information;

G. whereas upholding the law in the digital world involves not only the effective enforcement of fundamental rights, in particular freedom of expression and information, privacy, safety and security, non-discrimination, respect for property and intellectual property rights, but also access to justice and due process; whereas delegating decisions regarding the legality of content or of law enforcement powers to private companies undermines transparency and due process, leading to a fragmented approach; whereas a fast-track legal procedure with adequate guarantees is therefore required to ensure that effective remedies exist;

H. whereas automated tools are currently unable to reliably differentiate illegal content from content that is legal in a given context, and whereas mechanisms for the automatic detection and removal of content can therefore raise legitimate legal concerns, in particular as regards possible restrictions of freedom of expression and information, protected under Article 11 of the Charter of Fundamental Rights of the European Union; whereas the use of automated mechanisms should, therefore, be proportionate, covering only justified cases, and following transparent procedures;

I. whereas Article 11 of the Charter of Fundamental Rights of the European Union also protects the freedom and pluralism of the media, which are increasingly dependent on online platforms to reach their audiences;

J. whereas digital services are used by the majority of Europeans on a daily basis, but are subject to an increasingly wide set of rules across the Union leading to significant fragmentation on the market and consequently legal uncertainty for European users and services operating across borders; whereas the civil law regimes governing content hosting platforms’ practices in content moderation are based on certain sector-specific provisions at Union and national level with notable differences in the obligations imposed and in the enforcement mechanisms of the various civil law regimes deployed; whereas this situation has led to a fragmented set of rules for the Digital Single Market, which requires a response at Union level;

K. whereas the current business model of certain content hosting platforms is to promote content that is likely to attract the attention of users and therefore generate more profiling data in order to offer more effective targeted advertisements and thereby increase profit; whereas this profiling coupled with targeted advertisement can lead to the amplification of content geared towards exploiting emotions, often encouraging and facilitating sensationalism in news feed and recommendation systems, resulting in the possible manipulation of users;

L. whereas offering users contextual advertisements requires less user data than targeted behavioural advertising and is thus less intrusive;

M. whereas the choice of algorithmic logic behind recommendation systems, comparison services, content curation or advertisement placements remains at the discretion of the content hosting platforms with little possibility for public oversight, which raises accountability and transparency concerns;

N. whereas content hosting platforms with significant market power make it possible for their users to use their profiles to log into third-party websites, thereby allowing those platforms to track their users’ activities even outside their own platform environment, which constitutes a competitive advantage in access to data for content curation algorithms;

O. whereas so-called smart contracts, which are based on distributed ledger technologies, including blockchains, that enable decentralised and fully traceable record-keeping and self-execution to occur, are being used in a number of areas without a proper legal framework; whereas there is uncertainty concerning the legality of such contracts and their enforceability in cross-border situations; 

P. whereas the non-negotiable terms and conditions of platforms often indicate both applicable law and competent courts outside the Union, which may impede access to justice; whereas Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters[9] lays down rules on jurisdiction; whereas the General Data Protection Regulation clarifies the data subject’s right to private enforcement action directly against the controller or processor, regardless of whether the processing takes place in the Union or not and regardless of whether the controller is established in the Union or not; whereas Article 79 of the General Data Protection Regulation stipulates that proceedings shall be brought before the courts of the Member State where the controller or processor has an establishment or, alternatively, where the data subject has his or her habitual residence;

Q. whereas access to and mining of non-personal data is an important factor in the growth of the digital economy; whereas appropriate legal standards and data protection safeguards regarding the interoperability of data can, by removing lock-in effects, play an important part in ensuring fair market conditions;

R. whereas it is important to assess the possibility of tasking a European entity with the responsibility of ensuring a harmonised approach to the implementation of the Digital Services Act across the Union, facilitating coordination at national level as well as addressing the new opportunities and challenges, in particular those of a cross-border nature, arising from ongoing technological developments;

Digital Services Act

 

1. Requests that the Commission submit without undue delay a set of legislative proposals constituting a Digital Services Act with an adequate material, personal and territorial scope, defining key concepts and including the recommendations as set out in the Annex to this resolution; is of the view that without prejudice to detailed aspects of the future legislative proposals, Article 114 of the Treaty on the Functioning of the European Union should be the legal basis;

 

2. Proposes that the Digital Services Act include a regulation that establishes contractual rights as regards content management, lays down transparent, fair, binding and uniform standards and procedures for content moderation, and guarantees accessible and independent recourse to judicial redress; stresses that legislative proposals should be evidence-based and seek to remove current and prevent potentially new unjustified barriers in the supply of digital services by online platforms while enhancing the protection of consumers and citizens; believes that the legislative proposals should aim at achieving sustainable and smart growth, address technological challenges, and ensure that the Digital Single Market is fair and safe for everyone;

3.  Further suggests that the measures proposed for content moderation only apply to illegal content rather than content that is merely harmful; suggests, to this end, that the regulation include universal criteria to determine the market power of platforms in order to provide a clear definition of what constitutes a platform with significant market power and thereby determine whether certain content hosting platforms that do not hold significant market power can be exempted from certain provisions; underlines that the framework established by the Digital Services Act should be manageable for small businesses, SMEs and start-ups and should therefore include proportionate obligations for all sectors;

4.  Proposes that the Digital Services Act impose an obligation on digital service providers who are established outside the Union to designate a legal representative for the interest of users within the Union, to whom requests could be addressed in order, for example, to allow for consumer redress in the case of false or misleading advertisements, and to make the contact information of that representative visible and accessible on the website of the digital service provider;

Rights as regards content moderation

 

5. Stresses that the responsibility for enforcing the law must rest with public authorities; considers that the final decision on the legality of user-generated content must be made by an independent judiciary and not a private commercial entity;

 

6. Insists that the regulation must prohibit content moderation practices that are discriminatory or entail exploitation and exclusion, especially towards the most vulnerable, and must always respect the fundamental rights and freedoms of users, in particular their freedom of expression;

7.  Stresses the necessity to better protect consumers by providing reliable and transparent information on examples of malpractice, such as the making of misleading claims and scams;

 

8. Recommends that the application of the regulation should be closely monitored by a European entity tasked with ensuring compliance by content hosting platforms with the provisions of the regulation, in particular by monitoring compliance with the standards laid down for content management on the basis of transparency reports and monitoring algorithms employed by content hosting platforms for the purpose of content management; calls on the Commission to assess the options of appointing an existing or new European Agency or European body or of coordinating itself a network of national authorities to carry out these tasks (hereinafter referred to as “the European entity”);

9. Suggests that content hosting platforms regularly submit comprehensive transparency reports based on a consistent methodology and assessed on the basis of relevant performance indicators, including on their content policies and the compliance of their terms and conditions with the provisions of the Digital Services Act, to the European entity; further suggests that content hosting platforms publish and make available in an easy and accessible manner those reports as well as their content management policies on a publicly accessible database;

10. Calls for content hosting platforms with significant market power to evaluate the risk that their content management policies of legal content pose to society, in particular with regard to their impact on fundamental rights, and to engage in a biannual dialogue with the European entity and the relevant national authorities on the basis of a presentation of transparency reports;

11. Recommends that the Member States provide for independent dispute settlement bodies, tasked with settling disputes regarding content moderation; takes the view that in order to protect anonymous publications and the general interest, not only the user who uploaded the content that is the object of a dispute but also a third party, such as an ombudsperson, with a legitimate interest in acting should be able to challenge content moderation decisions; affirms the right of users to further recourse to justice;

12. Takes the firm position that the Digital Services Act must not oblige content hosting platforms to employ any form of fully automated ex-ante controls of content unless otherwise specified in existing Union law, and considers that mechanisms voluntarily employed by platforms must not lead to ex-ante control measures based on automated tools or upload-filtering of content and must be subject to audits by the European entity to ensure that there is compliance with the Digital Services Act;

13. Stresses that content hosting platforms must be transparent in the processing of algorithms and of the data used to train them;

Rights as regards content curation, data and online advertisements

14. Considers that the user-targeted amplification of content based on the views or positions presented in such content is one of the most detrimental practices in the digital society, especially in cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements; is concerned that such practices rely on pervasive tracking and data mining; calls on the Commission to analyse the impact of such practices and take appropriate legislative measures;

15. Is of the view that the use of targeted advertising must be regulated more strictly in favour of less intrusive forms of advertising that do not require any tracking of user interaction with content and that being shown behavioural advertising should be conditional on users’ freely given, specific, informed and unambiguous consent;

16.  Notes the existing provisions addressing targeted advertising in the General Data Protection Regulation and Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications)[10];

17. Recommends, therefore, that the Digital Services Act set clear boundaries and introduce transparency rules as regards the terms for accumulation of data for the purpose of offering targeted advertisements as well as regards the functioning and accountability of such targeted advertisement, especially when data are tracked on third-party websites; maintains that new measures establishing a framework for Platform-to-Consumers relations are needed as regards transparency provisions on advertising, digital nudging and preferential treatment; invites the Commission to assess options for regulating targeted advertising, including a phase-out leading to a prohibition;

18. Stresses that in line with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data, the Digital Services Act should provide for the right to use digital services anonymously wherever technically possible; calls on the Commission to require content hosting platforms to verify the identity of those advertisers with which they have a commercial relationship to ensure accountability of advertisers in the event content promoted is found to be illegal; recommends therefore that the Digital Services Act include legal provisions preventing platforms from commercially exploiting third-party data in situations of competition with those third parties;

19. Regrets the existing information asymmetry between content hosting platforms and public authorities and calls for a streamlined exchange of necessary information; stresses that in line with the case law on communications metadata, public authorities must be given access to a user’s metadata only to investigate suspects of serious crime and with prior judicial authorisation;

20. Recommends that providers which support a single sign-on service with significant market power should be required to also support at least one open and decentralised identity system based on a non-proprietary framework; asks the Commission to propose common Union standards for national systems provided by Member States, especially as regards data protection standards and cross-border interoperability;

 

21. Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate data sharing and increase transparency with the aim of addressing imbalances in market power; suggests, to this end, to explore options to facilitate the interoperability, interconnectivity and portability of data; points out that data sharing should be accompanied by adequate and appropriate safeguards, including effective anonymisation of personal data;

22. Recommends that the Digital Services Act require platforms with significant market power to provide an application programming interface, through which third-party platforms and their users can interoperate with the main functionalities and users of the platform providing the application programming interface, including third-party services designed to enhance and customise the user experience, especially through services that customise privacy settings as well as content curation preferences; suggests that platforms publicly document all application programming interfaces they make available for the purpose of allowing for the interoperability and interconnectivity of services;
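For illustration only, the sketch below shows one way a publicly documented application programming interface of the kind described in paragraph 22 could be expressed; it is a minimal, hypothetical example in TypeScript, and none of the function names or data shapes are taken from the resolution or from any existing platform.

```typescript
// Hypothetical sketch of an interoperability API surface; illustrative only.

/** Minimal representation of a piece of hosted content. */
interface ContentItem {
  id: string;
  authorId: string;
  body: string;
  publishedAt: string; // ISO 8601 timestamp
}

/** User-controlled curation preferences (e.g. chronological feed, muted topics). */
interface CurationPreferences {
  ordering: "chronological" | "curated";
  mutedTopics: string[];
}

/**
 * Functions a platform with significant market power might document publicly
 * so that third-party services can interoperate with its main functionalities.
 */
interface InteroperabilityApi {
  /** Read the public content a user has chosen to share. */
  fetchPublicContent(userId: string, since?: string): Promise<ContentItem[]>;

  /** Post content on behalf of a user who has authorised the third-party service. */
  publishContent(userId: string, body: string): Promise<ContentItem>;

  /** Let an authorised third-party service adjust the user's curation preferences. */
  setCurationPreferences(userId: string, prefs: CurationPreferences): Promise<void>;

  /** Let an authorised third-party service adjust a named privacy setting. */
  setPrivacySetting(userId: string, setting: string, enabled: boolean): Promise<void>;
}
```

Publicly documenting such an interface, as the paragraph envisages, would let third-party services customise privacy settings and content curation without privileged access to the platform's internal systems.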

23. Is strongly of the view, on the other hand, that platforms with significant market power providing an application programming interface must not share, retain, monetise or use any of the data they receive from third-party services;

 

24. Stresses that interoperability and interconnectivity obligations must not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue suspension of the application programming interface providing interoperability and interconnectivity;

 

25. Recalls that the provisions on interoperability and interconnectivity must respect all relevant data protection laws; recommends, in this respect, that platforms be required by the Digital Services Act to ensure the technical feasibility of the data portability provisions laid down in Article 20(2) of the General Data Protection Regulation;

 

26. Calls for content hosting platforms to give users a real choice of whether or not to give prior consent to being shown targeted advertising based on the user’s prior interaction with content on the same content hosting platform or on third-party websites; underlines that this choice must be presented in a clear and understandable way and its refusal must not lead to access to the functionalities of the platform being disabled; stresses that consent in targeted advertising must not be considered as freely given and valid if access to the service is made conditional on data processing; reconfirms that the Directive 2002/58/EC of the European Parliament and of the Council[11] makes targeted advertising subject to an opt-in decision and that it is otherwise prohibited; notes that since the online activities of an individual allow for deep insights into their behaviour and make it possible to manipulate them, the general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy; confirms that users have a right not to be subject to pervasive tracking when using digital services;

27.  Asks the Commission to ensure that, in the same spirit, consumers can still use a connected device for all its functions, even if consumers withdraw or do not give their consent to share non-operational data with the device manufacturer or third parties; reiterates the need for transparency in contract terms and conditions regarding the possibility and scope of data sharing with third parties;

 

28. Further calls for users to be guaranteed an appropriate degree of transparency and influence over the criteria according to which content is curated and made visible for them; affirms that this should also include the option to opt out from any content curation other than chronological order; points out that application programming interfaces provided by platforms should allow users to have content curated by software or services of their choice;

29. Underlines the importance for the Digital Services Act to provide legally sound and effective protection of children in the online environment, whilst refraining from imposing general monitoring or filtering obligations and ensuring full coordination and avoiding duplication with the General Data Protection Regulation and the Audiovisual Media Services Directive;

30. Recalls that paid advertisements or paid placement of sponsored content should be identified in a clear, concise and intelligible manner; suggests that platforms should disclose the origin of paid advertisements and sponsored content; suggests, to this end, that content hosting platforms publish all sponsored content and advertisements and make them clearly visible to their users in an advertising archive that is publicly accessible, indicating who has paid for them, and, if applicable, on behalf of whom; stresses that this includes both direct and indirect payments or any other remuneration received by service providers;

31. Believes that, if relevant data shows a significant gap in misleading advertising practices and enforcement between platforms based in the Union and platforms based in third countries, it is reasonable to consider further options to ensure compliance with the laws in force within the Union; stresses the need for a level playing field between advertisers from the Union and advertisers from third countries;

Provisions regarding terms and conditions, smart contracts and blockchains, and private international law

32.  Notes the rise of so-called smart contracts such as those based on distributed ledger technologies without a clear legal framework;

 

33. Calls on the Commission to assess the development and use of distributed ledger technologies, including blockchain and, in particular, smart contracts, to provide guidance to ensure legal certainty for businesses and consumers, in particular on the questions of legality, the enforcement of smart contracts in cross-border situations and notarisation requirements where applicable, and to make proposals for the appropriate legal framework;

34. Underlines that the fairness of terms and conditions imposed by intermediaries on the users of their services, and their compliance with fundamental rights standards, must be subject to judicial review; stresses that terms and conditions unduly restricting users’ fundamental rights, such as the right to privacy and to freedom of expression, should not be binding;

 

35. Requests that the Commission examine modalities to ensure appropriate balance and equality between the parties to smart contracts by taking into account the private concerns of the weaker party or public concerns such as those related to cartel agreements; emphasises the need to ensure that the rights of creditors in insolvency and restructuring procedures are respected; strongly recommends that smart contracts include mechanisms that can halt and reverse their execution and related payments;
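By way of illustration, a minimal sketch of what a halt-and-reverse mechanism for a smart contract might look like follows; it is hypothetical TypeScript, not based on any existing ledger or smart contract platform, and is intended only to make the safeguards requested in paragraph 35 concrete.

```typescript
// Hypothetical sketch of a reversible, haltable smart contract; illustrative only.

type PaymentState = "pending" | "executed" | "reversed";

class ReversibleSmartContract {
  private state: PaymentState = "pending";
  private halted = false;
  private ledger: string[] = []; // audit trail of every state change

  constructor(
    private payer: string,
    private payee: string,
    private amount: number,
  ) {}

  /** Execute the payment unless a halt has been ordered (e.g. by a dispute settlement body). */
  execute(): void {
    if (this.halted || this.state !== "pending") return;
    this.state = "executed";
    this.ledger.push(`executed: ${this.amount} from ${this.payer} to ${this.payee}`);
  }

  /** Suspend further execution, e.g. pending dispute settlement. */
  halt(reason: string): void {
    this.halted = true;
    this.ledger.push(`halted: ${reason}`);
  }

  /** Reverse an executed payment, e.g. to respect creditors' rights in insolvency proceedings. */
  reverse(reason: string): void {
    if (this.state !== "executed") return;
    this.state = "reversed";
    this.ledger.push(`reversed: ${reason}`);
  }

  auditTrail(): readonly string[] {
    return this.ledger;
  }
}

// Usage: execution is suspended before any payment is made.
const contract = new ReversibleSmartContract("buyer", "seller", 100);
contract.halt("pending independent dispute settlement");
contract.execute(); // no effect while halted
console.log(contract.auditTrail());
```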

 

36. Requests that the Commission, in particular, update its existing guidance document on Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights[12] in order to clarify whether it considers smart contracts to fall within the exemption in point (l) of Article 3(3) of that Directive and, if so, under which circumstances, and to clarify the issue of the right of withdrawal;

 

37. Stresses the need for blockchain technologies, and smart contracts in particular, to be utilised in accordance with antitrust rules and requirements, including those prohibiting cartel agreements or concerted practices;

38. Considers that standard terms and conditions should not prevent effective access to justice in Union courts or disenfranchise Union citizens or businesses; calls on the Commission to assess whether the protection of access rights to data under private international law is uncertain and leads to disadvantages for Union citizens and businesses;

39. Emphasises the importance of ensuring that the use of digital services in the Union is fully governed by Union law under the jurisdiction of Union courts;

40. Concludes further that legislative solutions to these issues ought to be found at Union level if action at the international level does not seem feasible, or if there is a risk of such action taking too long to come to fruition;

41. Stresses that service providers established in the Union must not be required to remove or disable access to information that is legal in their country of origin;

o

o o

 

42. Instructs its President to forward this resolution and the accompanying detailed recommendations to the Commission and the Council.

 

 

ANNEX TO THE MOTION FOR A RESOLUTION: DETAILED RECOMMENDATIONS AS TO THE CONTENT OF THE PROPOSAL REQUESTED

A. PRINCIPLES AND AIMS OF THE PROPOSAL REQUESTED

THE KEY PRINCIPLES AND AIMS OF THE PROPOSAL:

 The proposal sets out both acts that should be included in the Digital Services Act and acts that are ancillary to the Digital Services Act.

 The proposal aims to strengthen civil and commercial law rules applicable to commercial entities operating online with respect to digital services.

 The proposal aims to strengthen and bring clarity on the contractual rights of users in relation to content moderation and curation.

 The proposal aims to further address inadmissible and unfair terms and conditions used for the purpose of digital services.

 The proposal addresses the issue of aspects of data collection being in contravention of fair contractual rights of users as well as data protection and online confidentiality rules.

 The proposal addresses the importance of fair implementation of the rights of users as regards interoperability and portability.

 The proposal raises the importance of private international law rules that provide legal clarity on the non-negotiable terms and conditions used by online platforms, as well as of ensuring the right to access data and guaranteeing access to justice.

 The proposal does not address aspects related to the regulation of online market places, which should nevertheless be considered by the Digital Services Act Package to be proposed by the Commission.

 The proposal raises the need for assessment of the necessity of proper regulation of civil and commercial law aspects in the field of distributed ledger technologies, including blockchains and, in particular, addresses the necessity of the proper regulation of civil and commercial law aspects of smart contracts.

I. PROPOSALS TO BE INCLUDED IN THE DIGITAL SERVICES ACT

The key elements of the proposals to be included in the Digital Services Act should be:

A regulation on contractual rights as regards content management that contains the following elements:

 It should apply to content management, including content moderation and curation, with regard to content accessible in the Union.

 It should provide proportionate principles for content moderation.

 It should provide formal and procedural standards for a notice and action mechanism which are proportionate to the platform and the nature and impact of the harm, effective, and future-proof.

 It should provide for an independent dispute settlement mechanism in the Member States without limiting access to judicial redress.

 It should indicate a set of clear indicators to define the market power of content hosting platforms in order to determine whether certain content hosting platforms that do not hold significant market power can be exempted from certain provisions. Such indicators could include the size of the platform’s network (number of users), its financial strength, access to data, the degree of vertical integration, or the presence of lock-in effects.

 It should provide rules regarding the responsibility of content hosting platforms for goods sold or advertised on them, taking into account supporting activities for SMEs in order to minimise their burden when adapting to this responsibility.

 It should make a clear distinction between illegal and harmful content when it comes to applying the appropriate policy options. In this regard, any measure in the Digital Services Act should concern only illegal content as defined in Union and national law.

 It should be based upon established principles as regards determining the law applicable to compliance with administrative law, and should – in light of the increasing convergence of user rights – clearly state that all aspects within its scope are governed by those principles.

 It should fully respect the Charter of Fundamental Rights of the European Union, as well as Union rules protecting users and their safety, privacy and personal data, as well as other fundamental rights.

 It should provide for a dialogue between content hosting platforms with significant market power and the European entity on the risk management of content management policies of legal content.

The Commission should consider options for a European entity tasked with ensuring compliance with the provisions of the proposal through the following measures:

 regular monitoring of the algorithms employed by content hosting platforms for the purpose of content management;

 regular review of the compliance of content hosting platforms with the provisions of the regulation, on the basis of transparency reports provided by the content hosting platforms and the public database of decisions on removal of content to be established by the Digital Services Act;

 working with content hosting platforms on best practices to meet the transparency and accountability requirements for terms and conditions, as well as best practices in content moderation and implementing notice-and-action procedures;

 cooperating and coordinating with the national authorities of Member States as regards the implementation of the Digital Services Act;

 managing a dedicated fund to assist the Member States in financing the operating costs of the independent dispute settlement bodies described in the regulation, funded by fines imposed on content hosting platforms for non-compliance with the provisions of the Digital Services Act as well as a contribution by content hosting platforms with significant market power;

 imposing fines for non-compliance with the Digital Services Act. The fines should contribute to the special dedicated fund intended to assist the Member States in financing the operating costs of the dispute settlement bodies described in the regulation. Instances of non-compliance should include:

o failure to implement the provisions of the regulation;

o failure to provide transparent, accessible, fair and non-discriminatory terms and conditions;

o failure to provide the European entity with access to content management algorithms for review;

o failure to submit transparency reports to the European entity;

 publishing biannual reports on all of its activities and reporting to Union institutions.

Transparency reports regarding content management should be established as follows:

The Digital Services Act should contain provisions requiring content hosting platforms to regularly publish and provide transparency reports to the European entity. Such reports should be comprehensive, following a consistent methodology, and should include in particular:

 information on notices processed by the content hosting platform, including the following:

o the total number of notices received, for which types of content, and the action taken accordingly;

o the number of notices received per category of submitting entity, such as private individuals, public authorities or private undertakings;

o the total number of removal requests complied with and the total number of referrals of content to competent authorities;

o the total number of counter-notices or appeals received as well as information on how they were resolved;

o the average lapse of time between publication, notice, counter-notice and action;

 information on the number of staff employed for content moderation, their location, education, and language skills, as well as any algorithms used to take decisions;

 information on requests for information by public authorities, such as those responsible for law enforcement, including the numbers of fully complied with requests and requests that were not or only partially complied with;

 information on the enforcement of terms and conditions and information on the court decisions ordering the annulment and/or modification of terms and conditions considered illegal by a Member State.

Content hosting platforms should, in addition, publish their decisions on content removal on a publicly accessible database to increase transparency for users.

The independent dispute settlement bodies to be established by the regulation should issue reports on the number of referrals brought before them, including the number of referrals given heed to.

II. PROPOSALS ANCILLARY TO THE DIGITAL SERVICES ACT

Measures regarding content curation, data and online advertisements in breach of fair contractual rights of users should include:

  Measures to minimise the data collected by content hosting platforms, based on interactions of users with content hosted on content hosting platforms, for the purpose of completing targeted advertising profiles, in particular by imposing strict conditions for the use of targeted personal advertisements and by requiring freely given, specific, informed and unambiguous prior consent of the user. Consent to targeted advertising shall not be considered as freely given and valid if access to the service is made conditional on data processing.

 Users of content hosting platforms shall be informed if they are subject to targeted advertising, given access to their profile built by content hosting platforms and the possibility to modify it, and given the choice to opt in or out and withdraw their consent to be subject to targeted advertisements.

 Content hosting platforms should make available an archive of sponsored content and advertisements that were shown to their users, including the following:

o whether the sponsored content or sponsorship is currently active or inactive;

o the timespan during which the sponsored content or advertisement was active;

o the name and contact details of the sponsor or advertiser, and, if different, on behalf of whom the sponsored content or advertisement was placed;

o the total number of users reached;

o information on the group of users targeted.

The path to fair implementation of the rights of users as regards interoperability, interconnectivity and portability should include:

 an assessment of the possibility of defining fair contractual conditions to facilitate data sharing with the aim of addressing imbalances in market power, in particular through the interoperability, interconnectivity and portability of data.

 a requirement for platforms with significant market power to provide an application programming interface, through which third-party platforms and their users can interoperate with the main functionalities and users of the platform providing the application programming interface, including third-party services designed to enhance and customise the user experience, especially through services that customise privacy settings as well as content curation preferences;

 provisions ensuring that platforms with significant market power providing an application programming interface may not share, retain, monetise or use any of the data they receive from third-party services;

 provisions ensuring that the interoperability and interconnectivity obligations may not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue suspension of the application programming interface providing interoperability and interconnectivity;

 provisions ensuring that platforms be required by the Digital Services Act to ensure the technical feasibility of the data portability provisions laid down in Article 20(2) of the General Data Protection Regulation;

 provisions ensuring that content hosting platforms with significant market power publicly document all application programming interfaces they make available for the purpose of allowing for the interoperability and interconnectivity of services.

The path to the proper regulation of civil and commercial law aspects of distributed ledger technologies, including blockchains and, in particular, smart contracts should comprise:

 

 measures ensuring that the proper legislative framework is in place for the development and deployment of digital services including distributed ledger technologies, such as blockchains and smart contracts;

 

 measures ensuring that smart contracts are fitted with mechanisms that can halt and reverse their execution, in particular given private concerns of the weaker party or public concerns such as those related to cartel agreements and in respect for the rights of creditors in insolvency and restructuring procedures;

 

 measures to ensure appropriate balance and equality between the parties to smart contracts, taking into account, in particular, the interest of small businesses and SMEs, for which the Commission should examine possible modalities;

 

 an update of the existing guidance document on Directive 2011/83/EU in order to clarify whether smart contracts fall within the exemption in point (l) of Article 3(3) of that Directive as well as issues related to cross-border transactions, notarisation requirements and the right of withdrawal;

The path to equitable private international law rules that do not deprive users of access to justice should:

 ensure that standard terms and conditions do not include provisions regulating private international law matters to the detriment of access to justice, in particular through the effective enforcement of existing measures in this regard;

 include measures clarifying private international law rules concerning the activities of platforms regarding data, so that they are not detrimental to Union subjects; 

 build on multilateralism and, if possible, be agreed in the appropriate international fora.

Only where it proves impossible to achieve a solution based on multilateralism in reasonable time, should measures applied within the Union be proposed, in order to ensure that the use of digital services in the Union is fully governed by Union law under the jurisdiction of Union courts.

B. TEXT OF THE LEGISLATIVE PROPOSAL REQUESTED

Proposal for a

REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL

on contractual rights as regards content management

Having regard to the Treaty on the Functioning of the European Union, and in particular Article 114 thereof,

Having regard to the proposal from the European Commission,

After transmission of the draft legislative act to the national parliaments,

Having regard to the opinion of the European Economic and Social Committee,

Acting in accordance with the ordinary legislative procedure,

Whereas:

(1) The terms and conditions that digital service providers apply in relations with users are often non-negotiable and can be unilaterally amended by those providers. Action at a legislative level is needed to put in place minimum standards for such terms and conditions, in particular as regards procedural standards for content management;

(2) The civil law regimes governing the practices of content hosting platforms as regards content moderation are based on certain sector-specific provisions at Union level as well as on laws passed by Member States at national level, and there are notable differences in the obligations imposed by those civil law regimes on content hosting platforms and in their enforcement mechanisms.

 

(3) The resulting fragmentation of civil law regimes governing content moderation by content hosting platforms not only creates legal uncertainties, which might lead such platforms to adopt stricter practices than necessary in order to minimise the risks brought about by the use of their service, but also leads to a fragmentation of the Digital Single Market, which hinders growth and innovation and the development of European businesses in the Digital Single Market.

 

(4) Given the detrimental effects of the fragmentation of the Digital Single Market, and the resulting legal uncertainty for businesses and consumers, the international character of content hosting, the vast amount of content requiring moderation, and the significant market power of a few content hosting platforms located outside the Union, the various issues that arise in respect of content hosting need to be regulated in a manner that entails full harmonisation and therefore by means of a regulation.

(5) Concerning relations with users, this Regulation should lay down minimum standards for the fairness, transparency and accountability of terms and conditions of content hosting platforms. Terms and conditions should be clear, accessible, intelligible and unambiguous and include fair, transparent, binding and uniform standards and procedures for content moderation, which should guarantee accessible and independent recourse to judicial redress and comply with fundamental rights.

(6) User-targeted amplification of content based on the views or positions presented in such content is one of the most detrimental practices in the digital society, especially in cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements.

(7) Algorithms that decide on the ranking of search results influence individual and social communications and interactions and can be opinion-forming, especially in the case of media content.

(8) In order to ensure, inter alia, that users can assert their rights they should be given an appropriate degree of transparency and influence over the curation of content made visible to them, including the possibility to opt out of any content curation other than chronological order altogether. In particular, users should not be subject to curation without freely given, specific, informed and unambiguous prior consent. Consent in targeted advertising should not be considered as freely given and valid if access to the service is made conditional on data processing.

(9) Consent given in a general manner by a user to the terms and conditions of content hosting platforms or to any other general description of the rules relating to content management by content hosting platforms should not be taken as sufficient consent for the display of automatically curated content to the user.

(10) This Regulation should not oblige content hosting platforms to employ any form of automated ex-ante control of content, unless otherwise specified in existing Union law, and should provide that content moderation procedures used voluntarily by platforms do not lead to ex-ante control measures based on automated tools or upload-filtering of content.

(11) This Regulation should also include provisions against discriminatory content moderation practices, exploitation or exclusion for content moderation purposes, especially when user-created content is removed based on appearance, ethnic origin, gender, sexual orientation, religion or belief, disability, age, pregnancy or upbringing of children, language or social class.

 (12) The right to issue a notice pursuant to this Regulation should remain with any natural or legal person, including public bodies, to which content is provided through a website or application.

 (13) After a notice has been issued, the uploader should be informed thereof by the content hosting platform and in particular about the reason for the notice and for the action to be taken, and should be provided information about the procedure, including about appeal and referral to independent dispute settlement bodies, and about available remedies in the event of false notices. Such information should, however, not be given if the content hosting platform has been informed by public authorities about ongoing law enforcement investigations. In such case, it should be for the relevant authorities to inform the uploader about the issue of a notice, in accordance with applicable rules.

(14) All concerned parties should be informed about a decision as regards a notice. The information provided to concerned parties should also include, apart from the outcome of the decision, at least the reason for the decision and whether the decision was made solely by a human, as well as relevant information regarding review or redress.

(15)  Content should be considered as manifestly illegal if it is unmistakably and without requiring in-depth examination in breach of legal provisions regulating the legality of content on the internet.

(16) Given the immediate nature of content hosting and the often ephemeral purpose of content uploading, it is necessary to provide independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse. Such bodies should be competent to adjudicate disputes concerning the legality of user-uploaded content and the correct application of terms and conditions. However, that process should not prevent the user from having the right of access to justice and further judicial redress.

(17) The establishment of independent dispute settlement bodies could relieve the burden on courts, by providing a fast resolution of disputes over content management decisions without prejudice to the right to judicial redress before a court. Given that content hosting platforms which enjoy significant market power can particularly gain from the introduction of independent dispute settlement bodies, it is appropriate that they contribute to the financing of such bodies. This fund should be independently managed by the European entity in order to assist the Member States in financing the running costs of the independent dispute settlement bodies. Member States should ensure that such bodies are provided with adequate resources to ensure their competence and independence.

(18) Users should have the right to referral to a fair and independent dispute settlement body, as an alternative dispute settlement mechanism, to contest a decision taken by a content hosting platform following a notice concerning content they uploaded. Notifiers should have that right if they would have legal standing in a civil procedure regarding the content in question.

(19) As regards jurisdiction, the competent independent dispute settlement body should be that located in the Member State in which the content forming the subject of the dispute has been uploaded. It should always be possible for natural persons to bring complaints to the independent dispute settlement body of their Member State of residence.

(20) Whistleblowing helps to prevent breaches of law and detect threats or harm to the general interest that would otherwise remain undetected. Providing protection for whistleblowers plays an important role in protecting freedom of expression, media freedom and the public’s right to access information. Directive (EU) 2019/1937 of the European Parliament and of the Council[13] should therefore apply to the relevant breaches of this Regulation. Accordingly, that Directive should be amended. 

(21) This Regulation should include obligations to report on its implementation and to review it within a reasonable time. For this purpose, the independent dispute settlement bodies provided for by Member States under this Regulation should submit reports on the number of referrals brought before them, the decisions taken – anonymising personal data as appropriate – including the number of referrals dealt with, data on systemic problems, trends and the identification of platforms not complying with decisions of independent dispute settlement bodies.

(22) Since the objective of this Regulation, namely to establish a regulatory framework for contractual rights as regards content management in the Union, cannot be sufficiently achieved by the Member States but can rather, by reason of its scale and effects, be better achieved at Union level, the Union may adopt measures, in accordance with the principle of subsidiarity as set out in Article 5 of the Treaty on European Union. In accordance with the principle of proportionality, as set out in that Article, this Regulation does not go beyond what is necessary in order to achieve that objective.

(23) Action at Union level as set out in this Regulation would be substantially enhanced by a European entity tasked with appropriate monitoring and ensuring compliance by content hosting platforms with the provisions of this Regulation. For this purpose, the Commission should consider the options of appointing an existing or new European Agency or European body or coordinating a network of national authorities in order to review compliance with the standards laid down for content management on the basis of transparency reports and the monitoring of algorithms employed by content hosting platforms for the purpose of content management (hereinafter referred to as ‘the European entity’).

 

(24) In order to ensure that the risks presented by content amplification are evaluated, a biannual dialogue on the impact of content management policies of legal content on fundamental rights should be established between content hosting platforms with significant market power and the European entity together with relevant national authorities.

(25) This Regulation respects all fundamental rights and observes the freedoms and principles recognised in the Charter of Fundamental Rights of the European Union as enshrined in the Treaties, in particular the freedom of expression and information, and the right to an effective remedy and to a fair trial,

HAVE ADOPTED THIS REGULATION:

Article 1

Purpose

 

The purpose of this Regulation is to contribute to the proper functioning of the internal market by laying down rules to ensure that fair contractual rights exist as regards content management and to provide independent dispute settlement mechanisms for disputes regarding content management.

Article 2

Scope of application

 

1.  This Regulation applies to content hosting platforms that host and manage content that is accessible to the public on websites or through applications in the Union, irrespective of the place of establishment or registration, or principal place of business of the content hosting platform.

2.  This Regulation does not apply to content hosting platforms that:

(a) are of a non-commercial nature; or

(b) have fewer than [100 000][14] users.

Article 3

Definitions

 

For the purposes of this Regulation, the following definitions apply:

(1)  ‘content hosting platform’ means an information society service within the meaning of point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council

OPINION OF THE COMMITTEE ON THE INTERNAL MARKET AND CONSUMER PROTECTION (9.7.2020)

B. RECOMMENDATIONS

Recommendation 1. Purpose

The proposals should aim to strengthen civil and commercial law rules applicable to commercial entities operating online with respect to digital services, including, where concrete gaps are identified following a thorough impact assessment, civil and commercial law aspects of distributed ledger technologies and, in particular, smart contracts.

The proposals should also seek to make contract terms and conditions more understandable, and give individuals an effective option to opt out of some clauses or to negotiate individual terms.

Recommendation 2. Scope

The proposals on contractual rights should only focus on civil and commercial law aspects and should not affect the E-commerce Directive. They should be consistent with the rules on advertising, set out by the Unfair Commercial Practices Directive and the rules on digital content and digital services, laid down by the Digital Content Directive.

Recommendation 3. General principles

Principle of transparency

Any terms and conditions or other clauses of use should be easily accessible and easy to understand, and clear and plain language should be used. Consumers should receive correct and adequate information about the functionalities and technical restrictions of digital content and digital services, in order to avoid incorrect and misleading advertising. If a connected product or a service depends on one or more services to function, or to function optimally, advertisers and advertising intermediaries must ensure that the consumers understand that the product or the service cannot be used without the additional service. The Commission should establish a template for a summary of the key contract terms and conditions or end-user licence agreements (EULAs) to be displayed in the beginning, in order for the consumers to be able to identify the most important points and to understand the consequences of their consent.

Principle of fairness

Any terms and conditions or other clauses of use that are not strictly essential to provide a digital service or that are not required by law should be amendable or removable before acceptance by an end-user (‘opted-out’).

Businesses should equally be able to limit some services if an individual decides to choose such ‘opt-outs’, but should not be able to deny access altogether or restrict essential elements of a digital service or a physical product linked or otherwise connected to a digital service.

Principle of legal certainty

It should be clearly established that whenever, inter alia, contract terms and conditions and smart contracts fall under the legal definition of a contract, all relevant provisions on consumer protection, set out in the Consumer Rights Directive, should apply.

It should be clarified whether informed consent can be assumed by the mere acceptance of terms and conditions, or by the mere use of a digital service, without evidence that an end-user has read such terms and conditions or other clauses of use.

Enforcement and penalties

Member States should better enforce consumers’ right to informed consent and freedom of choice when submitting data to advertisers and advertisement intermediaries. Member States should allow for consumer redress and lay down the rules on penalties applicable to infringements of rules on contractual rights and take all measures necessary to ensure that they are implemented. The penalties provided for need to be effective, proportionate and dissuasive.

INFORMATION ON ADOPTION IN COMMITTEE ASKED FOR OPINION

Date adopted: 7.7.2020

Result of final vote: +: 39, –: 1, 0: 4

Members present for the final vote

Alex Agius Saliba, Andrus Ansip, Brando Benifei, Adam Bielan, Hynek Blaško, Biljana Borzan, Vlad-Marius Botoş, Markus Buchheit, Dita Charanzová, Deirdre Clune, David Cormand, Petra De Sutter, Carlo Fidanza, Evelyne Gebhardt, Alexandra Geese, Sandro Gozi, Maria Grapini, Svenja Hahn, Virginie Joron, Eugen Jurzyca, Arba Kokalari, Marcel Kolaja, Kateřina Konečná, Andrey Kovatchev, Jean-Lin Lacapelle, Maria-Manuel Leitão-Marques, Adriana Maldonado López, Antonius Manders, Beata Mazurek, Leszek Miller, Kris Peeters, Anne-Sophie Pelletier, Christel Schaldemose, Andreas Schwab, Tomislav Sokol, Ivan Štefanec, Kim Van Sparrentak, Marion Walsmann, Marco Zullo

Substitutes present for the final vote

Pascal Arimont, Marco Campomenosi, Maria da Graça Carvalho, Edina Tóth, Stéphanie Yon-Courtin