Regulating Content in our brave new digital world

The viewing habits of consumers have shifted from traditional TV screens to portable devices, particularly among younger generations. The revised Audiovisual Media Services Directive (“the AVMSD”) emerged out of concern for minors and citizens of EU Member States who are exposed to harmful content online, including illegal content and content that contains incitement to hatred, violence and terrorism. In particular, there are concerns that harmful online content may impair the physical, mental or moral development of minors. These issues are highlighted in the UK Government’s comprehensive Online Safety White Paper, which outlines the global scale and impact of harmful content, most recently linked to a terrorist attack.

Directive (EU) 2018/1808 amends the 2010 AVMS Directive[1], bringing it into line with the changing landscape of the audiovisual sector. The AVMSD, formerly the Television Without Frontiers Directive[2] (“TVWF Directive”), established a single market framework for broadcasting. One of the core objectives of the TVWF Directive was the protection of minors, and the 2018 Directive builds on this objective. The Directive also brings video-sharing platforms (“VSPs”) into its ambit. This includes social media sites, video-sharing sites such as YouTube, pornography sites and live-streaming services.

The measures set out in the AVMSD require VSPs to moderate the content on their platforms. One of the challenges facing legislators is how to reconcile these requirements with the liability exemptions in the eCommerce Directive, such as the “mere conduit” and hosting safe harbours.

Video-sharing platforms

Recital 47 addresses the inclusion of VSPs:

“A significant share of the content provided on video-sharing platform services is not under the editorial responsibility of the video-sharing platform provider. However, those providers typically determine the organisation of the content, namely programmes, user-generated videos and audiovisual commercial communications, including by automatic means or algorithms. Therefore, those providers should be required to take appropriate measures to protect minors from content that may impair their physical, mental or moral development.”

Chapter IXa of the AVMSD deals with the provisions applicable to VSPs. Article 28a sets out the rules for determining the Member State in which a VSP is established. Where a VSP is not established in a Member State, it will be necessary to examine whether its parent undertaking, subsidiary undertaking or group is established in a Member State.

The appropriate measures that VSPs are required to implement are set out in Article 28b(3) and include the following:

  1. having a functionality for users who upload user-generated videos to declare whether such videos contain audiovisual commercial communications as far as they know or can be reasonably expected to know;
  2. establishing and operating transparent and user-friendly mechanisms for users of a video-sharing platform to report or flag to the video-sharing platform provider harmful content provided on its platform;
  3. establishing and operating systems through which video-sharing platform providers explain to users of video-sharing platforms what effect has been given to the reporting and flagging referred to above;
  4. establishing and operating age verification systems for users of video-sharing platforms with respect to content which may impair the physical, mental or moral development of minors;
  5. establishing and operating easy-to-use systems allowing users of video-sharing platforms to rate harmful content;
  6. providing for parental control systems that are under the control of the end-user with respect to content which may impair the physical, mental or moral development of minors;
  7. establishing and operating transparent, easy-to-use and effective procedures for the handling and resolution of users' complaints to the video-sharing platform provider in relation to the implementation of appropriate measures;
  8. providing for effective media literacy measures and tools, and raising users' awareness of those measures and tools.

Implementation

EU Member States have until 19 September 2020 to implement the AVMSD. The UK, Germany and Ireland have launched (and in some cases closed) public consultations in this regard.

In March 2019, the Irish Government announced that it would draft new legislation in the form of an Online Safety and Media Regulation Bill, which will also serve to transpose the revised AVMSD. There is little doubt that new legislation is necessary, but it remains to be seen how Member States will implement the AVMSD.

Google, Facebook, the Broadcasting Authority of Ireland (“the BAI”) and a number of civil liberties groups submitted proposals to the Irish Government. The BAI proposed that VSPs should be regulated by a statutory regulator. Facebook argued that it should not be responsible for content of which it is not aware and should not be obliged to monitor information which it transmits or stores.

Implementation is of particular importance to Ireland, which is home to some of Europe’s largest providers of video-sharing platform services, such as YouTube and Facebook. We understand that the draft bill is likely to envisage a separate regulator and could be published as early as next month.

[1] Directive 2010/13/EU

[2] Council Directive 89/552/EEC, as amended by Directive 97/36/EC
