Last October and November, I sketched the evolution of the government’s proposals for a digital safety commissioner. Following a consultation process last Spring, and missing the end-of-year deadline by a few weeks, they have published their proposals for the general scheme of the Online Safety and Media Regulation Bill. In its current form, important elements are dangerously vague, and there is an unpardonable oversight in the drafting – like the famous mint, there is a hole in the middle of the Bill; unlike the mint, the hole isn’t meant to be there; and I will return to this point in the last paragraph below.
The Irish proposal is the newest in a long line of recent initiatives, at state and international level, that seek to regulate online content in various ways. For example, the EU has revised the Audiovisual Media Services Directive (AVMS II) to regulate the audiovisual sector, it is promoting a code of conduct on countering illegal hate speech online, and it has controversially expanded the reach of copyright online. The UK has proposed widespread regulation of online harms, the new government has promised to develop legislation to improve internet safety for all, and a draft Online Harm Reduction Bill is to be introduced as a Private Members Bill in the House of Lords. The German Network Enforcement Act (NetzDG) came into effect on 1 January 2018, and France is also seeking a framework (pdf) to make social media platforms more accountable (many of these proposals are critiqued here (pdf)). In Australia, the Government is seeking comments on proposals for a new Online Safety Act (critiqued here).
In Ireland, under the Government’s proposals, a new Media Commission will replace the existing Broadcasting Authority of Ireland, to regulate not just the broadcasting sector but also the audiovisual sector pursuant to AVMS II, and it will have wide-ranging and robust compliance, enforcement and sanction powers (including administrative fines (pdf) [similar to those available to the Data Protection Commission] and blocking offending online services). The press release says that the “Bill provides for the appointment of an Online Safety Commissioner as part of a wider Media Commission”, to regulate “harmful online content”, described as follows (on p78 of the pdf of the Heads of the Bill):
Head 49A – Categories of harmful online content
“harmful online content” includes –
(a) material which it is a criminal offence to disseminate under Irish [or Union law],
(b) material which is likely to have the effect of intimidating, threatening, humiliating or persecuting a person to which it pertains and which a reasonable person would conclude was the intention of its dissemination,
(c) material which is likely to encourage or promote eating disorders and which a reasonable person would conclude was the intention of its dissemination, and,
(d) material which is likely to encourage or promote [self-harm or suicide] or provides instructions on how to do so and which a reasonable person would conclude was:
(i) the intention of its dissemination and
(ii) that the intention of its dissemination was not to form part of philosophical, medical and political discourse …
Explanatory note:
It is not proposed to define harmful online content. Instead it is proposed to enumerate descriptions of categories of material that are considered to be harmful online content.
I wonder whether something so vague will constitute proportionate restrictions upon the freedom of political expression protected by Article 40.6.1 of the Constitution and freedom of autonomous communication protected by Article 40.3.1 (for these terms, see here). Head 9(1) (at p22 of the pdf of the Heads of the Bill) provides that one of the objectives of the Commission is to “ensure that democratic values enshrined in the Constitution, especially those relating to rightful liberty of expression are upheld”. This is all very good, and it will help to constrain the interpretation of the Act and the actions of the Commission, but it would be better if provisions of the Bill that trenched upon constitutional values and rights were drafted with care, precision and circumspection.
For the protection of children, age-inappropriate online content is described as follows (on p81 of the pdf of the Heads of the Bill):
Head 49C – Definition of age inappropriate online content
“age inappropriate online content” means material which may be unsuitable for exposure to minors and that they should not normally see or hear and which may impair their development, taking into account the best interests of minors, their evolving capacities and their full array of rights, and includes:
(a) material containing or comprising gross or gratuitous violence,
(b) material containing or comprising cruelty, including mutilation and torture, towards humans or animals, and,
(c) material containing or comprising pornography.
Explanatory Note:
It is considered that there are a number of categories of material that may not be necessarily harmful but are likely inappropriate for a minor to be exposed to. A definition of inappropriate online content is included in this head on that basis to facilitate the regulator issuing online safety guidance materials, as provided for in Head 51, in relation to content rating and age-gating.
As to that “content rating and age-gating”, Head 51 provides as follows (on p85 of the pdf of the Heads of the Bill):
Head 51 – Online safety guidance materials
(1) The Media Commission may issue guidance materials in matters relevant to harmful online content and inappropriate online content.
(2) relevant and designated online services shall have regard to these guidance materials in their operations as appropriate.
(3) in preparing guidance materials the Media Commission shall have regard to, [amongst other relevant issues], each of the following matters: …
(h) the nature and prevalence of harmful online content and age inappropriate online content,
(i) the protection of minors and the general public from harmful online content and age inappropriate online content,
(j) the risk posed by harmful online content or age inappropriate online content to the users of relevant online services whereon it may be disseminated,
(k) the likelihood of users of relevant online services being unintentionally exposed, by their own actions, to harmful online content or age inappropriate online content,
(l) the impact that the nature and prevalence of harmful online content and age inappropriate online content may have on users of relevant online services, minors and the general public, …
These provisions cover material already addressed by the back-bench Children’s Digital Protection Bill 2018, and probably therefore supersede it. But they also raise the same vagueness concerns as Head 49A (above).
Despite the amount of time that this process has taken, these Heads seem rushed, vague, and incomplete. For example, although the supporting documentation says several times that the proposed Bill will establish an Online Safety Commissioner as part of a wider Media Commission, and that one of the appointees to the Commission will serve as the Online Safety Commissioner, the Bill itself doesn’t actually say so. The only express reference to the “Online Safety Commissioner” in those terms is in Head 49B(1) (on p79 of the pdf of the Heads of the Bill), which provides that, to supplement the definition of harmful online content in Head 49A (above), the “Online Safety Commissioner may bring proposals to include or exclude further categories of material from the definition of harmful online content to the Commission”. Head 19 (from p52 of the pdf of the Heads of the Bill) provides for membership of the Media Commission; and, if one of the members is to serve as the Online Safety Commissioner, that Head would be the obvious place to make such provision. But no such provision appears there, or anywhere else, in the draft Bill. Head 9(3) (at p23 of the pdf of the Heads of the Bill) provides that one of the objectives of the Commission is to “ensure that appropriate regulatory arrangements and systems are in place to address, where appropriate, illegal and harmful online, sound and audio-visual content”, but that objective is not expressly assigned to any particular Commissioner, such as an Online Safety Commissioner. This was done expressly, and in a much more straightforward fashion, in the back-bench Digital Safety Commissioner Bill 2017, building on the Law Reform Commission Report on Harmful Communications and Digital Safety (2016) (pdf), and that approach might with profit have been followed in the Government’s draft Bill. Given that the Online Safety Commissioner is the whole raison d’être of the Bill, the failure to establish the office properly is a shocking omission. It is to be hoped that the drafting process in the Office of the Parliamentary Counsel to the Government will bring greater precision to the text of the Bill, that pre-legislative scrutiny will bring similar precision to the policy underpinning it, and that these processes will be informed not only by developments elsewhere but also by critiques of those proposals.
Hi Eoin
Quick question. I have been through the Online Safety and Media Regulation Bill and it struck me that an unintended consequence of the Bill might be that it impacts on online journalism: an article in the print edition of a newspaper would be fine, but the online version could fall foul of the Act. Could this also impact on videos, blogs and other online content from newspapers? Michael