{"id":189,"date":"2022-04-27T19:18:01","date_gmt":"2022-04-27T19:18:01","guid":{"rendered":"https:\/\/edmohub.ie\/?p=189"},"modified":"2023-03-10T19:22:22","modified_gmt":"2023-03-10T19:22:22","slug":"regulating-harmful-content-the-narrow-scope-of-current-approaches","status":"publish","type":"post","link":"https:\/\/edmohub.ie\/index.php\/regulating-harmful-content-the-narrow-scope-of-current-approaches\/","title":{"rendered":"Regulating Harmful Content: the narrow scope of current approaches"},"content":{"rendered":"\n<p><em>Whether it be the Irish, British or Australian foray into Online Harms protection, the approach remains the same: the public good is considered only in the aftermath of harm already experienced, as a correction to failure.&nbsp;<\/em><\/p>\n\n\n\n<p>The recent sharp focus on damage to users caused by harmful content has seen a plethora of debates leading to new regulatory proposals, including Ireland\u2019s <a href=\"https:\/\/www.gov.ie\/en\/publication\/d8e4c-online-safety-and-media-regulation-bill\/\">Online Safety and Media Regulation Bill<\/a> (OSMR). The aim of such regulatory mechanisms is to tackle content which causes harm to users but is not in itself illegal. In effect, policymakers and regulators must define what amounts to harmful but legal or <a href=\"https:\/\/www.techpolicypodcast.org\/lawful-but-awful-the-complexities-of-online-content-moderation-with-elizabeth-banker-ep-254\/\">\u201clawful but awful\u201d<\/a> content. This creates a rather large grey area regarding what is harmful outside the realm of illegal content. Given the novelty of the concept, very few comparable regulatory mechanisms exist in practice. It is instructive, then, to compare the Irish approach with those taken in Australia and the UK, and with the EU\u2019s upcoming Digital Services Act (DSA). 
Although these regulations raise important issues regarding freedom of expression and the potential to incentivise overblocking, those are not considered here.&nbsp;<\/p>\n\n\n\n<p><strong>Defining Harm in Ireland\u2019s OSMR Bill<\/strong><\/p>\n\n\n\n<p>The Irish government first published its <a href=\"http:\/\/www.cearta.ie\/2020\/01\/the-governments-proposed-online-safety-and-media-regulation-bill-has-a-surprising-omission\/\">\u201cdangerously vague\u201d<\/a> proposal for tackling online harms in 2019. Since then, there have been rounds of submissions and consultations, including <a href=\"https:\/\/fujomedia.eu\/fujo-abc-submission-on-the-online-safety-and-media-regulation-bill\/\">a joint submission by the DCU FuJo Institute and the DCU Anti-Bullying Centre.<\/a> In its current form, the OSMR Bill attempts to define specific kinds of harmful content while implementing a framework to identify other kinds of harmful content in the future. It defines \u201charmful online content\u201d summarily as: cyber-bullying; content which promotes eating disorders; and content which encourages (or details methods of) self-harm or suicide.&nbsp;<\/p>\n\n\n\n<p>Attempting to define all forms of harmful content within one act would be a mammoth task. This bill largely sidesteps that challenge by singling out harms which lend themselves to a very particular kind of moral panic. Few will find fault with the aim of reducing cyber-bullying, eating disorders, or self-harm. However, while attempting to curb \u201ccontent promoting eating disorders\u201d may be a noble aim, it may not be realistic in practice. Such a broad definition is open to subjectivity and interpretation. 
Consider whether any of the following could be included: a webpage titled \u201chow do I make myself lose weight fast?\u201d; content which causes feelings of body dysmorphia; or content that triggers such feelings unintentionally, such as fitness pages on Instagram.&nbsp;<\/p>\n\n\n\n<p>Instagram is a notable concern given revelations about the <a href=\"https:\/\/www.theguardian.com\/technology\/2021\/sep\/14\/facebook-aware-instagram-harmful-effect-teenage-girls-leak-reveals\">company\u2019s internal research on harm<\/a> to teenage girls. These studies found that Instagram worsens body-image issues for \u201cone in three teen girls\u201d; that \u201cteens blame Instagram for increases in the rate of anxiety and depression\u201d; and that 13% of British users \u201c<a href=\"https:\/\/www.wsj.com\/articles\/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739\">traced the desire to kill themselves to Instagram<\/a>\u201d. A liberal interpretation of the OSMR Bill would lend support to an argument for banning access to much Instagram content.&nbsp;<\/p>\n\n\n\n<p>In response to these issues, various stakeholders have offered their own definitions of &#8216;online harms&#8217;, in many cases to a higher standard than that found in proposed bills. One succinct attempt comes from <a href=\"https:\/\/digitalaction.co\/\">Digital Action<\/a>\u2019s online harms taxonomy, which details 41 rights that can be infringed by harmful content online. This taxonomy does not attempt to distinguish between legal and illegal content and instead focuses on the rights being infringed, the impact, and examples of the harms. 
This taxonomy is not without flaws, but it does highlight the failure of the Irish OSMR Bill to create a succinct, well-defined and effective scope detailing the extent of online harms.&nbsp;<\/p>\n\n\n\n<p><strong>Australia\u2019s Online Safety Approach<\/strong><\/p>\n\n\n\n<p>Australia\u2019s Online Safety Act and eSafety Commissioner have received much attention. The Irish focus on defining specific types of harmful content is similar to that taken in Australia in recent years. Much of this work was initiated in response to two issues: terrorism and domestic violence. A bill in response to the 2019 Christchurch attack targeted <a href=\"https:\/\/www.aph.gov.au\/About_Parliament\/Parliamentary_Departments\/Parliamentary_Library\/pubs\/BriefingBook46p\/Cybersafety\">\u201cabhorrent violent material\u201d and \u201ctechnology-facilitated domestic violence<\/a>\u201d. For the most part, however, the Australian approach has focused on ensuring that \u201cstandards of behaviour online\u2026 reflect the standards that apply offline\u201d. Rather than looking at harmful but legal content, as is seen in Ireland, Australia\u2019s \u2018online harms\u2019 regulation has shied away from directly addressing the grey area which Ireland\u2019s bill attempts to shine a light on.<\/p>\n\n\n\n<p><strong>The UK\u2019s Online Harms Bill<\/strong><\/p>\n\n\n\n<p>Initially, the UK\u2019s \u2018<a href=\"https:\/\/www.gov.uk\/government\/consultations\/online-harms-white-paper\/online-harms-white-paper#the-harms-in-scope\">Online Harms White Paper<\/a>\u2019 focused almost entirely on access to illegal content or crimes which are committed through internet use. 
The only attention to harmful but legal content came in cases of \u201cunderage exposure to legal content\u201d, which covers children accessing pornography or \u201cinappropriate material\u201d.&nbsp;<\/p>\n\n\n\n<p>A different approach was taken in the \u2018<a href=\"https:\/\/bills.parliament.uk\/bills\/3137\">UK Online Safety Bill<\/a>\u2019, where a broad definition was used in targeting content which held \u201cmaterial risk\u201d of \u201ca significant adverse impact\u201d on users. This broad definition is refined in terms of the risk that the content\u2019s dissemination would have a \u201csignificant adverse physical or psychological impact on an adult or child of ordinary sensibilities\u201d, and it takes into account the \u201cnumber of users that may\u2026 have encountered the content\u201d and \u201chow easily, quickly and widely the content may be disseminated by the service\u201d. This definition allows for a much broader scope than its Irish counterpart, but it is perhaps so broad that it fails to give clarity as to exactly what sort of content ought to be targeted, and so fails to meaningfully narrow the grey area which exists in defining such content. Unsurprisingly, there are <a href=\"https:\/\/medium.com\/wikimedia-policy\/early-impressions-of-the-uk-online-safety-bill-72ae8b1aedbc\">wide-ranging criticisms<\/a> of the UK Bill, which was introduced on March 17th.&nbsp;<\/p>\n\n\n\n<p><strong>Addressing the source of harms<\/strong><\/p>\n\n\n\n<p>Regulatory mechanisms to curb online harms attempt to put out fires caused by social media without ever meaningfully attempting to address the source of such harms. As many others have argued, there is a policy failure to adequately understand the issues at hand. 
This failure has a number of overlapping factors, including a lack of understanding of internet structures and online markets; regulatory capture by tech lobby groups; and an ideological belief that innovation and the public interest are one and the same. If we are to meaningfully address harm to users, we must identify the root causes rather than ask platforms to cooperate in firefighting missions.<\/p>\n\n\n\n<p>There are pronounced inequalities in the online environment, which call attention to the power structures of the internet, the dominance of a limited group of platforms, and the impact that their decisions have on users and the public interest. To address this, a common understanding must be reached on what the public interest is in the context of digital technologies and how it should or could be served. Notably, the OSMR is also about media regulation, but there is no vision for what media (traditional and digital) should look like in the 21st century. Unless policymakers take on these big conversations, it is difficult to see how the dominance of Big Tech will be challenged in anything but superficial ways.<\/p>\n\n\n\n<p>Addressing the source of online harm requires an acute understanding of how platforms create and extract wealth, and a broader vision from policymakers on how they can positively shape and build a digital economy and society which prioritises user safety. The conflation of innovation and technological advancement &#8211; with user safety and rights as a price to pay for such technology &#8211; risks regulators paying for such innovation without an understanding of the consequences. 
Whether it be the Irish, British or Australian foray into Online Harms protection, the approach remains the same: the public good is considered only in the aftermath of harm already experienced, as a correction to failure.&nbsp;<\/p>\n\n\n\n<p>Regulators and policymakers must be bold and challenge the current internet landscape by making the public good an objective in and of itself, by attempting to create proactive protections for user rights, and by curbing existing and growing violations. Without an understanding and analysis of the wider implications of Big Tech, any policy which is produced is at best reactive and limited and at <a href=\"https:\/\/t.co\/Zwkl30pTpq\">worst misguided and may potentially exacerbate the harm it aims to reduce<\/a>. The platforms that large online corporations offer users embody design choices which are neither incidental nor neutral, and neither is their impact on users. Harmful content which exists on platforms would not exist without those platforms, and would not exist to the degree that it does without the design choices of such platforms allowing it to do so. Moreover, these platforms <a href=\"https:\/\/books.google.ie\/books?id=gNFAEAAAQBAJ&amp;pg=PA93&amp;lpg=PA93&amp;dq=obliging+platforms+to+accept+a+duty+of+care+lorna+woods&amp;source=bl&amp;ots=OhMTT-oP8E&amp;sig=ACfU3U1KUbfIdeNOq4gxoy6rnAQeEsR0Xw&amp;hl=en&amp;sa=X&amp;ved=2ahUKEwjBv5D6oaH3AhUUXMAKHSkxCoUQ6AF6BAgbEAM#v=onepage&amp;q=obliging%20platforms%20to%20accept%20a%20duty%20of%20care%20lorna%20woods&amp;f=false\">exploit users<\/a> through nudging, coercing and manipulating biases and interests in order to maximise revenue. 
Regulators respond to harmful but legal content, like other modern issues, as if it were a fire, but it is more akin to arson: online platforms ignore the realities of such content even when it is proven that they know the detrimental impact it can have.&nbsp;<\/p>\n\n\n\n<p>The <a href=\"https:\/\/www.europarl.europa.eu\/news\/en\/press-room\/20220114IPR21017\/digital-services-act-regulating-platforms-for-a-safer-online-space-for-users\">Digital Services Act<\/a> offers some potential respite in its claims that it will \u201cenhance the accountability and transparency of algorithms and deal with content moderation\u201d, as well as introducing obligations to assess and mitigate risks, including stated aims to give users choice over profiling and input into algorithmic processing by platforms. Despite this, we can see the remnants of a traditional interpretation of the public good, as user rights obligations are set aside for \u201cmicro and small enterprises\u201d in the interests of fostering innovation, despite the relative ease with which online platforms can grow exponentially in a short space of time.<\/p>\n\n\n\n<p><em>Cian McGrath is a PhD candidate in the DCU School of Communications and a member of the FuJo Institute.&nbsp;<\/em><\/p>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Whether it be the Irish, British or Australian foray into Online Harms protection, the approach remains the same: the public good is considered only in the aftermath of harm already experienced, as a correction to failure.&nbsp; The recent sharp focus on damage to users caused by harmful content has seen a plethora of debates leading to new &#8230; <a href=\"https:\/\/edmohub.ie\/index.php\/regulating-harmful-content-the-narrow-scope-of-current-approaches\/\" class=\"more-link\">Read More<span class=\"screen-reader-text\"> &#8220;Regulating Harmful Content: the narrow scope of current 
approaches&#8221;<\/span> &raquo;<\/a><\/p>\n","protected":false},"author":7,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_eb_attr":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"twitterCardType":"","cardImageID":0,"cardImage":"","cardTitle":"","cardDesc":"","cardImageAlt":"","cardPlayer":"","cardPlayerWidth":0,"cardPlayerHeight":0,"cardPlayerStream":"","cardPlayerCodec":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[11],"tags":[],"class_list":["post-189","post","type-post","status-publish","format-standard","hentry","category-news"],"acf":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/posts\/189","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/users\/7"}],"replies":[{"embeddable":true,"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/comments?post=189"}],"version-history":[{"count":2,"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/posts\/189\/revisions"}],"predecessor-version":[{"id":192,"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/posts\/189\/revisions\/192"}],"wp:attachment":[{"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/media?parent=189"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/categories?post=189"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/edmohub.ie\/index.php\/wp-json\/wp\/v2\/tags?post=189"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","tem
plated":true}]}}