{"id":3663,"date":"2016-06-18T10:27:48","date_gmt":"2016-06-18T17:27:48","guid":{"rendered":"http:\/\/www.talesfromthe.net\/jon\/?p=3663"},"modified":"2016-06-18T10:27:48","modified_gmt":"2016-06-18T17:27:48","slug":"sarah-jeong-on-online-trolling-and-harassment","status":"publish","type":"post","link":"https:\/\/2024.thenexus.today\/index.php\/2016\/06\/18\/sarah-jeong-on-online-trolling-and-harassment\/","title":{"rendered":"Sarah Jeong on Online Trolling and Harassment"},"content":{"rendered":"<p><a href=\"http:\/\/arstechnica.com\/tech-policy\/2016\/06\/what-if-we-treated-online-harassment-the-same-way-we-treat-spam\/\"><img loading=\"lazy\" decoding=\"async\" class=\"alignright\" src=\"https:\/\/pbs.twimg.com\/profile_images\/2215576731\/ars-logo_400x400.png\" alt=\"Ars Technica logo\" width=\"164\" height=\"164\" \/><\/a>What to do on a Wednesday night in San Francisco? BART over to the Longitude tiki bar in Oakland, of course, for a discussion with Ars editors Annalee Newitz and Cyrus Farivar and journalist Sarah Jeong (author of \u201cThe Internet of Garbage\u201d) about <a href=\"http:\/\/arstechnica.com\/tech-policy\/2016\/06\/what-if-we-treated-online-harassment-the-same-way-we-treat-spam\/\">online trolling and harassment<\/a>! I had just been working on a wiki page with a handful of links about muting, blocking, and tools for people to protect themselves online as reference material for next week\u2019s presentation on <a href=\"http:\/\/opensourcebridge.org\/sessions\/1790\">Supporting diversity with a new approach to software<\/a>, so the lively discussion was particularly timely \ud83d\ude42<\/p>\n<p><!--more-->The discussion covered a lot of ground, and it\u2019s certainly worth checking out. <a href=\"http:\/\/arstechnica.com\/tech-policy\/2016\/06\/what-if-we-treated-online-harassment-the-same-way-we-treat-spam\/\">Here&#8217;s the summary on Ars&#8217; site<\/a>, 
and I&#8217;ve got the full video below. In this post I\u2019ll focus on technical approaches that can help with harassment \u2014 although as Sarah pointed out, none of these are full \u201csolutions\u201d. Still, incremental steps help, and combining enough of them could make a real impact.<\/p>\n<p>Sarah gave a couple of examples of technologies that help: shared block lists on Twitter, and Riot Games\u2019 use of impromptu \u201cjuries\u201d of other players when somebody is reported for violating League of Legends\u2019 code of conduct. Both of these innovations are forms of crowdsourced moderation: rather than having the site watch and control everything, they distribute the responsibility (and the workload). And both have had a positive, albeit limited, impact.<\/p>\n<p>Shared block lists on Twitter are particularly interesting in that the initial implementations (BlockBot, flamin.ga, and Block Together) came from the community, rather than Twitter. Harassment and abuse have been a problem on Twitter for quite a while, and the company\u2019s attempts to deal with them have often missed the mark \u2014 see for example Leigh Honeywell\u2019s <a href=\"https:\/\/modelviewculture.com\/pieces\/another-six-weeks-muting-vs-blocking-and-the-wolf-whistles-of-the-internet\">Another Six Weeks: Muting vs. 
Blocking and the Wolf Whistles of the Internet<\/a>, describing Twitter\u2019s disastrously bad first attempt at implementing muting. So when it came to shared block lists, you\u2019d think that Twitter might have worked with the implementers from the community to learn from them. Instead, though, Twitter appears to have ignored them. Unsurprisingly, Twitter\u2019s own implementation of shared block lists is not particularly useful.<\/p>\n<p>Of course, Twitter\u2019s not the only company that hasn\u2019t paid attention to abuse until it became a problem. Google+ and Diaspora* both launched without any muting and blocking functionality. Microsoft didn\u2019t even put the simplest countermeasures in place for their Tay bot. Storify didn\u2019t consider how their notification functionality could be turned into a vector for harassment. The list goes on \u2026<\/p>\n<p>In fact, I can\u2019t think of any social software that\u2019s started by prioritizing giving people the tools to protect themselves from harassment and abuse.<\/p>\n<p>And it\u2019s probably not a coincidence that most targets of online harassment are women, Black and Latinx people, transgender people, and other marginalized groups \u2014 while the people creating (and funding) the software we use are disproportionately cis white and Asian guys. If the industry devoted even a fraction as much effort to this as to ad targeting, we\u2019d no doubt have made a lot more progress.<\/p>\n<p>Yes, it\u2019s a hard problem. Still, there are plenty of approaches that can help &#8211; including crowdsourced moderation (where the state of the art has regressed since Slashdot\u2019s work almost 20 years ago) and the techniques 
that the Coral project is experimenting with. So at some point, somebody\u2019s going to have the \u201caha!\u201d moment that there\u2019s a huge opportunity to do better \u2026<\/p>\n<p>I can\u2019t wait!<br \/>\n<script src=\"\/\/player.cnevids.com\/embedjs\/5511d70861646d2d07060000\/video\/5769ed5533948c057100003d.js\" async=\"\"><\/script><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Notes from an Ars Technica Live discussion with Annalee Newitz and Cyrus Farivar<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[13,16],"tags":[48,174,231,372],"class_list":["post-3663","post","type-post","status-publish","format-standard","hentry","category-social-computing","category-tales-from-the-net","tag-blocking","tag-harassment","tag-muting","tag-trolls"],"_links":{"self":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/posts\/3663","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/comments?post=3663"}],"version-history":[{"count":0,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/posts\/3663\/revisions"}],"wp:attachment":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/media?parent=3663"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/categories?post=3663"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/2024.thenexus.today\/index
.php\/wp-json\/wp\/v2\/tags?post=3663"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}