{"id":3741,"date":"2021-01-23T18:51:54","date_gmt":"2021-01-23T18:51:54","guid":{"rendered":"https:\/\/2024.thenexus.today\/index.php\/2021\/01\/23\/a-good-hearing-on-automated-decision-systems-sb5116\/"},"modified":"2021-01-23T18:51:54","modified_gmt":"2021-01-23T18:51:54","slug":"a-good-hearing-on-automated-decision-systems-sb5116","status":"publish","type":"post","link":"https:\/\/2024.thenexus.today\/index.php\/2021\/01\/23\/a-good-hearing-on-automated-decision-systems-sb5116\/","title":{"rendered":"A good hearing on Automated Decision Systems, a bad privacy bill advances: Washington state legislation update"},"content":{"rendered":"<figure class=\"kg-card kg-image-card\"><img decoding=\"async\" src=\"https:\/\/jedii.tech\/wp-content\/uploads\/2021\/01\/tech-equity-coalition-1.png\" class=\"kg-image\" alt=\"Tech E\" loading=\"lazy\" width=\"898\" height=\"586\" srcset=\"https:\/\/jedii.tech\/wp-content\/uploads\/size\/w600\/2021\/01\/tech-equity-coalition-1.png 600w, https:\/\/jedii.tech\/wp-content\/uploads\/2021\/01\/tech-equity-coalition-1.png 898w\" sizes=\"auto, (min-width: 720px) 720px\"><\/figure>\n<p>Last week&#8217;s<strong> <a href=\"__GHOST_URL__\/wa-privacy-legislation-2021\/\">The stakes are high: Washington state privacy, facial recognition, and automated decision making legislation 2021<\/a><\/strong> discussed several bills backed by the <a href=\"https:\/\/www.aclu-wa.org\/pages\/tech-equity-coalition\"><strong>Tech Equity Coalition<\/strong><\/a>, a group of civil liberties and civil rights-focused organizations and individuals working to hold technology companies accountable \u2013 as well as one bill we oppose.<\/p>\n<p>The Washington legislative season&#8217;s in high gear, so this week there are significant updates on two of the bills:<\/p>\n<ul>\n<li>SB 5116, Sen. 
Bob Hasegawa\u2019s bill regulating government use of automated decision making systems, had a hearing at the State Government and Elections committee featuring a lot of great testimony in support of the legislation.<\/li>\n<li>SB 5062, the Bad Washington Privacy Act, got a \u201cdo pass\u201d recommendation from the Senate Environment, Energy, and Technology (EET) committee. Since Sen. Carlyle is the EET chair, this wasn&#8217;t a particular surprise \u2013 even though there was a lot of very negative feedback about SB 5062 from Tech Equity Coalition members and others in its hearing last week.<\/li>\n<\/ul>\n<p>Rounding out the list, Rep. Shelley Kloba is expected to introduce the People&#8217;s Privacy Act in the House this week. \u00a0And SB 5104, the facial recognition moratorium (also sponsored by Sen. Hasegawa), is still waiting for a hearing in the EET committee.<\/p>\n<p>There&#8217;s a lot more to say about all of this legislation. \u00a0 I&#8217;ll return to SB 5062, the Bad Washington Privacy Act, in a separate post. \u00a0Read on for a deeper dive into SB 5116. \u00a0<\/p>\n<p>First, though, here&#8217;s a video with some important context. \u00a0 <strong><a href=\"https:\/\/www.youtube.com\/watch?v=vDtOxrV9Bqc&amp;feature=youtu.be\">The Fight for the Future: Organizing The Tech Industry<\/a><\/strong>, with Dr. Timnit Gebru of Black in AI, Dr. Alex Hanna from the Ethical AI team at Google, Charlton McIlwain of NYU and the Center for Critical Race and Digital Studies, Dr. Safiya Umoja Noble of UCLA and the <a href=\"https:\/\/www.c2i2.ucla.edu\/\" rel=\"nofollow noopener noreferrer\">Center for Critical Internet Inquiry (C2i2)<\/a>, labor organizer Adrienne Williams, and Meredith Whittaker of NYU and AI Now. \u00a0Once again, this isn&#8217;t directly about the legislation but is very important recent context for the wave of tech worker activism. 
<\/p>\n<figure class=\"kg-card kg-embed-card\"><iframe loading=\"lazy\" width=\"356\" height=\"200\" src=\"https:\/\/www.youtube.com\/embed\/vDtOxrV9Bqc?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/figure>\n<p>From the description:<\/p>\n<blockquote><p>In December of 2020 Google fired Timnit Gebru, the co-lead of their Ethical Artificial Intelligence Team, after she refused to accept their attempted censorship of her co-authored article questioning the ethics and environmental impact of large-scale AI language models. The termination sparked a new wave of organizing among Tech workers who quickly mobilized to defend Gebru against the corporate giant\u2019s efforts to silence criticism of a key part of their business model. This organizing\u2014following on the heels of the walk-outs against defense contracts and preceding this month\u2019s announcement that Google workers have formed a union\u2014offers important lessons about workers\u2019 power within one of capitalism\u2019s most profitable and important sectors.<\/p><\/blockquote>\n<p>And now, back to the legislative update.<\/p>\n<h2 id=\"sb-5116-automated-decision-systems\">SB 5116: Automated Decision Systems<\/h2>\n<blockquote><p>&#8220;Automated decision system&#8221; means any electronic software, system, or process designed to automate, aid, or replace a decision-making process that impacts the welfare or rights of any Washington resident, and that would otherwise be performed by humans.<\/p>\n<p>\u2013 <a href=\"http:\/\/lawfilesext.leg.wa.gov\/biennium\/2021-22\/Pdf\/Bills\/Senate%20Bills\/5116.pdf\">SB 5116, Accountability and Transparency Standards for Automated Decision Systems<\/a><\/p><\/blockquote>\n<p>Automated decisions fill an increasingly important role in our society. 
\u00a0Unfortunately, many of these systems embed biases against Black and Indigenous people, trans and non-binary people, disabled people, non-native English speakers, people of color, and many other groups.<\/p>\n<p><a href=\"https:\/\/www.nytimes.com\/2020\/12\/29\/technology\/facial-recognition-misidentify-jail.html\">What happened to Nijeer Parks<\/a> is a clear example of the harms automated decision systems can cause. \u00a0First, he was arrested for a crime he didn\u2019t commit &#8212; because an automated facial recognition decision system misidentified him. \u00a0Then, he was jailed because New Jersey\u2019s no-bail automated decision system decided he was a risk.<\/p>\n<p>SB 5116 sets standards for fairness and accountability, transparency requirements, and a prohibition on government agencies developing or using automated decision systems to discriminate against people. \u00a0It\u2019s an important step forward against these kinds of abuses. \u00a0 I\u2019m delighted that the legislature is considering it \u2014 and I\u2019m thrilled that Sen. Patty Kuderer, who represents my district, is one of the co-sponsors.<\/p>\n<h2 id=\"some-great-testimony-\">Some great testimony!<\/h2>\n<p>SB 5116 shared its hearing with several other bills, some of which had hundreds of people signed up to testify, so committee chair Sen. Sam Hunt limited oral testimony to six people, for one minute each. \u00a0The good news is that this meant the section wrapped up in time for us to watch Lady Gaga at the Inauguration!<\/p>\n<p>The <a href=\"http:\/\/lawfilesext.leg.wa.gov\/biennium\/2021-22\/Pdf\/Bill Reports\/Senate\/5116 SBR SGE TA 21.pdf?q=20210123003038\">Bill Report<\/a> includes a short summary of the very positive testimony at the hearing (as well as background and a very readable description of the bill). 
\u00a0If you\u2019ve got about 15 minutes, <a href=\"https:\/\/www.tvw.org\/watch\/?eventID=2021011342\">check out the video of the hearing<\/a> \u2013 SB 5116 is the first topic. \u00a0Here&#8217;s my somewhat-longer summary:<\/p>\n<ul>\n<li>Jay Cunningham, a PhD student and ethical AI researcher at the University of Washington, made an especially powerful case, and had the best soundbite of the hearing: &#8220;What we need in AI is less artificial and more human.\u201d<\/li>\n<li>Jennifer Lee of ACLU-Washington cited the <a href=\"https:\/\/www.theverge.com\/2018\/3\/21\/17144260\/healthcare-medicaid-algorithm-arkansas-cerebral-palsy\">Arkansas healthcare algorithm that cut benefits to disabled people<\/a>.<\/li>\n<li>Hillary Haden of the Washington Fair Trade Coalition noted this would be the first bill of its kind and would set a sound standard for transparency and fairness \u2013 especially important since Washington is justifiably looked to as a tech leader.<\/li>\n<li>Ben Winters of the Electronic Privacy Information Center brought national perspectives \u2013 and emphasized the timeliness of the issue.<\/li>\n<li>Witnesses from the Washington Association of Sheriffs and Police Chiefs and the Internet Association-Washington agreed with the goals of the legislation and the need for it, although they also wanted to make sure that there are no unintended consequences for systems that are in standard use today.<\/li>\n<\/ul>\n<p>Several written testimonies were also submitted in support of the bill. \u00a0A few highlights:<\/p>\n<ul>\n<li>Alka Roy, founder of the <a href=\"https:\/\/responsibleproject.com\/\">Responsible Innovation Project<\/a> and a technology and AI expert, noted that bills like these are needed &#8220;to redefine innovation to include responsibility and accountability&#8221; \u2013 and highlighted the importance of independent and periodic audits, since automated systems continue to be trained on new data and evolve.
<\/li>\n<li>Shasta Willson, a software engineer who&#8217;s spent over two decades working in the industry, discussed how unless care is taken to avoid bias, &#8220;these systems simply reproduce the disparities they would most idealistically remove.&#8221;<\/li>\n<li>I noted that the requirements for an algorithmic accountability report should be straightforward for any responsible systems vendor to supply. \u00a0Conversely, if a vendor doesn\u2019t have this information available, &#8220;it\u2019s a sign that they are not applying industry-standard best practices, and so their system is very likely to have significant biases.&#8221;<\/li>\n<\/ul>\n<p>I\u2019m not sure where the written testimony is available on the legislature\u2019s site \u2014 I\u2019ll update this page with more testimony (and hopefully a link to a repository somewhere) as I find it.<\/p>\n<h2 id=\"there-s-an-opportunity-here-\">There&#8217;s an opportunity here!<\/h2>\n<p>I agree with pretty much everything the people I&#8217;ve quoted above have brought up &#8230; and one aspect I especially want to highlight is the timeliness of this legislation. \u00a0 Stories like Nijeer Parks&#8217;, or the Stanford algorithm that didn\u2019t allocate Covid vaccines to front-line workers, are increasing awareness of the risks and harms of automated decision systems. \u00a0At the same time, there&#8217;s the broader context of calls for accountability, ethics, and justice for tech companies and in the AI community \u2013 and the wave of tech worker activism.<\/p>\n<p>Washington is well-positioned here, with a lot of great AI researchers and allies who have been pushing hard for these principles of accountability, fairness, transparency, and justice within the AI community. 
\u00a0It&#8217;s also an advantage that many legislators are already familiar with these issues, from last year&#8217;s AI Profiling bill and multiple years of discussion of various facial recognition bills.<\/p>\n<p>So as I said in <a href=\"__GHOST_URL__\/wa-privacy-legislation-2021\/\">The stakes are high<\/a>, there&#8217;s a real chance for Washington to be a leader here. \u00a0<\/p>\n<p>Then again, it&#8217;s early days, so we shall see. \u00a0 Stay tuned for more!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Last week&#8217;s The stakes are high: Washington state privacy, facial recognition, and automated decision making legislation 2021 discussed several bills backed by the Tech Equity Coalition, a group of civil liberties and civil rights-focused organizations and individuals working to hold technology companies accountable \u2013 as well as one bill we oppose. The Washington legislative season&#8217;s [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-3741","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/posts\/3741","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/comments?post=3741"}],"version-history":[{"count":0,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/posts\/3741\/revisions"}],"wp:attachment":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/media?pare
nt=3741"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/categories?post=3741"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/tags?post=3741"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}