{"id":3738,"date":"2021-01-16T22:48:25","date_gmt":"2021-01-16T22:48:25","guid":{"rendered":"https:\/\/2024.thenexus.today\/index.php\/2021\/01\/16\/wa-privacy-legislation-2021\/"},"modified":"2021-01-16T22:48:25","modified_gmt":"2021-01-16T22:48:25","slug":"wa-privacy-legislation-2021","status":"publish","type":"post","link":"https:\/\/2024.thenexus.today\/index.php\/2021\/01\/16\/wa-privacy-legislation-2021\/","title":{"rendered":"The stakes are high: Washington state privacy, facial recognition, and automated decision systems legislation 2021"},"content":{"rendered":"<figure class=\"kg-card kg-image-card\"><img decoding=\"async\" src=\"https:\/\/jedii.tech\/wp-content\/uploads\/2021\/01\/BLDC_LegislationCoronaVirus_03182020.jpg\" class=\"kg-image\" alt=\"The word &quot;legislation,&quot; all in caps, with a notebook, graph, pen, and reading glasses in the background.\" loading=\"lazy\" width=\"1400\" height=\"750\" srcset=\"https:\/\/jedii.tech\/wp-content\/uploads\/size\/w600\/2021\/01\/BLDC_LegislationCoronaVirus_03182020.jpg 600w, https:\/\/jedii.tech\/wp-content\/uploads\/size\/w1000\/2021\/01\/BLDC_LegislationCoronaVirus_03182020.jpg 1000w, https:\/\/jedii.tech\/wp-content\/uploads\/2021\/01\/BLDC_LegislationCoronaVirus_03182020.jpg 1400w\" sizes=\"auto, (min-width: 720px) 720px\"><\/figure>\n<p><em>Most recent update: January 30.<\/em><\/p>\n<p>Washington is justifiably looked to as a tech-savvy state, and legislation that gets passed here will influence other states and the national debate. \u00a0 It&#8217;s a challenging time to be doing state legislative work, but the mostly-remote session kicked off as scheduled on January 11 and things are off to a fast start. <\/p>\n<p>Some really exciting tech bills, with a strong focus on equity and justice, really do give Washington a chance for leadership. \u00a0On the other hand, the last two legislative sessions have started off with high hopes for passing a privacy bill &#8230; and it didn&#8217;t happen. 
\u00a0So the stakes are high!<\/p>\n<p>Here are three important bills that can establish Washington as a leader.<\/p>\n<ul>\n<li><strong>SB 5104<\/strong>, a moratorium on government use of facial recognition<\/li>\n<li><strong>SB 5116<\/strong>, requiring accountability and transparency for automated decision systems<\/li>\n<li><strong>The People&#8217;s Privacy Act<\/strong>, a strong consent-based privacy bill<\/li>\n<\/ul>\n<p>The Tech Equity Coalition supports SB 5104 and SB 5116 as well as \u00a0the People&#8217;s Privacy Act. \u00a0 One thing I want to highlight is the connection between the issues the three bills cover. \u00a0 As AI accountability expert Deborah Raji <a href=\"https:\/\/twitter.com\/rajiinio\/status\/1351930368950153216\">says<\/a>:<\/p>\n<blockquote><p>A reminder that all of machine learning, not just facial recognition, encourages the proliferation of surveillance infrastructure of all sorts. Our hunger to hoard &amp; capture data of all forms, whether through web cookies or a camera, is tied to an increasing data requirement.<\/p><\/blockquote>\n<p>A couple of specific examples:<\/p>\n<ul>\n<li><a href=\"https:\/\/www.vice.com\/en\/article\/xgz4n3\/muslim-app-location-data-salaat-first\">Data brokers acquire data from Muslim prayer apps and share it with a contractor who&#8217;s part of the ICE &#8220;supply chain&#8221;<\/a>, which then plugs it into algorithmic decision-making systems sold by companies like <a href=\"https:\/\/nymag.com\/intelligencer\/2020\/09\/inside-palantir-technologies-peter-thiel-alex-karp.html\">Palantir<\/a>. \u00a0 The People&#8217;s Privacy Act requires consent for this kind of data sharing and covers government agencies like ICE and the military (which SB 5062 exempts). 
\u00a0 SB 5116&#8217;s transparency requirements force these kinds of abuses to be disclosed.<\/li>\n<li><a href=\"https:\/\/www.nytimes.com\/2020\/12\/29\/technology\/facial-recognition-misidentify-jail.html\">Nijeer Parks was arrested for a crime he did not commit based on a bad facial recognition match<\/a>, and then denied bail because New Jersey\u2019s <a href=\"https:\/\/www.thecity.nyc\/justice\/2020\/3\/6\/21210469\/new-jersey-no-bail-system-eyed-by-new-york-leaders-reckons-with-bias-risk\" rel=\"noopener noreferrer\">no-bail algorithmic decision system<\/a> decided he was a risk. \u00a0The People&#8217;s Privacy Act prohibits facial recognition systems in places of public accommodation. \u00a0SB 5104&#8217;s facial recognition moratorium stops additional deployment of these biased and inaccurate systems. \u00a0SB 5116&#8217;s transparency requirements force disclosure of the algorithms used in the risk assessment system, and prohibit ongoing use if they are biased.<\/li>\n<\/ul>\n<p>Other important Washington state bills also tackle some of these ramifications. \u00a0SB 5010, for example, prohibits insurers from the discriminatory practice of using credit scores to determine rates. \u00a0HB 1078 restores voting rights to Washingtonians directly impacted by our biased criminal legal system. \u00a0The three bills I&#8217;m highlighting here complement these vital targeted fixes with broader solutions to the systemic issues.<\/p>\n<h2 id=\"background-links\">Background links<\/h2>\n<p>This is an incredibly complex space, and especially for people who are fairly new to it, there&#8217;s a lot to catch up on. 
\u00a0 Here&#8217;s a list of short articles (with a couple of longer pieces thrown in) that are useful background for the different bills.<\/p>\n<ul>\n<li>The Boston Celtics players&#8217; <a href=\"https:\/\/www.bostonglobe.com\/2020\/12\/16\/opinion\/governor-baker-regulating-facial-recognition-technology-is-racial-justice-issue\/\">Regulating facial recognition technology is a racial justice issue<\/a>, Veena Dubal&#8217;s <a href=\"https:\/\/www.theguardian.com\/commentisfree\/2019\/may\/30\/san-francisco-ban-facial-recognition-surveillance\">San Francisco was right to ban facial recognition. Surveillance is a real danger<\/a>, \u00a0and Microsoft researcher Luke Stark&#8217;s <a href=\"https:\/\/static1.squarespace.com\/static\/59a34512c534a5fe6721d2b1\/t\/5cb0bf02eef1a16e422015f8\/1555087116086\/Facial+Recognition+is+Plutonium+-+Stark.pdf\">Facial Recognition is the Plutonium of AI<\/a> make the case for a facial recognition moratorium. \u00a0EFF&#8217;s <a href=\"https:\/\/www.eff.org\/deeplinks\/2020\/12\/banning-government-use-face-recognition-technology-2020-year-review\">Banning Government Use of Face Recognition Technology: 2020 Year in Review<\/a> looks at the momentum for this kind of legislation in Portland, Massachusetts, and New Orleans.<\/li>\n<li>Deborah Raji&#8217;s <a href=\"https:\/\/www.technologyreview.com\/2020\/12\/10\/1013617\/racism-data-science-artificial-intelligence-ai-opinion\/\">How our data encodes systematic racism<\/a> and Julia Angwin&#8217;s <a href=\"http:\/\/www.pulitzer.org\/finalists\/julia-angwin-jeff-larson-surya-mattu-lauren-kirchner-and-terry-parris-jr-propublica\" rel=\"noopener nofollow\">Machine Bias<\/a> series are good introductions to algorithmic inequities. \u00a0 Ruha Benjamin&#8217;s <em>Race After Technology<\/em> has lots more detail. 
\u00a0 <\/li>\n<li>The Office of the Insurance Commissioner&#8217;s \u00a0<a href=\"https:\/\/medium.com\/commissioners-eye-on-insurance\/now-is-the-time-to-ban-credit-scoring-d506fe47ec29\">Now is the time to ban credit scoring<\/a> and Melissa Santos&#8217; <a href=\"https:\/\/crosscut.com\/news\/2020\/12\/bad-credit-shouldnt-mean-higher-insurance-rates-wa-official-says\">Bad credit shouldn&#8217;t mean higher insurance rates, WA official says<\/a> discuss SB 5010 and highlight how credit scores (computed by algorithmic decision systems, using data from data brokers that was acquired without people&#8217;s knowledge) discriminate against lower-income people and communities of color. <\/li>\n<li>The Parent Coalition for Student Privacy&#8217;s <a href=\"https:\/\/studentprivacymatters.org\/washington-privacy-act-sb5062-does-not-go-far-enough-to-protect-consumers-or-students\/\">SB5062 does not go far enough to protect consumers or students<\/a> looks at some of the problems with the Bad Washington Privacy Act.<\/li>\n<li><a href=\"https:\/\/www.vice.com\/en\/article\/jgqm5x\/us-military-location-data-xmode-locate-x\">How the U.S. Military Buys Location Data from Ordinary Apps<\/a> and <a href=\"https:\/\/www.vice.com\/en\/article\/xgz4n3\/muslim-app-location-data-salaat-first\">Leaked Location Data Shows Another Muslim Prayer App Tracking Users<\/a>, both by Joseph Cox in VICE, discuss some scary examples of app data being sold without users&#8217; knowledge or informed consent. 
\u00a0 <a href=\"https:\/\/www.theguardian.com\/world\/2018\/apr\/05\/nypd-muslim-surveillance-settlement\">NYPD settles lawsuit after illegally spying on Muslims<\/a> and the San Francisco Human Rights Commssion&#8217;s <a href=\"https:\/\/sf-hrc.org\/post-911-surveillance-and-profiling-arab-african-middle-eastern-muslim-and-south-asians-aamemsa\">Post 9\/11 Surveillance and Profiling of Arab, African, Middle Eastern, Muslim and South Asians (AAMEMSA) Communities<\/a> are important context, looking at suspicionless surveillance programs that went on for years after 9\/11.<\/li>\n<li><a href=\"https:\/\/www.brennancenter.org\/our-work\/analysis-opinion\/new-yorks-contact-tracing-privacy-bill-promising-model\">New York\u2019s Contact Tracing Privacy Bill: A Promising Model<\/a>, from the Brennan Center, talks about the New York state Covid legislation that several of us suggested the legislature look to as a model.<\/li>\n<li><a href=\"https:\/\/www.technologyreview.com\/2020\/12\/21\/1015303\/stanford-vaccine-algorithm\/\">This is the Stanford vaccine algorithm that left out frontline doctors<\/a>, on <em>MIT Technology Review,<\/em> looks at how even a simple algorithm can have disastrous consequences. \u00a0 <a href=\"https:\/\/www.statnews.com\/2020\/12\/21\/stanford-covid19-vaccine-algorithm\/\">3 lessons from Stanford\u2019s Covid-19 vaccine algorithm debacle<\/a> in STAT notes that &#8220;the tool appears not to have accounted for workers\u2019 actual exposure to the virus and changes to hospital rules and protocol during the pandemic.&#8221; \u00a0Transparency requirements help surface these kinds of problems. \u00a0<a href=\"https:\/\/www.ft.com\/content\/16f4ded0-e86b-4f77-8b05-67d555838941\">Algorithms and the Coronavirus Pandemic<\/a>, in the <em>Financial Times<\/em>, looks at other pandemic-related algorithmic issues and the growing &#8220;backlash&#8221; to automated decision systems. 
\u00a0<\/li>\n<li><a href=\"  https:\/\/arxiv.org\/pdf\/2001.00973.pdf\">Closing the AI Accountability Gap: Defining an End-to-End Framework for Internal Algorithmic Auditing<\/a>, by multiple researchers including Deborah Raji, \u00a0Timnit Gebru, and Margaret Mitchell. \u00a0 At the time, Gebru and Mitchell led Google&#8217;s Ethical AI initiative. \u00a0<a href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/12\/23\/google-timnit-gebru-ai-ethics\/\">Gebru has since been fired<\/a>, and <a href=\"https:\/\/venturebeat.com\/2021\/01\/20\/google-targets-ai-ethics-lead-margaret-mitchell-after-firing-timnit-gebru\/\">Google is currently targeting Mitchell as well<\/a>.<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Most recent update: January 30. Washington is justifiably looked to as a tech-savvy state, and legislation that gets passed here will influence other states and the national debate. \u00a0 It&#8217;s a challenging time to be doing state legislative work, but the mostly-remote session kicked off as scheduled on January 11 and things are off to 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-3738","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/posts\/3738","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/comments?post=3738"}],"version-history":[{"count":0,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/posts\/3738\/revisions"}],"wp:attachment":[{"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/media?parent=3738"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/categories?post=3738"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/2024.thenexus.today\/index.php\/wp-json\/wp\/v2\/tags?post=3738"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}