And the details of the rulemaking matter a lot. Too often, well-intended regulation has weaknesses that commercial surveillance companies, with their hundreds of lawyers, can easily exploit. Looking at proposals through an algorithmic justice lens can highlight where they fall short. For example, here’s how the proposed American Data Privacy and Protection Act (ADPPA) consumer privacy bill stacks up against AJL’s recommendations:

Update, September 13: Color of Change’s Black Tech Agenda notes that “[b]y forcing companies to undergo independent audits, tech companies can address discrimination in their decision-making and repair the harm that algorithmic bias has done to Black communities regarding equitable access to housing, health care, employment, education, credit, and insurance.” ADPPA’s algorithmic impact assessments are too weak to protect civil rights — but it’s not too late to strengthen them goes into more detail.

More positively, other regulation in the US and elsewhere highlights that these issues can be addressed. Rep. Yvette Clarke’s Algorithmic Accountability Act of 2022, for example, requires companies performing AIAs “to the extent possible, to meaningfully consult (including through participatory design, independent auditing, or soliciting or incorporating feedback) with relevant internal stakeholders (such as employees, ethics teams, and responsible technology teams) and independent external stakeholders (such as representatives of and advocates for impacted groups, civil society and advocates, and technology experts) as frequently as necessary.”

AJL’s recommendation of directly involving the stakeholders most likely to be harmed by AI systems also applies to the process of creating regulations. In Hiding OUT: A Case for Queer Experiences Informing Data Privacy Laws, A. Prince Albert III (Policy Counsel at Public Knowledge) suggests using queer experiences as an analytical tool to test whether a proposed privacy regulation actually protects people’s privacy. Stress-testing privacy legislation with a queer lens is an example of how effective this approach can be at highlighting needed improvements. It’s also vital to look at how proposed regulations affect pregnant people, rape and incest survivors, immigrants, non-Christians, disabled people, Black and Indigenous people, unhoused people, and others who are being harmed today by commercial surveillance.

So I implore you, as you continue the rulemaking process, please make sure that the historically underserved communities most harmed by commercial surveillance are at the table – and being listened to.