⚖️ The ACLU wins a lawsuit that lets us peek into algos and Washington state gets a so-so face rec tech law
🔫🌡️ Computer, is that a gun or a hand-held thermometer? Depends on your skin color!
Jackie Snow | Apr 3
The ACLU won a lawsuit that challenged the constitutionality of the Computer Fraud and Abuse Act, which makes it a federal crime to access a computer in a manner that “exceeds authorized access.”
This law has stopped researchers and journos from digging into a company to see if any of its algos were discriminating. The ACLU specifically challenged a part of the CFAA that gave a lot of power to a company’s terms of service:
Those terms, which are unilaterally set by individual sites and can change at any time, often prohibit researchers and journalists from creating tester online identities or recording what content is served up to those identities. These practices were used by, for example, investigative journalists who exposed that advertisers were using Facebook’s ad-targeting algorithm to exclude users from receiving job, housing, or credit ads based on race, gender, age, or other classes protected from discrimination in federal and state civil rights laws.
I wonder what sort of investigations we will see coming out over the next few months because of this ruling. The Intercept was a named plaintiff in the lawsuit, and I can see ProPublica and The Markup having a spring in their step over the news.
Washington state passed and signed into law a bill that regulates face rec tech. This WSJ article describes it as “Microsoft-backed,” which is a euphemism for “written by Microsoft and sponsored by a legislator who is also a senior program manager at the company.”
It’s better than nothing, I guess, but the previous House Bill 1654 would have temporarily banned government use of face rec tech until Washington voters had a chance to come to grips with it, which is what the ACLU and other privacy-minded folks were pushing for.
Other states are following suit, even lifting language directly from the bill. The ACLU is pretty apoplectic over the passage (and makes some good points!), but I’m not convinced that no legal framework is better than this. Then again, if anyone could disappoint me, it’s big tech.
On Twitter, Nicolas Kayser-Bril showed what Google’s computer vision API thought of a couple of photos of different people using hand-held thermometers. If you’re black, the algo sees a gun. Since the pose might have been influencing the algo, someone else even cropped the photo on the left down to just the hand, photoshopped a copy to make the hand white, and the gun label still appeared.
Speaking of bias, it’s also in voice rec systems, which some researchers highlighted with a spoken word piece.
This is a fantastic look at why using AI to spot COVID-19 in CT scans right now is not a great idea.
Your phone plus some computer vision algos can check your urine for all sorts of issues, all from your own bathroom.
Oh wow, you mean algorithms, and even fancy machine learning ones, aren’t great when it comes to sussing out social service or criminal justice outcomes?
From a Buzzfeed article that grapples with what civil liberty folks want to do when it comes to balancing surveillance and public health right now:
My frustration is that we have this giant surveillance network deployed and working. We have location tracking. We have people carrying tracking devices on them all the time. But we’re using it to sell skin cream — you know, advertising. And we’re using it to try to persuade investors to put more money into companies. Since that exists and we have this crisis right now, let’s put it to use to save lives.
Until there are so few guns out there that computer vision is stumped when it sees one,