🏙️ Cities set up rules around police surveillance tools and the Feds confirm face rec tech is biased

Google's AI gives researchers more time to look at cute bear photos snapped by camera traps

Almost 24 hours late! I blame the jetlag. Sorry!

The Good


I wrote about how cities across the country are passing laws to give local control over police surveillance tech. The laws, called Community Control Over Police Surveillance (CCOPS), were created by the ACLU in response to the increasing use of surveillance tech that was being put in place with no input from the people it was being used against. Face rec tech made the need even more urgent.

Each city (or state, in the case of Maine) writes its own law tailored to the specific needs of the community. It’s passed in about 10 cities, with more in the works (the ACLU lawyer who runs the program told me he can’t keep track of all the places working on CCOPS at this point).

Some places are outright banning face rec tech, while others, like St. Louis, which I profiled, are requiring all purchases to be approved by locally elected officials without explicitly outlawing anything. Critics say that there is a place for this tech (they love mentioning finding kidnapped kids) and that the rules should be more lenient for a developing tech.

Fortunately, even places that have banned face rec tech can revisit their laws after the AI gets better. This is an overdue course correction to rein in the increasingly militarized police departments across the country. It’s also just in time to limit law enforcement’s use of autonomous systems.

The Bad


We’ve known face rec tech performs poorly on darker skin, especially for women. Now the Feds are adding their own study showing that Asian and African Americans are up to 100 (!!!) times more likely to be misidentified than their white counterparts. Native Americans, kids, women, and the elderly also didn’t fare all that well.

The test looked at 189 algorithms voluntarily submitted by 99 companies, academic institutions, and other developers. Amazon’s system, used by at least two police departments, wasn’t part of the study. While the company hasn’t responded to this specific study, in the past it has said its cloud-based software is hard to test, which is very confusing since other cloud-based software has been tested without major issues. And if it’s hard to test, maybe it shouldn’t be out there in the hands of the non-tech-savvy? Just a suggestion.

Anyway, it’s not clear exactly what the mounting evidence will achieve. We see CCOPS laws taking off, but we’ll probably be subjected to more biometric collection by private companies (which CCOPS laws usually don’t cover) if people don’t start making a big fuss about it, right now.

More News


Google’s AI is helping supercharge a camera trap effort by automating all the boring parts.

ImageNet is an important dataset used to train a lot of computer vision algos. It also has some bias issues, which the creators are trying to fix.

An NYT doc about autonomous weapon systems. And unlike two films produced last year about AI concerns, this one includes women working on the issue!

Seattle is not so keen on biometric and face rec tech at its airport.

A major review of all the surveillance tech in China, and how it is all being woven into a giant Big Brother apparatus.

Dems want the Department of Housing and Urban Development to review face rec tech in federal housing.

The US needs to spend about $25 billion on AI by 2025 or risk losing its status as the preeminent world power.

At the big AI conference earlier this month, there was a lot of talk about machine learning combating climate change, how to think about the carbon footprint of training/running algos, and how to make sure AI development doesn’t reinvent the wheel “badly.”


Just went through my first plane boarding with face rec tech :(