🤖 Robots and other AI stepping up to fight COVID-19, and will police give in to the face rec tech temptation?

🤦 AI to stop you from touching your face!!!

Welcome new subscribers! Let me know what you think of the newsletter. I’ll see you here again next week, same time, same tech.

The Good

A World Health Organization report released last month said that AI has been used in “preparedness, readiness and response” for the COVID-19 outbreak. Here are some of the ways the tech has already been used:

Protecting human workers. I love how these robots were put to work so quickly to disinfect and improve telemedicine. We will only see more roll out (probably literally) in the future. When it comes to pandemics, including this one, medical workers are on the front lines and die at high rates. Taking any pressure off them not only saves their lives but protects others, since healthcare isn't interrupted.

Better maps. One AI detected the outbreak before it was announced, and now a different project maps the outbreaks as they happen. This is an improvement over daily maps, which, for a fast-moving pandemic, don't quite cut it.

A faster vaccine. This one hasn't quite happened yet, but Google's DeepMind used AI to predict the structure of the proteins that are likely behind COVID-19. Knowing the viral protein structures will be necessary for creating a vaccine.

The Bad

A company is trying to sell body cameras with real-time face rec tech built in to American police departments, with at least one beta test already underway, according to a story in OneZero. Until now, that has been a line in the sand not crossed in the U.S. (as far as we know).

There is more we don’t know:

There are still unanswered questions about Wolfcom’s live facial recognition software. The company has made no claims about how accurate its software actually is or what databases it accesses to match suspects. It’s unclear whether the company made the facial recognition algorithm itself or is licensing it from a third party. And as for the beta test, it’s unknown how many police departments have opted in to trial the technology.

Beyond the work that privacy and human rights groups have done to get this still-unreliable tech banned in cities and states across the country, even companies have been wary of rolling it out. Axon's (formerly known as Taser) AI ethics board recommended against adding the tech to the body cameras Axon sells, advice the company has so far followed.

But if some companies start to include it, it's unlikely that others will continue to hold back. Congress is looking at several bills that would regulate the technology, but let's not hold our breath for that legislative body to save us.

While these aren't quite body cams, here are some examples of what expanded surveillance looks like, closer to home than the China examples that always get trotted out:

Another OneZero story (Dave has been busy) reports that Buenos Aires residents have been living under real-time face rec tech like this for the past year.

And while police access isn't quite real-time (I think), Utah residents are under constant AI surveillance, with the system pulling in a lot of data for predictive policing.

More News


An AI to detect if you’re touching your face. Watch the video (with an added honk warning) here.

Clearview AI updates:

The face rec tech company tried to get 15 years of American mug shots for its dataset. Australian police were using Clearview AI and lied about it. The beta testers of the tool were a bunch of rich folks who did whatever they wanted with it. Oh, and they also want to get into a new business: making surveillance cameras.

An algo generated every possible melody (within a constrained note range and length), and then the researchers behind the tech released the whole set into the public domain.
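The brute-force idea is simple enumeration: treat a melody as a fixed-length sequence of pitches and generate every combination. Here's a minimal sketch; the scale and melody length below are illustrative assumptions, not the parameters the actual project used (it worked at a vastly larger scale).

```python
from itertools import product

# Illustrative assumptions: an 8-pitch scale and 4-note melodies.
# The real project enumerated far longer melodies over a larger range.
SCALE = ["C", "D", "E", "F", "G", "A", "B", "C2"]
MELODY_LENGTH = 4

# Every possible ordered sequence of pitches of the given length.
melodies = list(product(SCALE, repeat=MELODY_LENGTH))

print(len(melodies))  # 8 ** 4 = 4096 distinct melodies
```

The count grows exponentially with melody length, which is why "every possible melody" only becomes tractable once you fix the note range and length up front.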

The face rec tech experiment by the London police has so far produced an 88% false positive rate, with seven people wrongly flagged.

A biased algorithm is being sold to lawyers as a tool to pick the "best" jurors by scanning social media and public records.

The ACLU is suing ICE over an algo that is supposed to decide whether people with immigration violations should be released or detained, but which went from recommending detention for 53 percent of people to 97 percent.


Until we get a new AI-powered Untitled Goose Game where we drop the coronavirus into a lake,