🛥️ Autonomous boats map unexplored lake bottoms and India's embrace of face rec tech
👩 AI creates photos of topless women that look like mugshots...if that is your thing
Jackie Snow | Mar 13
Hi all! There aren’t many updates on AI’s impact on COVID-19 this week, but my breakdown last Friday is still relevant. May this COVID-free newsletter be a break from the nonstop pandemic updates.
Driverless cars get most of the news coverage, but researchers want boats to go captainless, too. This story in The Verge is a nice profile of one ship named BEN and all the issues that come with trying to autonomously move a speck of welded metal safely through expansive, volatile waters, all while collecting data.
My favorite narrative about AI is how it is creating a new era of discovery — not only in what datasets, algos, and GPUs can do, but in revealing parts of our physical world that were previously beyond our reach. Ocean exploration is probably the best example of this. Compared to terra firma, oceans are unknown:
Mapping is still largely done by boat, and unlike satellites, boats need crews. It’s expensive, time-consuming work, and especially difficult in water that is shallow, rough, or remote. It’s why we know comparatively little about what lies beneath the surface of our oceans and lakes — by some estimates we’ve mapped just 9 percent of the world’s oceans to modern standards — and why BEN and vehicles like it hold so much promise.
The article does gloss over a major push behind the rush to map the ocean floor: deep-sea mining. It’s already happening, even without detailed maps or a great understanding of what blowing holes into the ocean floor might do. Shell funded an XPRIZE on creating ocean floor maps, and not out of the kindness of their hearts. It might be better if we have good maps so these companies don’t destroy important ecological areas, or it might just turbo-charge the race to extract more resources.
India owned up to using face rec tech to identify 1,100 people who were at a recent protest (called riots or violence in some publications). An official said the photos came from identity cards, driving licenses, and “other databases,” which likely refers to the biometric records of over one billion Indians compiled by the government.
Despite sectarian conflict over a new law that some say would disenfranchise Muslim citizens, and rising Hindu extremism under Prime Minister Modi, officials promise that the face rec tech is fair:
“This is a software. It does not see faith. It does not see clothes. It only sees the face and through the face the person is caught.” - Amit Shah, India’s home minister.
The arguments for face rec tech almost always start with how it can be useful for finding missing children. That was the justification for India’s program, which, as the Internet Freedom Foundation (IFF) pointed out, had an accuracy rate of one percent (!) and couldn’t distinguish boys from girls (!!).
The tech does, however, work well enough that it found over 10,000 missing children — but they were in child care homes, discovered by matching photos to images the institutions had compiled, not via CCTV or videos taken by police officers.
That success does not justify the scope creep. Indians don’t currently have laws protecting them from this sort of tech and the tabled Personal Data Protection Bill, which has been compared to the GDPR, would actually empower the government to get even more data.
I’m not sure why Tupac is the testing ground for tech (remember the Coachella hologram performance?), but we now have a deepfake of him rapping along to Snoop Dogg lyrics.
Google created a shoe insole that tracks soccer players and analyzes their footwork via machine learning algos.
A new way to subtly change your online photos so once they inevitably get scooped up by the next Clearview AI, they are less useful to the algos trying to identify you.
Google released a tool to get AI and quantum computing working more closely together, even though it’s unclear if there is any benefit (yet).
The ACLU wants to know what the hell the government is doing with face rec tech at airports and the border.
A startup thinks AI could “nudge” us into better, less-biased behavior.
Vermont’s Attorney General is taking the first shot in the (lawsuit) war on Clearview AI, saying the company is in violation of multiple state laws.
Literally no one:
Some anonymous developers: We built an AI that generates nude photos of (fake) women.
Until we have AI that makes working from home social enough that I stop talking to the stove after a long day,