🧒 AI identifies child sexual abuse victims, Tesla removes software from a used car with no heads-up
🐶 Robodog gets put to work
Jackie Snow | Feb 16
Sorry for the lateness :(
Clearview AI, the tech world’s current punching bag (for good reason), just got some partially good coverage. This NYT story covers how different law enforcement agencies are using the tech to identify children being sexually abused in photos.
This earlier, non-AI-focused NYT story makes the case for how out of hand the sharing of child abuse images has gotten on the web and how badly help is needed. The problem has grown because photos are easier to disseminate, abusers have learned how to beat the system, and law enforcement is stretched thin. But one investigator called Clearview “the biggest breakthrough in the last decade” in the field of child sexual abuse crimes. It’s apparently already identifying children, who can then be pulled out of some really horrific situations.
This sort of use (along with finding kidnapping victims) is what cops trot out any time this type of tech gets criticized. It’s not a good or even honest rebuttal, for many reasons, including what a small subset of crimes those two use cases cover. The article also rightly points out that there can be a lot of misidentifications: face rec tech is not great with kids, whose features change rapidly and who aren’t well represented in the training sets to begin with.
What was a little confusing was the CEO saying they first tried to use the tech to identify the perpetrators. As far as I know, the first rule of child abuse image-making is not to show any identifying parts of your own body, like faces, unique scars, or tattoos. And Clearview says the software only looks at faces anyway. I’m not sure if he said this to try to cover his butt for having so many photos of kids on his platform or what, but this particular point makes no sense.
This story is from last week but is so wild I’m covering it now. A man bought a used Tesla that had Enhanced Autopilot and Full Self Driving Capability (which doesn’t actually do autonomous driving, despite the name) installed and turned on. For some reason, Tesla removed Autopilot, without notice, after he brought the car home.
The two features cost $8,000 together and were clearly on the car when it was sold. While Tesla hasn’t commented on the situation, it’s possible the company views the software as something that has to be rebought every time the car changes ownership, which is a very computer industry way of thinking of things, but not a very automotive industry way of doing business.
I’ve seen a quip that it doesn’t make sense to buy from a billionaire with a sensitive ego who could brick your car over a tweet. Too many cases of software being snatched from cars will piss off a lot of people and probably get more than just the SEC breathing down Musk’s neck.
Maybe the Boston Dynamics Spot robodog has its uses after all.
A cute lil AI bot is helping keep Wikipedia up-to-date.
A Michigan state algorithm falsely accused more than 40,000 people of fraud, and now those folks are fighting back.
Students are planning a National Day of Action to ban face rec tech from college campuses.
A new “girls-only” app uses AI to screen a user’s gender by analyzing bone structure in a photo, and you’re out of luck if you’re trans!
Homeland Security has used face rec tech on over 43 million people and, if I’m reading this correctly, only found 252 (not a typo) cases of people using fake docs?
Until we have a robodog Iditarod,