You’ve gone home with a Tinder date, and things are escalating. You don’t really know or trust this guy, and you don’t want to contract an STI, so… what now?
A company called Calmara wants you to snap a photo of the guy’s penis, then use its AI to tell you if your partner is “clear” or not.
Let’s get something out of the way right off the bat: You should not take a picture of anyone’s genitals and scan it with an AI tool to decide whether or not you should have sex.
If you can't trust AI medical startups operating out of Silicon Valley with pictures of your genitals, well...THEN WHO CAN YOU TRUST?
I mean, to be fair, it also looks like they might be partially financially backed by a foreign authoritarian regime, and they usually have pretty good AI models....so...
We are reaching the phase where AI is de facto a magic spell to be cast on reality, and AI startups are hyping this up. That, and taking pics of strangers' genitals is a dick move.
Man, what about false positives? A ruined date night at minimum, possibly ruined reputations and relationships.
Sorry, we put a picture of your junk into this box. We don't know what's in it, or what it does with the picture, but it says you have chlamydia, and I think the box looks trustworthy. Here's your divorce papers.
Or, if you're still concerned after the fact, see a doctor. Despite what your GOP neighbor might tell you, they're not all evil quacks, and they don't typically take pictures of your stuff either.
Single reason why this is suspicious from the start:
It's advertised not for checking yourself, but your one-night partner. If it were advertised for self-checking, it would be bombarded with lawsuits for fake medical advice.
This definitely won't be misused in any way that would completely destroy the good name of the person taking, or in the frame of, the image. It's just one "probable cause" search away from a bad day.
Maybe they will use the photo to match it with doctor notes and photos medically taken of the same penis or vagina to then illegally match them to illegally obtained health records. Probably not though.
“With lab diagnosis, sensitivity and specificity are two key measures that help us understand the test’s propensity for missing infections and for false positives,” Daphne Chen, founder of TBD Health, told TechCrunch.
HeHealth is framed as a first step for assessing sexual health; then, the platform helps users connect with partner clinics in their area to schedule an appointment for an actual, comprehensive screening.
HeHealth’s approach is more reassuring than Calmara’s, but that’s a low bar — and even then, there’s a giant red flag waving: data privacy.
“It’s good to see that they offer an anonymous mode, where you don’t have to link your photos to personally identifiable information,” Valentina Milanova, founder of tampon-based STI screening startup Daye, told TechCrunch.
This sounds reassuring, but in its privacy policy, Calmara writes that it shares user information with "service providers and partners who assist in service operation, including data hosting, analytics, marketing, payment processing, and security." It also doesn't specify whether these AI scans take place on your device or in the cloud, and if the latter, how long that data remains in the cloud and what it's used for.
Calmara represents the danger of over-hyped technology: It seems like a publicity stunt for HeHealth to capitalize on excitement around AI, but in its actual implementation, it just gives users a false sense of security about their sexual health.