• Ogmios · 3 points · 6–2 months ago

      Because idiots salivate any time something novel promises to confirm their biases.

        • @[email protected]
          link
          fedilink
          English
          52 months ago

          The bias is believing any nonsense about “AI”. It’s widespread and hegemonic at this point.

        • Ogmios · -12 points · 2 months ago

          That shacking up with random people can/should be perfectly safe.

            • Ogmios · 3 points · 2 months ago

              Nobody said a bias can’t be stupid.

    • @best_username_ever · 9 points · 2 months ago

      Yet another AI scam is never a bad idea if you want to get funds.

  • @[email protected]
    link
    fedilink
    English
    372 months ago

    How was it supposed to work? Was it supposed to scan received dick pics for anything gross? Because people do have eyes they could use…

  • @[email protected]
    link
    fedilink
    English
    332 months ago

    How will we ever figure out who has an STI without predictive A.I.? If only there were tests.

  • @[email protected]
    link
    fedilink
    English
    112 months ago

    I read “daters” as “dealers” and I ran the whole gamut of emotions in about a half second.

  • @[email protected]
    link
    fedilink
    English
    82 months ago

    I think this is a valuable app… Not the app itself, but an API that other dating apps could link to, letting you filter out anyone with poor enough judgement to have sent photos of their crotch to this company.

  • @[email protected]
    link
    fedilink
    English
    2
    edit-2
    2 months ago

    I have to admit it was a solid idea, though. Dick pics should be one of the best training sets you can find on the internet, and you can assume that the most prolific senders are the ones with the lowest chance of having an STI (or any real-life sexual activity).

  • AutoTL;DR (bot) · 2 points · 2 months ago

    This is the best summary I could come up with:


    HeHealth’s AI-powered Calmara app claimed, “Our innovative AI technology offers rapid, confidential, and scientifically validated sexual health screening, giving you peace of mind before diving into intimate encounters,” but now it’s shut down after an inquiry by the Federal Trade Commission (FTC).

    The letter lays out some of the agency’s concerns with the information HeHealth relied on for its claims, including a claim that the app could detect more than 10 sexually transmitted infections with up to 94 percent accuracy.

    Given that most STIs are asymptomatic, according to the World Health Organization, medical professionals have questioned the reliability of the app’s tactics.

    One Los Angeles Times investigation found that Calmara couldn’t even discern inanimate objects and failed to identify “textbook images” of STIs.

    The FTC issued a civil investigative demand (similar to a subpoena) seeking information about Calmara’s advertising claims and privacy practices and put HeHealth on notice that it’s illegal to make health benefit claims without “reliable scientific evidence.”

    The FTC said it would not pursue the investigation further since HeHealth agreed to those terms and because of “the small number of Calmara users and sales in the U.S.” But, it warned, “The Commission reserves the right to take such further action as the public interest may require.”


    The original article contains 523 words, the summary contains 207 words. Saved 60%. I’m a bot and I’m open source!