Does warmer mean temperature? Color? Something else?

  • @Mandy
    13 points · 7 months ago

    my man, it's a blizzard and indoors, what part of that has any more nuance than being beaten over the head with the answer?

    • @[email protected]
      20 points · edited · 7 months ago

      The different snow images have different color tones, some matching that of the example image. The center image has a cool color tone, which doesn’t match. Captchas are made to defeat AI logic, so sometimes the answer isn’t the obvious thing. It could very well be that you’re meant to select all the images that match the color tone, something a bot might not work out. It could also just be selecting the indoor images. I wouldn’t know for certain until I got one of these and succeeded or failed. Personally, I think it would be too easy for a bot to just ignore all the images that have snow or are mostly white, because those don’t resemble the example image at all.

      edit: and in case it needs to be said, getting beaten over the head by anything doesn’t involve nuance. That’s the opposite of nuance.
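
      For what it’s worth, here is a minimal sketch of the kind of color-tone heuristic described above, assuming Python with the Pillow library; the tile file names and the matching threshold are made up for illustration, and this is only the sort of shortcut a bot might try, not how any real solver works:

      # Naive color-tone matcher: score each tile by how "warm" its average color is
      # and keep the tiles whose tone is close to the example image's.
      from PIL import Image, ImageStat

      def warmth(path):
          # Mean red minus mean blue: positive = warm tones, negative = cool tones.
          r, _, b = ImageStat.Stat(Image.open(path).convert("RGB")).mean
          return r - b

      example = warmth("example.jpg")            # hypothetical file names
      tiles = [f"tile_{i}.jpg" for i in range(9)]

      # Keep the tiles whose tone sits within an arbitrary distance of the example's.
      matches = [t for t in tiles if abs(warmth(t) - example) < 20]
      print(matches)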

      • @[email protected]
        10 points · 7 months ago

        Captchas aren’t made to “defeat AI logic”; the human-detection part happens largely outside the picture selection. The picture selection is for training AI. In this case you are training an AI to distinguish the (potentially abstract) concept of warmth.
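
        To illustrate the “happens outside the picture selection” part: here is a rough sketch of the server-side check a site runs against a reCAPTCHA-style siteverify endpoint, assuming Python with the requests library; the secret key and the score threshold are placeholders. The pass/fail decision rides on a behavioural verdict, not on which tiles got clicked:

        # Rough sketch of the server-side half of a reCAPTCHA-style check,
        # assuming the requests library; the secret key is a placeholder.
        import requests

        def verify_captcha(token: str, secret: str = "YOUR_SECRET_KEY") -> bool:
            resp = requests.post(
                "https://www.google.com/recaptcha/api/siteverify",
                data={"secret": secret, "response": token},
                timeout=10,
            ).json()
            # "success" reflects the service's own human/bot judgement (browser and
            # behaviour signals); v3-style responses also carry a 0 to 1 risk score.
            # Which picture tiles were clicked never reaches this code.
            return resp.get("success", False) and resp.get("score", 1.0) >= 0.5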

        • @[email protected]
          -3 points · edited · 7 months ago

          Semantics, whatever. In truth it’s both, if you stop long enough to actually think about it instead of parroting other replies that solely focus on AI training.

      • @[email protected]
        7 points · 7 months ago

        I couldn’t get past “pick the smallest animal”

        There was a large picture of a hummingbird, and a tiny panda. Both choices were wrong, apparently. They probably meant that I should pick the pettiest animal.

      • @[email protected]
        2 points · 7 months ago

        Captchas are made to defeat AI logic, so sometimes the answer isn’t the obvious thing. It could very well be that you’re meant to select all the images that match the color tone, something a bot might not work out.

        IMO the idea here is that most users are not thinking very hard, so they are going to see the word “warmer”, think “snow = cold”, and leave their analysis at that. AI, on the other hand, is going to put more effort into interpreting the specific meaning of the request in the context of the images. The primary challenge for captchas now is to defeat AI, so the captcha ideas that get through probably did so because they gave the AI trouble in testing but did not give most users trouble.

        I think that going forward, people who put thought into following specific directions accurately are going to have a lot of trouble with captchas.
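
        As a hypothetical sketch of what “AI interpreting the request in context of the images” could look like, here is a bot handing the whole captcha to a vision-language model; this assumes the openai Python package, a multimodal model name, and placeholder image URLs, none of which come from the thread:

        # Hypothetical: feed the instruction plus all nine tiles to a multimodal model
        # and let it interpret "warmer" in context.
        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        tiles = [f"https://example.com/tile_{i}.png" for i in range(9)]  # placeholders
        content = [{"type": "text",
                    "text": "Select all images that are warmer. Reply with tile numbers only."}]
        content += [{"type": "image_url", "image_url": {"url": u}} for u in tiles]

        reply = client.chat.completions.create(
            model="gpt-4o",  # assumed multimodal model
            messages=[{"role": "user", "content": content}],
        )
        print(reply.choices[0].message.content)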

        • @[email protected]
          2 points · 7 months ago

          I can agree with most of this. Still don’t know for certain which interpretation would be the correct one for this captcha. Thank you for acknowledging that defeating AI is one of the goals, which seems obvious since they’re meant to determine if you’re human. idk why that’s difficult for some people to accept.

      • @[email protected]
        1 point · 7 months ago

        There’s a sample picture of a living room, then pictures of living rooms and snowy houses.

        • @[email protected]
          1 point · edited · 7 months ago

          Yes, thank you for your entirely original response, and for demonstrating again that many people lack depth in their thinking. Honestly, it’s sad that you replied to that explanation with this.

          I’m also starting to believe that some of you don’t understand what “sometimes the obvious-looking answer isn’t the correct one” means.