• dustyData@lemmy.world · 6 months ago

      Look at Apple. They announced a pretty similar thing to Recall but managed to get praised as creative innovators by using the correct combination of buzzwords, creating a sense of privacy and security even though, from a technical point of view, they offer neither. Google learned that it is not the tech, it is the marketing. MS botched the optics while already on a downward reputational spiral; Apple nailed the optics by banking on their locked-in sla…users inside the walled garden. Google just has to figure out its own strategy for good optics on the tech.

        • dustyData@lemmy.world · 6 months ago

          Not one to one, but it is an AI that sees and records everything on the screen, plus device data, to predict user actions and give the AI some context to work prompts with. It’s still an app that sees your screen 24/7 and then feeds it to an LLM. Sure, Apple says it is local (but it will phone home if the task is too complex, sending your encrypted data along with it), they claim OpenAI will sandbox ChatGPT to prevent profiling (even though we have absolutely no reason to believe Altman is being sufficiently candid), and they say it will be opt-in (though we know Apple will present it in a way specifically designed for maximum FOMO).
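
          Roughly, the flow they describe boils down to something like this (a hypothetical sketch, not Apple’s actual pipeline; every name here is a placeholder I made up for illustration):

          ```python
          # Hypothetical sketch of a "local first, cloud fallback" assistant flow.
          # None of these names are Apple's; they only illustrate the shape of the thing.
          from dataclasses import dataclass

          @dataclass
          class ScreenContext:
              text: str   # e.g. OCR'd text scraped from whatever is currently on screen
              app: str    # foreground app the text came from

          def answer_on_device(prompt: str, ctx: ScreenContext) -> str | None:
              """Stand-in for the on-device model; gives up when the task is 'too complex'."""
              if len(prompt) + len(ctx.text) > 500:   # toy complexity heuristic
                  return None
              return f"[on-device] {prompt!r} answered using context from {ctx.app}"

          def answer_in_cloud(prompt: str, ctx: ScreenContext) -> str:
              """Stand-in for the phone-home path: this is where your context leaves the device."""
              payload = {"prompt": prompt, "context": ctx.text}   # encrypted in transit, but still sent
              return f"[cloud] answered with {len(payload['context'])} chars of your screen"

          def assist(prompt: str, ctx: ScreenContext) -> str:
              return answer_on_device(prompt, ctx) or answer_in_cloud(prompt, ctx)

          if __name__ == "__main__":
              ctx = ScreenContext(text="Flight AB1234 departs 9:40 AM, gate 22", app="Mail")
              print(assist("When does my flight leave?", ctx))
          ```

          The point of the sketch is the fallback line in assist(): whether your screen context stays local is decided by a complexity heuristic you never get to see.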

          • ArbiterXero@lemmy.world · 6 months ago

            Do you have sources on this?

            I haven’t seen any suggestion that Apple Intelligence is recording what you do beyond when you call for it…

            • dustyData@lemmy.world · 6 months ago

              I mean, you can look all over their marketing material. They’re not coy about it, they just gave it a euphemism: they call it “awareness of your personal context”. That is just code for “it sees and records every single thing you do”. The other euphemistic term is “product knowledge about your devices’ features and settings”. They even throw in contradictions: “it is aware of your personal information without collecting your personal information”??? How? How could I be aware of the plot of Frankenstein without ever “collecting” some form of record containing the plot of Frankenstein?

              • ArbiterXero@lemmy.world · 6 months ago

                Sooo that could be things like “address” fields on websites, which is mildly creepy but not “screenshot every 30 seconds” creepy. Still, it’s certainly vague enough to make me feel uneasy.

                • dustyData@lemmy.world · edited · 6 months ago

                  It’s vague on purpose. They saw the peasants savaging MS for giving too many details. It’s vague enough that most people won’t give it a second thought and will opt in anyway, because it’s the new shiny thing from Apple that they totally just invented and did not copy from other companies.

            • dustyData@lemmy.world · edited · 6 months ago

              No, QuickLook is a fancy preview; it has nothing to do with AI, and there’s no need to send or process the content. Apple Intelligence is more like Recall, recording everything to feed Siri and ChatGPT.

    • bigmclargehuge@lemmy.world · 6 months ago

      That’s not why it was made. Data collection is a titanically large industry. Why collect data only from specific programs when you can literally just set up a screen recorder and collect all of it?
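
      And the bar really is that low. A “record everything” screen recorder is a handful of lines; here is a minimal sketch using the real mss library (the folder name and capture interval are made up for the example):

      ```python
      # Minimal "record everything" sketch: grab the screen every few seconds
      # and pile the frames into a local folder. Uses the mss library (pip install mss).
      import time
      from pathlib import Path

      import mss

      OUT_DIR = Path("recall_frames")   # arbitrary folder name for this example
      OUT_DIR.mkdir(exist_ok=True)

      with mss.mss() as sct:
          for i in range(10):           # the real thing would loop forever
              sct.shot(mon=1, output=str(OUT_DIR / f"frame_{i:05d}.png"))
              time.sleep(5)             # one full-screen frame every 5 seconds
      ```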

      This is what happens when people are flippant about data collection. First, data collection isn’t there at all. Next, it’s there, but off by default. Then it’s on by default, but you can opt out. Then only certain aspects are opt-out. Flash forward 10 years and here we are.

      This stuff isn’t coming out of nowhere; it’s a slow build, because consumers consistently allow more and more egregious privacy violations to slip past because they “don’t care, the big corporations already have the data”.

        • bigmclargehuge@lemmy.world · 6 months ago

          I get it, my tinfoil hat is showing. But am I wrong? If people had said “wait, this is creepy, don’t let this slide” when data collection first became a thing, I really do believe Recall wouldn’t have happened.

          • s38b35M5@lemmy.world (OP) · 6 months ago

            To be clear, I fully agree with the comment of yours I replied to. It just reminded me of the poem by Martin Niemöller.

          • iopq@lemmy.world (mod) · 6 months ago

            We said it, they didn’t listen, we switched to Linux.

            If enough people had voted with their feet, they would have listened.

  • apfelwoiSchoppen@lemmy.world · 6 months ago

    Somehow I knew this would be a trend. It’s another body of data to store for supposed user benefit and then exploit later.