• deafboy@lemmy.world
    11 months ago

    In an alternative dimension:

    Facebook: “announces mind reading headset to animate imaginary body parts”

    People: “Nice try, CIA!”, “That’s big gender propaganda!”, “I’m not going to connect my brain to the internet!” “Not guilty, your honor. Facebook made me do it via the headset.”

    • Sloan the Serval@pawb.social
      11 months ago

      To be fair, if Metaverse did integrate something like this, they would definitely record telemetry data “for development purposes”.

      • Semi-Hemi-Demigod@kbin.social
        11 months ago

        No, they’d straight up say they were doing it to target ads at you. They think it’s a good thing, because we get “relevant” ads.

        • amio@kbin.social
          11 months ago

          They’d do both, and most likely be intentionally dishonest about which is which. Or just not give a shit, the fines aren’t that large for a behemoth like them.

  • Dukeofdummies@kbin.social
    11 months ago

    … oh, that’s interesting. Creation of a phantom “limb” with a brain-computer interface? I wonder how much control there is. Does it just wiggle? Is it purely binary up/down? Can they control the angle?

    I actually have a set of LED eyes that I control with puppetry. Last time I looked at BCIs, they were woefully incapable of what I wanted, but maybe I should look at this again…

    • Rai@lemmy.dbzer0.com
      11 months ago

      I have a pair of Necomimi ears and I have no idea how they work, but I wonder if this VRC mod is like that?

    • Ethanol@pawb.social
      10 months ago

      Edit: The twitter link shows a video which completely invalidates my previous comment. The ears do seem to be fluidly controllable.

      Previous comment:
      I would assume it’s just two states (ears up and ears down) that get switched between. Most VRChat avatars I have seen do exactly this, but through pressing a button rather than mind control.
      Even something as simple as this adds a lot of immersion! There are probably specific facial expressions to go with the ears as well.
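      The difference between the two designs boils down to how the headset signal is mapped onto the avatar parameter. A minimal sketch of both, assuming a normalized 0..1 signal (the function name and threshold are invented for illustration, not the mod’s actual API):

```python
def ears_parameter(signal: float, threshold: float = 0.5, binary: bool = True) -> float:
    """Map a normalized 0..1 headset signal to an avatar 'ears' parameter.

    binary=True  -> two states, like a button toggle (ears down / ears up)
    binary=False -> pass the value through for fluid, continuous ear motion
    """
    if binary:
        return 1.0 if signal >= threshold else 0.0
    # Clamp to the valid parameter range for continuous control.
    return max(0.0, min(1.0, signal))

print(ears_parameter(0.7))                # 1.0 (ears snap up)
print(ears_parameter(0.3))                # 0.0 (ears snap down)
print(ears_parameter(0.7, binary=False))  # 0.7 (partial, fluid raise)
```

      Binary mode only ever produces the two poses a button toggle would, while continuous mode preserves the whole range, which is what makes the motion in the video look fluid.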

      • TheColorRed@pawb.social
        10 months ago

        I assume the opposite, as in the video the ears move in a much more fluid manner. The same guy also made a separate component for emotions.

  • Protofox Riley@pawb.social
    10 months ago

    This is both freakin’ rad and also kinda insane. Gotta love how far furries / VRChat users will go to make something insane work.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net
    10 months ago

    I’m hella curious how EEG stuff works now, as opposed to the cheap piece of crap I had in the late 90s/early 2000s that worked on the same principle in its infancy. The thing I had let you set 3 inputs, and even just recording the right “thoughts” to trigger them was a PITA, let alone getting it to work while using it in a game or something.
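    That old three-input scheme boils down to thresholding a trained signal per input, usually with some debouncing so noise doesn’t fire triggers constantly. A rough sketch of the idea (the input names, thresholds, and hold counts are invented, not from any real headset SDK):

```python
from collections import deque

class TriggerDetector:
    """Fire a named input when its signal stays above threshold for
    `hold` consecutive samples -- a simple debounce so a single noisy
    spike doesn't spam triggers."""

    def __init__(self, name: str, threshold: float, hold: int = 3):
        self.name = name
        self.threshold = threshold
        self.recent = deque(maxlen=hold)

    def update(self, value: float) -> bool:
        """Feed one sample; return True when the input should fire."""
        self.recent.append(value >= self.threshold)
        return len(self.recent) == self.recent.maxlen and all(self.recent)

# Three user-trainable inputs, each watching an invented feature value.
jump = TriggerDetector("jump", threshold=0.6)
for sample in (0.7, 0.7, 0.7):
    fired = jump.update(sample)
print(fired)  # True only after three consecutive above-threshold samples
```

    The debounce is the part that would make “recording the right thoughts” such a pain: one stray sample resets the run, so the signal has to be held deliberately.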