As a medical doctor, I make extensive use of digital voice recorders to document my work, and my secretary does the transcription. As a cost-saving measure, the process is soon to be replaced by AI-powered transcription trained on each doctor's voice. As I understand it, the resulting model is not stored locally and I have no control over it whatsoever.

I see many dangers: the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably already enough recordings of me on the Internet to recreate my voice, but that's beside the point. Also, my question is about educating them; it isn't a legal one.

How do I present my case? I'm not willing to have a non-local AI transcribe my voice, but I don't want to be perceived as a paranoid nutcase. Preferably, I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately they are totally ignorant of the field of technology, and the explanations/examples need to translate for the lay person.

  • ubergeek77@lemmy.ubergeek77.chat · 9 months ago

    Even if this gets implemented, I can’t imagine it will last very long with something as completely ridiculous as removing the keyboard. One AI API outage and the entire office completely shuts down. Someone’s head will roll when that inevitably happens.

    • FlappyBubble@lemmy.ml (OP) · 9 months ago

      Ah sorry, I meant removing the option of using the keyboard as an input method in the medical records system. The keyboards themselves aren't physically removed from the client computers.

      But I agree that in the event of a system failure the hospital will halt.

      • ubergeek77@lemmy.ubergeek77.chat · 9 months ago

        Also, one angle could be to get the permission of someone in leadership, clone their voice on ElevenLabs, and have it say something particularly problematic, just to stress how easily voice data can be misused.

        If this AI vendor is ever breached, all an attacker has to do is robocall patients pretending to be a real doctor they know. I don't think I need to spell out how poorly that would go.
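To make concrete how low the barrier is, here is a minimal sketch in the spirit of the comment above, assuming an ElevenLabs-style REST API. The endpoint paths, header name, and field names are assumptions and may not match the vendor's current documentation; the requests are only constructed here, never sent.

```python
# Hypothetical sketch: how little code a cloned-voice attack takes.
# Assumes an ElevenLabs-style REST API; endpoint paths, the API-key
# header, and field names are assumptions, and the requests are
# deliberately built but never sent.
import json

API_BASE = "https://api.elevenlabs.io/v1"  # assumed base URL


def build_clone_request(sample: bytes, api_key: str) -> dict:
    """Build (but do not send) a voice-cloning request from one short recording."""
    return {
        "url": f"{API_BASE}/voices/add",
        "headers": {"xi-api-key": api_key},
        "data": {"name": "cloned-doctor-voice"},  # hypothetical label
        "files": {"files": ("dictation_sample.mp3", sample)},
    }


def build_tts_request(voice_id: str, text: str, api_key: str) -> dict:
    """Build (but do not send) a text-to-speech request in the cloned voice."""
    return {
        "url": f"{API_BASE}/text-to-speech/{voice_id}",
        "headers": {"xi-api-key": api_key, "Content-Type": "application/json"},
        "body": json.dumps({"text": text}),
    }
```

The point of the sketch: a breached vendor database of voice models would hand an attacker exactly the inputs these two calls need, and everything else is just a phone line.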