Hello c/Selfhosted!

Although I’m still new to TrueNAS, I’ve been a happy TrueNAS SCALE self-hoster for about a year, and I’ve been expanding what my server does little by little.

The problem came when I decided to add Jellyfin and a GPU for encoding. My server is mostly built from old parts, and the GPU is no different. TrueNAS SCALE recognizes it as an “Advanced Micro Devices, Inc. [AMD/ATI] Cape Verde PRO [Radeon HD 7750/8740 R7 250E]”, which AFAIK supports hardware encoding/decoding according to the Jellyfin wiki.

But the only places I can see the GPU are in lspci and in System Settings/Isolated GPU PCI Ids (and it’s not isolated). Whenever I try to change an app’s configuration to allocate the GPU, the only option I can select is “Allocate 0 amd.com/gpu GPU”; there are no others.

I’ve searched for this a lot, but I’ve found very little information about AMD GPUs or how to debug this issue.

Am I missing something? Could anybody point me in the right direction? Are there any commands I can run to diagnose this?
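For anyone else hitting this, these are the standard commands for checking which kernel driver (if any) has claimed the card; their output naturally varies by system:

```shell
# Show the GPU and, crucially, which kernel driver is bound to it.
# "Kernel driver in use:" should say amdgpu (or radeon for the legacy driver);
# if that line is absent, no driver has claimed the card.
lspci -k | grep -A 3 -i vga

# A working GPU driver exposes device nodes here (card0, renderD128, ...).
# If /dev/dri is empty or missing, apps have nothing to allocate.
ls -l /dev/dri

# Kernel messages from the two candidate drivers, including load errors.
dmesg | grep -iE 'amdgpu|radeon'
```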

Thanks for reading!

  • FenixinOP
    2 months ago

    Not sure how to change the driver to amdgpu. I blacklisted the radeon driver, but the kernel didn’t load the other one. I read somewhere that I have to do an initramfs update, but that command doesn’t exist in TrueNAS SCALE. How do I force the kernel to load the other driver?
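    For context, on a stock Debian system the blacklist mechanism looks like the sketch below. TrueNAS SCALE manages its own boot environment, so these files may not persist across updates; treat this as an illustration of the mechanism, not a supported TrueNAS procedure:

    ```shell
    # /etc/modprobe.d/blacklist-radeon.conf -- keep the legacy driver out
    blacklist radeon

    # On Debian-style systems you would then rebuild the initramfs with
    # "update-initramfs -u" so the blacklist applies at early boot.
    # TrueNAS SCALE does not ship that command (its root filesystem is
    # managed by the appliance), which is why it appears to be missing.
    ```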

    • chameleon
      2 months ago

      For that card, you probably have to set the radeon.si_support=0 amdgpu.si_support=1 kernel options to allow amdgpu to work. I don’t have a TrueNAS system lying around, so I don’t know the idiomatic way to change them there.
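      Concretely, on a generic GRUB-based distro the switch would look like the sketch below (the module parameters are the real radeon/amdgpu options for Southern Islands cards; the GRUB file is just an example, since TrueNAS exposes boot options its own way):

      ```shell
      # /etc/default/grub -- generic GRUB example, NOT a TrueNAS path.
      # Tells the legacy radeon driver to skip SI (GCN 1.0) cards and
      # tells amdgpu to claim them instead.
      GRUB_CMDLINE_LINUX="radeon.si_support=0 amdgpu.si_support=1"

      # After rebooting, confirm the current boot actually has the options:
      cat /proc/cmdline

      # ...and check which driver claimed the card ("Kernel driver in use:").
      lspci -k | grep -A 3 -i vga
      ```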

      Using amdgpu on that card has been considered experimental ever since support was added some 6 years ago, and nobody has invested any real effort in stabilizing it. It’s entirely possible that amdgpu on that card is simply never gonna work. But the radeon driver isn’t really fully functional anymore either, so I guess it’s worth a shot…

      • @[email protected]
        2 months ago

        “Using amdgpu on that card has been considered experimental ever since it was added like 6 years ago”

        If I recall right, it hasn’t been enabled by default simply because it’s missing some features, like analog TV-out support (which most people don’t want or need in 2024).

      • FenixinOP
        2 months ago

        I really want to try it, but I’ve had a few difficult days… I hope I can try this tomorrow.

      • FenixinOP
        1 month ago

        I’m giving up on this. I have tried everything and I can’t make it work, so bye bye GPU.