We are excited to announce that Arch Linux is entering into a direct collaboration with Valve. Valve is generously providing backing for two critical projects that will have a huge impact on our distribution: a build service infrastructure and a secure signing enclave. By supporting work on a freelance basis for these topics, Valve enables us to work on them without being limited solely by the free time of our volunteers.

This opportunity allows us to address some of the biggest outstanding challenges we have been facing for a while. The collaboration will speed up progress that would otherwise take much longer for us to achieve, and will ultimately unblock us from finally pursuing some of our planned endeavors. We are incredibly grateful to Valve for making this possible and for their explicit commitment to help and support Arch Linux.

These projects will follow our usual development and consensus-building workflows. [RFCs] will be created for any wide-ranging changes. Discussions on this mailing list as well as issue, milestone and epic planning in our GitLab will provide transparency and insight into the work. We believe this collaboration will greatly benefit Arch Linux, and we look forward to sharing further developments on this mailing list as work progresses.

  • wallmenis@lemmy.one · 2 months ago

    That sounds awesome. I never understood how a TPM can keep an attacker from getting the keys when the TPM is on the same machine. Does it independently check the signature of the application that asked for the keys?

    • ricecake · 2 months ago

      Depends on the vendor for the specifics. In general, they don’t protect against an attacker who has gained persistent privileged access to the machine, only against theft.
      The key either can’t leave the TPM or is useless without it: some TPMs hold one internal key that can never be read out, and will generate a new key and return it encrypted with that internal key, so you get the protection without needing storage on the chip. Because of that, an attacker needs to remain undetected on the server for as long as they want to use the keys, which is difficult for anyone less sophisticated than an advanced persistent threat.
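
      A minimal sketch of that wrap-and-return idea, assuming a hypothetical TPMLike class rather than any real TPM API (real chips do this in hardware via the TPM2 command set):

        # Conceptual sketch only: TPMLike is a stand-in for the chip. The point is
        # that the wrapped key you store is useless without the internal key,
        # which never leaves the "chip".
        import os
        from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

        class TPMLike:
            def __init__(self):
                # Stand-in for the chip's internal key; a real TPM never exposes this.
                self._internal_key = os.urandom(32)

            def create_wrapped_key(self) -> bytes:
                # Generate a fresh key and return only the wrapped (encrypted) form,
                # so it can be stored anywhere without being usable on its own.
                new_key = os.urandom(32)
                return aes_key_wrap(self._internal_key, new_key)

            def unwrap_for_use(self, wrapped: bytes) -> bytes:
                # Only the "chip" can unwrap; the plaintext key exists only inside
                # this boundary while an operation runs.
                return aes_key_unwrap(self._internal_key, wrapped)

        tpm = TPMLike()
        wrapped = tpm.create_wrapped_key()   # safe to persist on disk
        key = tpm.unwrap_for_use(wrapped)    # only works with this particular chip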

      The Apple system, to its credit, does a degree of user and application validation before the keys can be used. That’s generally good for security, but it means that if you want to share a key between users, you probably won’t be using the secure enclave.

      Most of the trust checks end up being the TPM proving itself to the remote service that’s doing the checking. For example, when you use your phone’s biometrics to log into a website, part of that handshake is the TPM on the phone proving that it was made by a vendor to a spec the standards body has validated as secure in the way it claims.
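
      Roughly, the challenge/response half of that handshake looks like the sketch below; it is a simplified model, and real WebAuthn/FIDO2 attestation layers certificate chains and authenticator metadata checks on top of it.

        # Simplified challenge/response sketch; assumes the server already trusts the
        # device's public key (registered earlier, when attestation proved the private
        # key lives in a certified chip).
        import os
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import ec

        # Device side: the private key would live in the TPM/secure element and never
        # leave it; here it is just an in-memory key for illustration.
        device_key = ec.generate_private_key(ec.SECP256R1())
        device_pub = device_key.public_key()

        # Server side: a fresh random challenge so old signatures cannot be replayed.
        challenge = os.urandom(32)

        # Device side: after the biometric unlock, the chip signs the challenge.
        signature = device_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

        # Server side: verify against the public key on record; raises on mismatch.
        try:
            device_pub.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
            print("login accepted")
        except InvalidSignature:
            print("login rejected")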