I have a few Linux servers at home that I regularly remote into to manage, usually logged into KDE Plasma as root. They mostly just have several command-line windows and a file manager open (I personally find it more convenient to use the command line from a remote desktop rather than SSH-ing into the system directly), but whenever I’ve had an issue, I’ve just been absentmindedly searching things up and trying to find solutions using the preinstalled Firefox instance inside the remote desktop itself, which therefore also runs as root.

I never even thought to install uBlock Origin on it or anything, but the servers are all configured to use a PiHole instance, which blocks the vast majority of ads. However, I do remember using the browser on my main server to figure out how to set up that PiHole instance in the first place, and that server also happens to be the most important one: it’s my main NAS.

I never went to any particularly shady websites, but I also don’t remember exactly which websites I visited as root. I do seem to remember seeing ads during the initial PiHole setup, because it didn’t go very smoothly and I was searching up error messages trying to get it to work.

This is definitely on me, but it never crossed my mind until recently that it might be a bad idea to use a browser as root. Searching online, everyone just repeats the general cybersecurity doctrine of never doing it (which I’m now realizing I shouldn’t have), but nobody seems to discuss how risky it actually is. Shouldn’t Firefox be sandboxing every website and not allowing anything to access the base system? Between “just stop doing it” and “you have to reinstall the OS right now, there’s probably already a virus on there,” how much danger do you suppose I’m in? I’m mainly worried about the security/privacy of the personal data I have stored on the servers. All my servers run Fedora KDE Spin and have Intel processors, if that makes a difference.

  • @taladar · -2 points · 5 months ago

    Then enlighten me: what is the easy way to do tasks that do require some amount of manual oversight? Tasks that can be completely automated are easy, of course, but with our relatively heterogeneous servers, automation along the lines of “do it on this one test system and, if it works there, run it completely automatically on the 100 identical production systems” is not available to us.

    • @ElderWendigo · 0 points · 5 months ago

      Not my circus, not my monkeys. You’re doing things the hard way and now it’s somehow my responsibility to fix your mess? I’m SUPER glad I don’t work with you.

      • @taladar · 0 points · 5 months ago

        You are the one who insists that there is a better way to do things but refuses to say what that better way is.

        • @[email protected] · 3 points · 5 months ago

          None of us can tell you the right approach for your specific system and use case. People are simply pointing out that what you stated you’re doing is insecure and not recommended.

          • @taladar · -4 points · 5 months ago

            And nobody in any of these threads has ever pointed out why it is considered to be insecure. The most probable origin for that idea I have come across so far is that it is a leftover from pre-SSH days, when people thought that using the root password with su at any point other than the start of their connection would make it harder to sniff. Literally nobody lists even one good reason why sudo should be more secure than direct root login with SSH public keys and password login disabled, for full root access (as in, not limited to just one or two commands).
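
            What I mean is roughly this kind of sshd_config (a sketch, not our actual configuration):

                # /etc/ssh/sshd_config (sketch): allow root logins, but only with
                # SSH keys; password authentication is disabled entirely.
                PermitRootLogin prohibit-password
                PasswordAuthentication no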

            • @[email protected] · 4 points · 5 months ago

              It’s not about someone sniffing your passwords, it’s about reducing your attack surface. If you use su, the entire session has root privileges, and any piece of software you run could do system-level damage if it has a bug. Using sudo limits the privilege escalation to just one command.
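
              Rough illustration (generic commands, nothing specific to your servers):

                  # su: from here on, the whole shell session is root, including
                  # anything you absent-mindedly launch from it.
                  su -
                  firefox &          # now a root process

                  # sudo: only this one command runs as root; the rest of the
                  # session stays as your regular user.
                  sudo dnf upgrade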

              • @taladar · -3 points · 5 months ago

                That is only really true if you use sudo with a zero-second password caching timeout.

                • @[email protected] · 3 points · 5 months ago

                  You seem to be looking at the issue in black and white. Any reduction in root access is beneficial. Using sudo with a password cache lasting an hour is still preferable to signing in as root. As many people have said, it’s about minimizing the attack surface.
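
                  And the caching window is just a sudoers setting anyway (edited via visudo; the values here are only examples):

                      # Ask for the password on every single sudo invocation:
                      Defaults timestamp_timeout=0

                      # Or cache it for an hour, as in the example above:
                      Defaults timestamp_timeout=60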

                  • @taladar · 0 points · 5 months ago

                    Any reduction in root access is beneficial.

                    Such as having fewer users who are allowed to use sudo to become root and whose compromise can thus lead to a root compromise?
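
                    On Fedora, for instance, that largely comes down to who is in the wheel group; the stock sudoers rule is typically something like:

                        # Members of the wheel group may run any command as any user.
                        %wheel  ALL=(ALL)       ALL

                    So removing a user from wheel (e.g. gpasswd -d someuser wheel, where someuser is just a placeholder) removes that escalation path for them.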

                • @[email protected] · 2 points · 5 months ago

                  Not true. While you won’t always have to enter your password, not every command will have elevated rights.

                  • @taladar · 0 points · 5 months ago

                    The vast majority of commands you run when debugging actual issues on the system or performing administrative tasks do require root. Of the others, some give you incomplete results when called as a regular user, and 90% of the rest shouldn’t be run on the server in the first place if you can avoid it, but rather directly on your client computer (e.g. looking up documentation).