• eltimablo@kbin.social · 8 months ago

    I did miss that, but again, it’s additional fines on top of an almost guaranteed lawsuit for something that may not even be their fault. If a company got owned by the Heartbleed exploit back when it was first announced, before a fix was even available, should they be responsible for that? What about when they get hit by a vuln that’s been stockpiled for a couple of years and purposely has no fix due to interference from bad actors? There are a lot of situations where fining someone for getting breached doesn’t make sense.

      • eltimablo@kbin.social · 8 months ago

        And I’ll counter with this: no system is perfect, especially when major parts are made by non-employees. Mistakes can and do happen because corporations, regardless of size, are made up of humans, and humans are really good at fucking up.

          • eltimablo@kbin.social · 8 months ago

            Your bridge analogy falls apart because standards already exist (FIPS, among others) that are shockingly insecure despite having been updated relatively recently, and yet we still have breaches. If the standards were effective, places like AmerisourceBergen, the country’s largest pharmaceutical distributor, wouldn’t be supplementing them with additional safeguards. No standard is going to work perfectly, or even particularly well, for everyone. Bridges still fall down.

            EDIT: Alternatively, there would need to be a provision that allows companies to sue the government if they get breached while following their standards, since it was the government that said they were safe.

              • eltimablo@kbin.social · 8 months ago

                When you say “corporations,” it seems like you’re exclusively counting companies like Google, Meta, etc., whereas I’m also including the mom-and-pop, 15-person operations that would be impacted by the same regulations you suggest. Those underdogs are the ones I want to protect, since they’re the only chance the world has at dethroning the incumbents and ensuring that the big guys don’t outlive their usefulness.

                  • eltimablo@kbin.social · 8 months ago

                    See, I figure all of those things would be accounted for in whatever civil suit gets brought against the company. Frankly, I think that’s much more fair to companies both big and small, because it involves a group of people working together to figure out how much of a fine to levy in each individual instance, rather than a blanket policy that may or may not account for edge cases. If the company is huge and the fuckup egregious, then the jury is (theoretically) going to throw the book at them.

                    At the very least, I’d want a jury in between the company and whichever government body is fining them, because regulatory bodies are prime targets for corporate shills to take over, and it’s harder for that to run rampant when you have a bunch of regular jackoffs acting as gatekeepers.

                    There’s also the issue of ongoing compliance for small companies. Cybersecurity engineers are not cheap, and being all but required by law to employ one could (1) drive small companies out of business ($180k a year may be cheap for Facebook, but it’s definitely not for Joe Buttsniffer and Sons Catering), and (2) saturate the market so badly that the average salary drops to the point where nobody wants to do the job anymore.