Internet in 2024 (for me):
The percentage of websites that “just work” with privacy measures in place is depressingly small.
You have to put in extra work just to make your website not work with privacy measures. You have to go out of your way to use some bloated JavaScript framework that doesn't work with NoScript instead of just sticking with plain HTML and CSS, which would work. On top of that, I've encountered way too many big websites that don't even have a noscript tag, so all you see is a ghost layout or a blank page.
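For anyone unfamiliar, the fallback being talked about is just a standard HTML element; a minimal sketch (the page text and the `/basic` link are made up for illustration):

```html
<!-- Rendered only when the browser has JavaScript disabled or blocked -->
<noscript>
  <p>
    The interactive parts of this page need JavaScript.
    A plain-HTML version is available at <a href="/basic">/basic</a>.
  </p>
</noscript>
```

Even a one-line notice like this beats the blank page, since the visitor at least learns why nothing rendered.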
That’s something I would disagree with though. “Sticking with plain HTML and CSS” is way more work, and often has significantly less functionality, than building a website with a framework.
You can build it with a framework, but maybe render it on the server side instead. I've seen many nice sites that hardly use any JavaScript: instead of a bunch of API calls, the server just returns new HTML to render.
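To make the contrast concrete, here's a minimal sketch of the idea using only the Python standard library; the page content and item names are invented for illustration:

```python
from html import escape

def render_items_page(items):
    """Server-side rendering: the server builds the finished HTML,
    instead of shipping JSON for a client-side framework to assemble."""
    rows = "\n".join(f"  <li>{escape(name)}</li>" for name in items)
    return (
        "<!doctype html>\n"
        "<html><body>\n"
        "<ul>\n"
        f"{rows}\n"
        "</ul>\n"
        "</body></html>"
    )

# A JSON API would instead return ["coffee", "tea"] and rely on a
# client-side script to turn it into markup; here the browser receives
# ready-to-display HTML and needs no JavaScript at all.
page = render_items_page(["coffee", "tea"])
```

The `escape()` call matters: because the server is emitting markup directly, user-supplied strings must be HTML-escaped or you've traded a fetch call for an XSS bug.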
I don't mind frameworks, but for features that seem super useful to devs (Google Analytics, various diagnostic/logging tools, social media integrations), I would prefer to opt in only when I decide they are necessary.
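A site can enforce that same opt-in on its own side with a Content-Security-Policy header that only permits first-party scripts; each third party then has to be added deliberately. A sketch, where the analytics host is a placeholder:

```
# Default: only the site's own scripts may run
Content-Security-Policy: script-src 'self'

# After deciding analytics is actually necessary
Content-Security-Policy: script-src 'self' https://analytics.example.com
```

Anything not listed in `script-src` is simply refused by the browser, which is the header-level equivalent of what NoScript/uBO users do on the client.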
Honest question: what is the point of having NoScript on at all times?
Not the person you're asking, and I'm running uMatrix instead of NoScript to block scripts. But I do it to get more granular control over what my browser loads and runs. Why run scripts if a website works perfectly fine without them? These days I ain't trusting shit out there on the web.
TL;DR: I prefer to opt in.
Technically it's uBO, but I use the extreme setting that blocks all scripts by default. Truthfully, I wasn't aware just how many scripts get loaded, especially on e-commerce and social media sites; there are too many heavy frameworks being used. Much of it is unnecessary bloat that slows down my browser, and no small amount of it is devoted to tracking and data collection.
In general, I find that less than half of the scripts a page loads are required to make it functional. It's a process of trial and error, but I have a good set of base rules in place for trusted sites and scripts.
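For the curious, uBO's dynamic filtering rules (the "My rules" pane) look roughly like the sketch below, written from memory; the hostnames are placeholders, not recommendations:

```
* * 3p-script block
* * 3p-frame block
no-scripting: * true
no-scripting: example-shop.com false
example-shop.com * 3p-script noop
```

The first three lines are the default-deny part (block third-party scripts and frames everywhere, disable JavaScript globally); the last two are per-site trust, re-enabling JavaScript and neutralizing the third-party script block only for one site you've vetted.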
For me, it's about not giving websites free rein over my browser, and by extension my computer and personal data, but having some measure of control over them.
And occasionally there are suspicious sites where I truly don’t want any scripts to run. I don’t even have to worry about them.
Are there even some left? Good old text-and-image websites with pure information. Ahh, the good old times.
But why #5? What do you have against HTTPS?
I require HTTPS, but not every website is secure, and sometimes the certificate has a problem or is expired.
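Checking that last point yourself is straightforward; a small sketch using only the Python standard library (the hostname in the usage note is a placeholder, and the date format matches the `notAfter` string that `ssl.getpeercert()` returns):

```python
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(not_after: str) -> datetime:
    """Parse the 'notAfter' field from ssl.getpeercert(),
    e.g. 'Jun  1 12:00:00 2030 GMT'."""
    return datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z").replace(
        tzinfo=timezone.utc
    )

def days_until_expiry(host: str, port: int = 443) -> float:
    """Connect over TLS and report how many days the certificate has left.

    Note: wrap_socket() itself raises ssl.SSLCertVerificationError for
    a certificate that is already expired or otherwise invalid.
    """
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = parse_not_after(cert["notAfter"])
    return (expires - datetime.now(timezone.utc)).total_seconds() / 86400
```

Usage would be something like `days_until_expiry("example.com")`; a negative result (or an `SSLCertVerificationError` during the handshake) is exactly the expired-certificate case mentioned above.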