I highly recommend disabling JavaScript by default in your browser, then whitelisting the sites you use frequently that actually need it to function.

The privacy benefit: most articles and new websites you visit don’t need JavaScript to render, so disabling it stops a lot of ads and tracking scripts from ever loading.

The security benefit here is massive. First, if you visit a bad link carrying malware that depends on JavaScript, it simply won’t run. Second, if you open a link for a service you use and JavaScript unexpectedly fails there, you can tell in real time that it’s a fake page and not the real website you intended to visit.

Bonus tip: for the sites you need that can’t work without JavaScript, try to replace them with JavaScript-free websites or open source apps.
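As a concrete starting point, uBlock Origin can implement this without a separate extension: tick “Disable JavaScript” in its settings to make blocking the default, then flip the per-site JS switch for trusted sites. Each toggle shows up in the “My rules” pane as a switch rule. A sketch with placeholder hostnames (syntax as I understand current uBlock versions; check the uBlock wiki if it rejects these lines):

```
no-scripting: github.com false
no-scripting: lemmy.world false
```

The `false` value means “do not disable scripting here”, i.e. these two sites are your whitelist; everything else stays JavaScript-free.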

Disclaimer: Stay cautious. This recommendation will improve your privacy and security, but it does not protect you from everything.

  • Emotet@slrpnk.net · 3 months ago

    15-20 years ago, I’d have agreed with you. But apart from a select few news sites and exceedingly rare static sites, what percentage of websites most users use day to day actually function even minimally without JavaScript?

    I’m convinced that in practice, most users would be conditioned to whitelist pretty much every site they visit due to all the breakage. Still a privacy and security improvement, but a massive one? I’m not sure.

    Very happy to be convinced otherwise.

    • montar@lemmy.ml · 3 months ago

      Tried it and can confirm: almost every webpage, even static ones that could be dead simple, needs a truckload of bloated JS loaded from external servers.

    • smeeps@lemmy.mtate.me.uk · 3 months ago

      Yep, software dev here for a static marketing site for a product. We are in a constant battle with PMs and SEO who want Google tracking, Facebook, TikTok, A/B testing, cursor tracking, etc. We’re trying to keep page-speeds fast while loading megabytes of external JS code…

      Luckily all that can be blocked by the end user without affecting the site though, all you’d lose is some carousels and accordions etc that are done with a few lines of code.

    • AnAmericanPotato@programming.dev · 3 months ago

      It’s incredibly annoying, but it gets easier over time as you fill out your whitelist.

      One of the big advantages to something like NoScript is that it lets you enable scripts only from certain domains. So you can enable the functionally-required scripts while still blocking other scripts.

      But yes, it’s a giant pain in the ass. It’s absurd that the web has devolved into such a state.

    • ModerateImprovement@sh.itjust.works (OP) · 3 months ago

      I’ve been doing this for a long time and I’m super comfortable with it, as most of the websites I come across don’t need JavaScript to function.

      Even for the websites I need that do require JavaScript, I try to replace them with open source apps and clients.

      If you want, you can list the websites you use that require JavaScript and I’ll suggest alternatives.

      Happy to help you.

        • Emotet@slrpnk.net · 3 months ago

        It’s great that it works for you and that you strive to spread your knowledge. Personally, I’m quite happy with my DNS filtering/uBlock Origin and restrictive browser approach and already employ alternatives where feasible in my custom use case.

        Thanks for your offer, though!

    • s38b35M5@lemmy.world · 3 months ago

      I agree that most websites don’t load without JavaScript, but you don’t need seven or more different domains with JavaScript allowed for the main site to work. Most sites load their own scripts, plus six Google domains (including Tag Manager), Facebook, etc. I whitelist the site’s own domain and leave the analytics and tracking domains off.

  • refalo@programming.dev · 3 months ago

    Here’s a counter-argument to yours… disabling JavaScript can actually make you stand out like a glowing sun. Just like how ad-blockers can be used for fingerprinting, the fact that you’re not loading any JS, or any resources it would have fetched, can greatly increase your fingerprint. By combining TLS fingerprinting, HTTP headers, and HTML/CSS tricks, you can still be singled out pretty well without any JS. Having JS disabled automatically puts you in a very small list of people, so fewer data points are needed for an accurate fingerprint.
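    A rough way to quantify “a very small list of people”: an attribute shared by a fraction p of users contributes log2(1/p) bits toward a unique fingerprint, and around 33 bits suffice to single out one person among everyone online. The prevalence figures below are made-up placeholders, just to show the asymmetry:

```python
import math

def surprisal_bits(fraction: float) -> float:
    """Identifying bits contributed by an attribute shared by `fraction` of users."""
    return math.log2(1 / fraction)

# Hypothetical prevalences, for illustration only:
js_disabled = surprisal_bits(0.005)  # if ~0.5% of users browse without JS: ~7.6 bits
js_enabled = surprisal_bits(0.995)   # the common case reveals almost nothing: ~0.007 bits
```

    So the rarer the configuration, the more of the fingerprint it fills in by itself.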

    • moonpiedumplings@programming.dev · 3 months ago

      Disabling javascript increases security, and offers a little bit of privacy. Those are both separate from anonymity, but people conflate the three often.

      For example, JavaScript can be made to open arbitrary WebSocket or HTTP connections to any IP/hostname your computer has access to, even local networks or localhost.

      I use the browser extension Port Authority to block it.

      And of course this kind of port scanning is actually used in the wild: both eBay and Discord have been observed scanning users’ local ports.

      Disabling javascript prevents websites from tracking exactly what you do on each site, or what local ports you have open. This is definitely an increase in privacy, as it relates to hiding what you’re doing. However, you noted it comes at the cost of anonymity, as you become uniquely identifiable.
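      The localhost-probing trick is easy to sketch outside the browser. This Python version learns the same thing a page’s script learns from fetch()/WebSocket connection timing; the port list is an arbitrary example (6463 is in the range Discord’s desktop client is known to listen on):

```python
import socket

def probe(host: str, port: int, timeout: float = 0.25) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or unresolvable
        return False

# Which of these local services answer? A website can infer the same
# list from inside the page, without asking for any permission.
open_ports = [p for p in (22, 80, 5432, 6463) if probe("127.0.0.1", p)]
```

      The set of locally open ports (a database here, a game launcher there) is itself a fingerprinting signal.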

        • refalo@programming.dev · 3 months ago

        Of course, if you’re not blocking JS entirely but using something like Port Authority, that can potentially be detected and used against you, just like I mentioned. So yeah, it’s a tradeoff you have to decide on based on your own threat model.

  • moreeni@lemm.ee · 3 months ago

    I’ve been doing this for a few years and eventually got tired of whitelisting websites. I’ve gone as far as using NoScript for fine-grained control, but what’s the point? If you need JS for a single feature, or for a single article on a domain, you end up letting everything run once you grant the permissions, so why bother?

    Better to keep JS on and run an up-to-date browser with a custom DNS that filters out known malicious websites. Also, don’t visit random links; that’s actually good advice.

  • The Hobbyist@lemmy.zip · 3 months ago

    You’re suggesting a whitelisting approach, which I used for a long time. But in the end I was so fed up with having to enable JavaScript for nearly every website because it would otherwise be broken, when I only wanted to block it on specific pages, that I switched to a blacklisting approach. I recommend it to keep some sanity, but that’s my opinion :)

  • exu@feditown.com · 3 months ago

    You’d end up whitelisting so many sites that it makes this approach worthless, in my opinion.

    Instead I’ve settled on blocking scripts by default and whitelisting subdomains until the site works. It does require more time and effort, but it’s probably the only way to meaningfully block parts of JavaScript, apart from just not using the website.
    Depending on how exactly you do this, you’ll end up with a huge filter list. My uBlock Origin rules come to 245 kB when exported.
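    For reference, this per-subdomain approach maps onto uBlock Origin’s dynamic filtering rules in the “My rules” pane. A minimal sketch with placeholder hostnames, assuming the current `srcHostname destHostname requestType action` rule syntax:

```
* * 3p-script block
* * 3p-frame block
news.example.com cdn.example.net * noop
news.example.com widgets.example.org * noop
```

    The first two lines block all third-party scripts and frames everywhere (uBlock’s “hard mode”); each `noop` line then re-allows one specific subdomain pair you’ve confirmed the site needs.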

    • Aggravationstation@feddit.uk · 3 months ago

      You do end up whitelisting a lot of sites, but seeing that a site is using Javascript this way lets you weigh up if you feel it’s worth the security risk to enable it or not.

  • shortwavesurfer@lemmy.zip · 3 months ago

    This, combined with a filtering DNS-over-HTTPS or DNS-over-TLS provider such as Control D, is golden.
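    In Firefox, for example, that pairing can be set via a user.js file (or the matching about:config prefs). The resolver URL below is what I understand to be Control D’s free malware-blocking endpoint; treat it as an assumption and verify it against their setup docs:

```js
// DoH-only mode: never fall back to the OS resolver
user_pref("network.trr.mode", 3);
user_pref("network.trr.uri", "https://freedns.controld.com/p1");
```

    Mode 3 means lookups fail rather than silently leaking to your ISP’s plain-DNS resolver.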

    • Peffse@lemmy.world · 3 months ago

      I had to switch to uMatrix after NoScript broke all embedded media. It still seems to block all the 3rd-party scripts, but it allows 1st-party ones by default, so website breakage is less severe.

  • Blizzard@lemmy.zip · 3 months ago

    I prefer to block the bad scripts with uBlock/AdGuard and let the rest run.

  • Mikelius@lemmy.ml · 3 months ago

    The security part is the reason I use NoScript for this. We’ve all mistyped the address of a site we visit, I’m sure. But if I mistype the address of a site I frequent, the typosquatted page won’t be in my whitelist, so JavaScript stays disabled, and that forces me to recheck that I’m on the right site. Granted, it’s only happened once that I didn’t realize I’d typo’d until I saw JS was disabled, but it only takes one time to lose everything…

    Not sure the fingerprinting concerns are a big deal for me either. In most scenarios I’m hopefully flagged as a bot or crawler and left out of some data that would otherwise have been collected. Who knows. I imagine JavaScript itself enables way more fingerprinting anyway.

    • Tangent5280@lemmy.world · 3 months ago

      For critical functionality like banking and stuff it’s also a good idea to put a bookmark to the right site on the toolbar and then only ever access it via bookmarks.