Change Proposal

In short: each fwupd user downloads only a small amount of metadata, but in aggregate that adds up to far too much traffic over the internet. This is the beginning of something important, and the technology can be used for local updates and a lot more.

A solution for local distribution is needed: IPFS is too slow, and BitTorrent traffic is immediately suspicious on many networks.

Passim is a new protocol for this purpose. Users can opt out, and it is secure: the metadata is hashed, and the hashes are still downloaded over the internet so the locally shared copy can be verified.
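To make that concrete, here is a minimal Python sketch of the idea: fetch a small hash over the internet, fetch the large payload from a nearby machine, and only accept it if the hash matches. The URLs, port, and file names are assumptions for illustration, not Passim’s actual endpoints or API.

```python
# Sketch of the "hash over the internet, payload from the LAN" split.
# URLs, port, and file names are assumptions for illustration,
# not Passim's real endpoints.
import hashlib
import urllib.request


def fetch(url: str) -> bytes:
    """Download a URL and return the raw bytes."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()


def get_metadata() -> bytes:
    # Small download over the internet: only the expected SHA-256 digest.
    expected = fetch("https://cdn.example.org/firmware.xml.xz.sha256").decode().strip()

    # Large download stays local: fetch the blob from a nearby peer.
    blob = fetch("http://192.168.1.50:27500/firmware.xml.xz")

    # The peer is untrusted, so only accept the blob if it matches the
    # digest published by the server we already trust.
    if hashlib.sha256(blob).hexdigest() != expected:
        raise ValueError("local copy does not match the published hash")
    return blob
```

The trust anchor (the hash) always comes from the original server, so a malicious LAN peer can at worst serve something that gets rejected.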

    • boredsquirrel@slrpnk.netOP

      It’s about deduplicating sent data.

      Currently all clients need to contact servers somewhere on the internet. Instead, they fetch only the hash over the internet, and the metadata (and hopefully more relevant data like system updates later) can be sent peer-to-peer.
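      To make “sent peer-to-peer” concrete: a LAN cache like this needs some way for clients to find nearby copies without central coordination, typically mDNS/DNS-SD. Here is a hypothetical discovery sketch using the third-party zeroconf package; the service type is an assumption for illustration, not necessarily what Passim actually advertises.

      ```python
      # Hypothetical LAN peer discovery via mDNS/DNS-SD, using the
      # third-party `zeroconf` package. The service type is an assumption.
      import socket
      from zeroconf import ServiceBrowser, ServiceListener, Zeroconf

      SERVICE_TYPE = "_passim._tcp.local."  # assumed service name


      class CacheListener(ServiceListener):
          def add_service(self, zc: Zeroconf, type_: str, name: str) -> None:
              info = zc.get_service_info(type_, name)
              if info and info.addresses:
                  addr = socket.inet_ntoa(info.addresses[0])
                  print(f"found local cache {name} at {addr}:{info.port}")

          def remove_service(self, zc: Zeroconf, type_: str, name: str) -> None:
              pass

          def update_service(self, zc: Zeroconf, type_: str, name: str) -> None:
              pass


      zc = Zeroconf()
      browser = ServiceBrowser(zc, SERVICE_TYPE, CacheListener())
      input("Browsing for LAN caches, press Enter to stop...\n")
      zc.close()
      ```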

      • jwt@programming.dev

        They still need to look for, contact, and download from peers though. I can see how it can save money on CDN costs.

        But with claims of climate friendliness, I would at least expect some energy-consumption metrics to back that up (from all participants in the network).

        • boredsquirrel@slrpnk.netOP

          They contact the servers only to get a hash, which is small and always stays small.

          This is scalable, so it could also be used for more important things.

          Sending traffic over the LAN is much faster and skips a lot of steps; you don’t need statistics for that, it’s obvious.

          But I wonder if this also works for people nearby, as LAN-only is of limited use for many.

          But that’s how Windows has done it forever too, only on the LAN.

          • jwt@programming.dev

            > Sending traffic over the LAN is much faster and skips a lot of steps; you don’t need statistics for that, it’s obvious.

            That’s an overly simplistic way of looking at it, and it says nothing about the energy efficiency of the system as a whole. Besides, you still need the CDN server running 24/7 to serve hashes and firmware that isn’t available on the p2p network (just think how much less power-efficient it is to first crawl the p2p network, conclude the firmware isn’t available on it, and only then contact the CDN and download the firmware the ‘old school’ way).
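            For concreteness, that fallback path could look something like the sketch below (peer port and URL layout are assumptions for illustration, not Passim’s actual API): ask the peers that were discovered first, and only contact the CDN if nobody nearby has a verified copy.

            ```python
            # Sketch of the fallback path: ask discovered LAN peers first,
            # fall back to the CDN if none has a verified copy.
            # Peer port and URL layout are assumptions for illustration.
            import hashlib
            import urllib.request


            def fetch(url: str) -> bytes:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    return resp.read()


            def download(sha256: str, lan_peers: list[str], cdn_url: str) -> bytes:
                for peer in lan_peers:
                    try:
                        blob = fetch(f"http://{peer}:27500/{sha256}")  # assumed layout
                    except OSError:
                        continue  # peer vanished or does not have the file
                    if hashlib.sha256(blob).hexdigest() == sha256:
                        return blob  # served locally, no CDN payload traffic
                # No usable peer: the CDN still serves the file the old-school way.
                return fetch(cdn_url)
            ```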

            Don’t get me wrong, it’s a cool new feature and a great way to become less dependent on CDNs and save money. But I’m just not buying the energy-saving argument.

            • boredsquirrel@slrpnk.netOP
              1. The energy consumption shifts from big providers to individual company networks etc. And assuming the technology uses roughly similar energy per transfer, not needing all the relay servers saves power.
              2. A running CDN server ≠ a server actively doing work and sending data.
              3. If p2p says there is no update, there is no update. Unless I missed it and he mentioned something like that as a fallback, ALL the time. There are two connections, but the data is sent locally when possible.

              There is lots of tech innovation: ARM, low TDP, etc. But I stick with the assumption that local data traffic saves more energy.

              • jwt@programming.dev

                Number 2 is exactly where my hesitancy lies. Is a CDN that is still chugging along (just no longer serving the select user group that has Passim enabled and actually finds the firmware on peers) saving enough energy to cancel out a whole p2p network? I don’t think so (and again, I’d need some metrics before I will believe it; you can’t just wave that away with ‘local == fast & fewer steps == obvious, no statistics needed’).

                As for number 3: p2p can only tell you whether there are peers. If there are no peers, there can still be an update (what about the first person to download the firmware, for example). It would be a security risk for the system not to give you updates when there are no peers, so I highly doubt that’s the case.