• AlexWIWA@lemmy.ml · 4 months ago

      I’m willing to bet we’ll see something to train language models on the user’s hardware soon enough. Folding at home, but instead of helping science, Google steals your electricity.

      • vvv@programming.dev · 4 months ago

        I really think that’s the secret end game behind all the AI stuff in both Windows and macOS. An MS account is required to use it. (Anyone know if you need to be signed in to an Apple ID for Apple AI?) “On-device” inference that sometimes reaches out to the cloud, when it feels like it. Maybe sometimes the cloud will reach out to you and ask your CPU to help out with training.

        That, and better local content analysis. “No, we aren’t sending everything the microphone picks up to our servers, of course not. Just the transcript that your local STT model made of it; you won’t even notice the bandwidth!”

    • zqwzzle@lemmy.ca · 4 months ago

      The shitty reboot of Office Space where some low-level Google employee realizes they can stick a crypto miner in every browser and generate a couple of cents from each one.

    • Pechente@feddit.org · 4 months ago

      Probably already installed. That would at least explain Chrome’s high resource usage.