• 1 Post
  • 50 Comments
Joined 5 months ago
Cake day: December 9th, 2024

  • I get people’s intentions behind this, ignorant though it is. I think medicated ADHD folks get a little defensive about it too, though. I took Adderall and then Vyvanse for about 15 years total. Now I don’t take anything for it; I meditate and do THC recreationally (which was how I discovered the ADHD in the first place).

    I don’t think medication is bad; I think it helps people live the way they feel they want or must. I realized that I was caught up in the hustle trap, taking meds to optimize my brain for the purpose of being a better capitalist worker.

    I actually really like my default state. I’m extremely flexible and creative, I get a mix of tasks done, and my emotions are well regulated. On Vyvanse I got a lot of work done, but I was also a rage zombie, and I was prone to falling into “productivity mode,” where I could hammer out line after line of code that was all boilerplate, data entry, or other easy work to focus on. It was the kind of task my ADHD brain would have forced me to find an easier (better-designed) way to do if I weren’t medicated into docile compliance.

    So I’m not an advocate for either way: treat your mind the unique way you need to. But I really think the majority of ADHD folks are medicating themselves into acceptance of a broken and diseased system, when our brains have already been adapting to the actual needs of our information-overdense society.




  • I got fired from a programming job because I wrote code for maybe 30 minutes a day, but spent all my other time going from desk to desk helping other devs find problems and get unstuck. It was maybe the most productive I’ve ever been on a job.

    That day, I learned a valuable lesson: do an hour of work a day (or week), then sit at your desk pretending to work while parceling out the stuff you did. Never help anyone. My career has been much more successful since.








  • Further, “Whether another user actually downloaded the content that Meta made available” through torrenting “is irrelevant,” the authors alleged. “Meta ‘reproduced’ the works as soon as it made them available to other peers.”

    A “peer” in BitTorrent is someone else who is downloading the same file as you. This is opposed to a “seeder,” which is also a peer but is only sending data, no longer receiving.

    You don’t have to finish the file to share it, though; that’s a major part of BitTorrent. Each peer shares the parts of the files it has already partially downloaded, so Meta didn’t need to finish and share the whole file to have technically shared some parts of copyrighted works. Unless they just had uploading completely disabled, but they still “reproduced” those works by vectorizing them into an LLM. If Gemini can reproduce a copyrighted work “from memory,” then that still counts.
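    The partial-sharing mechanics can be sketched in a few lines (a toy illustration, not real client code; the `Peer` class and piece counts here are made up):

```python
# Toy sketch of BitTorrent piece-level sharing: a peer can upload any
# piece it has finished, even before the whole file is downloaded.
class Peer:
    def __init__(self, num_pieces):
        self.have = [False] * num_pieces  # bitfield of completed pieces

    def receive_piece(self, index):
        self.have[index] = True

    def pieces_to_send(self, other):
        # Pieces we can upload to `other`: ones we have that they lack.
        return [i for i, mine in enumerate(self.have)
                if mine and not other.have[i]]

downloader = Peer(4)
downloader.receive_piece(0)  # only 25% of the file downloaded...
fresh = Peer(4)
print(downloader.pieces_to_send(fresh))  # → [0]: already uploading
```

    The point is just that any piece you hold is a piece you can upload; completing the whole file never enters into it.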

    Now, to be clear, fuck Meta but also fuck this argument. By the same logic, almost any computer on the internet is guilty of copyright infringement. Proxy servers, VPNs, basically any computer that routed those packets temporarily had (or still has, for caches, logs, etc.) copies of that protected data.

    I don’t think copyrights and open global networks are compatible concepts in the long run. I wonder which the ruling class will destroy first? (Spoilers, how “open” is the internet anymore?)




  • If by “more learning” you mean learning

    ollama run deepseek-r1:7b

    Then yeah, it’s a pretty steep curve!

    If you’re a developer, then you can also search “$MyFavDevEnv use local ai ollama” to find guides on setting up. I’m using the Continue extension for VS Codium (or Code), but there are easy-to-use modules for Vim and Emacs and probably everything else as well.

    The main problem is leveling your expectations. The full Deepseek is a 671b model (that’s billions of parameters), and the model weights (the thing you download when you pull an AI) are 404GB in size. You need roughly that much RAM available to run one of those.

    They make distilled models, though, which are much smaller but still useful. The 14b is 9GB and runs fine with only 16GB of RAM. They obviously aren’t as impressive as the cloud-hosted big versions, though.
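    Those sizes line up with a rough rule of thumb (my own back-of-envelope estimate, nothing official): the usual 4-bit quantized downloads work out to around 5 bits per parameter once overhead is included:

```python
# Rough size estimate for quantized model weights. The 5-bits-per-
# parameter figure is an assumption approximating typical 4-bit
# quantizations (the extra bit covers metadata and mixed-precision layers).
BITS_PER_PARAM = 5

def approx_weights_gb(params_billion: float) -> float:
    """Approximate on-disk / in-RAM size of the weights, in GB."""
    return params_billion * BITS_PER_PARAM / 8

print(round(approx_weights_gb(671)))  # → 419, near the real 404GB
print(round(approx_weights_gb(14)))   # → 9, matching the 14b distill
```

    By the same estimate a 7b model comes in around 4-5GB, which is why it’s the comfortable choice on an 8GB machine.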