• 0 Posts
  • 4 Comments
Joined 1 year ago
Cake day: July 27th, 2023

  • I’d be interested in setting up the highest-quality models to run locally. I don’t have the budget for a GPU with anywhere near enough VRAM, but my main server PC has a 7900X and I could afford to upgrade its RAM. Is it possible, and if so how difficult, to get this stuff running on CPU? Inference speed isn’t a sticking point as long as it’s not unusably slow, but I do have access to an OpenAI subscription, so there just wouldn’t be much point in lower-quality models except as a toy.

  • Box is (basically) just the way to have memory on the heap. Here's a direct comparison of how to do heap memory in C/C++ and in Rust:

    // C/C++: allocate on the heap manually
    int* intOnHeap = (int*)malloc(sizeof(int));
    *intOnHeap = 0;
    MyClass* classOnHeap = new MyClass();

    // Rust: Box allocates on the heap
    let intOnHeap: Box<i32> = Box::new(0);
    let structOnHeap: Box<MyStruct> = Box::new(MyStruct::default());
    

    There can be a bit more to it with custom allocators etc., but that's the only way most people will use boxes. So Box<T> basically just means "a T is allocated somewhere and we need to free that memory when this value is dropped".
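
    If it helps to see the "freed when dropped" part concretely, here's a minimal sketch (this MyStruct and its Drop impl are made up for illustration, not the types from above):

    // A stand-in type with a Drop impl so we can observe when the Box frees it.
    struct MyStruct {
        value: i32,
    }

    impl Drop for MyStruct {
        // Runs right before the heap memory is reclaimed.
        fn drop(&mut self) {
            println!("freeing MyStruct with value {}", self.value);
        }
    }

    fn main() {
        // Heap-allocate a MyStruct; the Box owns the allocation.
        let on_heap: Box<MyStruct> = Box::new(MyStruct { value: 42 });
        println!("value on heap: {}", on_heap.value);
        // on_heap goes out of scope here: drop() runs and the allocation is freed,
        // no manual free()/delete needed.
    }

    Running that prints the "freeing" line at the end of main without any explicit cleanup call, which is the whole point of the Box owning its allocation.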