• 0 Posts
  • 53 Comments
Joined 1 year ago
Cake day: June 21st, 2023


  • They have a secondary motherboard that hosts the Slot CPUs, four single-core Pentium III Xeons. I also have the equivalent Dell model, but it has a bum mainboard.

    With those 90’s systems, to get Windows NT to use more than one CPU, you had to install the Windows version that actually supported multiple processors.

    Now you can simply upgrade from a 1-core to a 32-core CPU and Windows and Linux will pick up the difference and run with it.

    In the NT 3.5 and 4 days, you actually had to either do a full reinstall or swap out several parts of the kernel to get it to work.

    Downgrading took the same effort, since a multiprocessor Windows kernel ran really badly on a single-CPU system.
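On NT 4 the swap could be done by hand: copy the multiprocessor kernel and HAL (ntkrnlmp.exe and halmps.dll) into system32 and point a boot.ini entry at them with the documented /kernel and /hal switches. A sketch, assuming a standard single-disk install; the entry label and partition path are examples:

```ini
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINNT="NT4 (MP kernel)" /kernel=ntkrnlmp.exe /hal=halmps.dll
multi(0)disk(0)rdisk(0)partition(1)\WINNT="NT4 (UP kernel)"
```

Keeping the original uniprocessor entry alongside it gave you a fallback if the MP kernel wouldn't boot.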

    As for the Sun Fires, the two models I mentioned tend to be highly available on eBay in the 100-200 range and are very different inside from an x86 system. You can go for the 400 or higher series to get even more difference, but getting a complete one of those can be a challenge.

    And yes, the software used on some of these older systems was a challenge in itself, but they aren’t really special. They are pretty much like having different vendors’ RGB controller software on your system: a nuisance that you should try to get past.

    For instance, the IBM 5000 series RAID cards were simply LSI cards with an IBM-branded firmware.

    The first thing most people do is put the actual LSI firmware on them so they run decently.


  • Oh, I get it. But a baseline HP ProLiant from that era is just an x86 system barely different from a desktop today, only worse, slower, and more power-hungry in every respect.

    For history and “how things changed”, go for something like a Sun Fire system from the mid 2000’s (the 280R or V240 are relatively easy and cheap to get and are actually different) or a ProLiant from the mid-to-late 90’s (I have a functioning Compaq ProLiant 7000, which is HUGE and a puzzlebox inside).

    x86 computers haven’t changed much at all in the past 20 years; you need to go to the rarer models (like blade systems) to see an actual deviation from the basic PC-like form factor, or unique approaches to storage and performance.

    For self-hosting, just use something more recent that falls within your price class (usually 5-6 years old becomes highly affordable). Even a Pi is going to trounce a system that old, and it actually has a different form factor.






  • Even as far back as XP/Vista, Microsoft has wanted to run the file system as more of an adaptive database than a classical hierarchical file system.

    The leaked beta for Vista had this included (WinFS) and it ran like absolute shit, mostly because hard drives were slow and RAM was at a premium, especially in Vista, as it was such a bloated piece of shit.

    NTFS has since evolved to include more and more of these “smart” file system components.

    Now they want to go full on with this “smart” approach to the filesystem.

    It’ll still be slow and shit, just like it was 2 decades ago.


  • Besides that, mRNA tech started to be developed in the 1970’s, with the first lab-rat trials in the late 80’s or early 90’s.

    Clinical trials on humans, testing safety and effectiveness against various diseases and viruses, have been ongoing for the past decade.

    And as you said, the first several widely used vaccines based on mRNA tech have been deployed to literally billions of people.

    This is a gigantic sample size for data, and there have been very few issues over the past 3 years.

    And what bernieecclestoned brings up about herd immunity simply means the people they are talking to are, like most antivaxxers, blithering idiots who know some catchphrases but not a single meaning behind them.

    You only obtain herd immunity with minimal casualties by hardening the herd with vaccines and then hoping the immune systems of the herd adjust to further combat the disease. If data doesn’t show that new variants are easily countered by the immune systems of the herd, you know you need to develop more vaccines.

    If you try to obtain herd immunity by letting a brand new disease like COVID run its course, you will probably obtain it eventually, but instead of 7 million dead worldwide (and lord knows how many with long COVID or other long-term disabilities due to the disease), you’ll have 70 million or more.

    Herd immunity doesn’t mean you should just let shit hit the fan and see who’s left standing. If you miscalculate the severity of the disease, you can end up with another situation like the Plague of Justinian, which killed over 25 million out of the 180 million people on Earth.

    In today’s numbers that would mean something like 1.1 billion people dying. Probably far more, since we’re far more connected than people were in the 6th century.

    And you’d think that the better general healthcare and hygiene these days would lessen it, but the sheer increase in how we’re connected would easily wipe that advantage off the board.
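The two numbers above can be sanity-checked with some back-of-the-envelope arithmetic. A minimal sketch; the R0 value is an assumed example for illustration, not a measured figure:

```python
# Back-of-the-envelope sketch; R0 = 3 is an assumed example value.

def herd_immunity_threshold(r0: float) -> float:
    """Classic threshold: the fraction that must be immune so each
    infected person passes the disease on to fewer than one other."""
    return 1 - 1 / r0

# Assuming R0 = 3, roughly two thirds of the herd must be immune.
print(f"{herd_immunity_threshold(3.0):.0%}")  # 67%

# Scaling the plague's toll (~25M dead out of ~180M) to ~8 billion today:
mortality_rate = 25e6 / 180e6
print(f"{mortality_rate * 8e9 / 1e9:.1f} billion")  # 1.1 billion
```

The ~14% mortality rate scaled to today's population is where the 1.1 billion figure comes from.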






  • Nature itself is literally making new COVID variants.

    And our immune system detects and fights most of them, similar to how virus scanners can detect viruses they don’t know: by detecting similarities.

    If a new variant comes along that is so different from the OG virus that your immune system doesn’t know what to do with it, they develop a new vaccine, which you “install on the client side” by getting the shot, to protect you from getting sick from it.

    If new methods are developed to cheat, the cheat engine gets updated to detect those too.

    As for “brief dip”, that’s the only thing needed for a product launch.

    If a game is rife with cheating day one, it’ll fail.

    If it only gets rife with cheating when people are already invested in it, the cheating is much lower priority.

    That doesn’t change the fact that on the server side, you’re unable to detect the most prevalent forms of cheating.

    Wallhacks and aimbots are nigh impossible to detect on the server side.



    The problem with the server-only solution is that it can never detect the source of cheating, only the result of it.

    And detecting the result is inaccurate, as perfectly natural network latency and other issues can generate the same result as a cheat. That’s actually how many cheats are discovered and implemented: by noticing that network latency or weird traffic creates an exploitable condition.

    You need to run it on the client side to see if the natural circumstances are happening or someone is using tools to cause the circumstances. The first isn’t cheating; the latter is.

    You can’t detect from the server side what the client side is doing without running anticheat on the client side.
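The result-vs-source problem above can be sketched in a few lines. This is a hypothetical heuristic, not any real anticheat; the per-tick angle limit is pure guesswork:

```python
# Hypothetical server-side heuristic (a sketch, not a real anticheat):
# flag a player whose view angle snaps through an implausibly large arc
# between two ticks. Lag compensation can batch several ticks of movement
# into one update and trigger the same signature, which is exactly the
# false-positive problem described above.

MAX_DEG_PER_TICK = 40.0  # assumed "humanly possible" limit; pure guesswork

def looks_like_aim_snap(yaw_samples: list[float]) -> bool:
    """True if any tick-to-tick yaw change exceeds the assumed limit."""
    deltas = (abs(b - a) for a, b in zip(yaw_samples, yaw_samples[1:]))
    return any(d > MAX_DEG_PER_TICK for d in deltas)

print(looks_like_aim_snap([10.0, 12.0, 95.0]))  # True  (83-degree snap)
print(looks_like_aim_snap([10.0, 20.0, 30.0]))  # False (smooth tracking)
```

The server only ever sees the yaw samples, so it can't tell an aimbot snap from a lag spike that delivered three ticks of legitimate mouse movement at once. A client-side component can observe which of the two actually happened.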





  • Part of the issue is that it also took them 2 months to get any sort of substantial patch out the door.

    The previous ones only fixed like 2-3 game-breaking bugs each, and that was that.

    The most recent one does a tad more, but still nothing to write home about.

    Some people keep parroting that Bethesda has always had an absolutely horrid track record of patching their messes, so you shouldn’t complain about it, but I refuse to give them a pass on that.

    People who keep saying that are pretty much saying “yeah, my dog shits in my bed every single day and I’m not going to do anything about it because he’s always done that.”