5 Posts · 218 Comments · Joined 2 years ago · Cake day: July 1st, 2023
  • Moving/copying/reading/deleting tonnes of tiny files isn’t significantly faster on an ssd because the requirements for doing so are not limited by HDDs in the first place.

    You mean the physical actuator moving a read/write head over a spinning platter? Which limits its traversal speed over its physical media? Which severely hampers its ability to read data from random locations?

    You mean that kind of limitation? The kind of limitation that is a core part of how a hard drive works?

    That?

    I would highly recommend that you learn what a hard drive is before you start commenting about its performance characteristics. 🤦🤦🤦


    For everyone else in the thread, remember that arguing with an idiot is always a losing battle because they will drag you down to their level and win with experience.


  • This is like asking for a source for common sense statements.

    HDDs are pretty terrible at random I/O, which is what reading many small files tends to be. This is because they have a literal mechanical arm with a tiny magnet on the end that needs to move around to read sectors on a spinning platter. The physical limits on how quickly that read/write head can traverse the platter constrain its random I/O capabilities.

    This makes hard drives abysmal at random I/O, and it's why defragmenting is a thing.

    This is common knowledge for anyone in IT, and easy knowledge to obtain by reading a Wikipedia page.

    SSDs are great at random I/O. They have no physical components that need to move in order to read from random locations, so they perform roughly equally well reading from any location. Meaning their random I/O capabilities are significantly better.
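
    To make the point concrete, here is a rough, hedged sketch (not from the thread, and not a rigorous benchmark): it writes one large file and the same amount of data as thousands of tiny files, then times reading each back. On an HDD the many-small-files case is dominated by seeks; on an SSD the gap shrinks dramatically.

    ```python
    # Rough sketch only: file names, sizes, and counts are arbitrary, and the OS
    # page cache will hide the HDD penalty unless you drop caches (or reboot)
    # between the write and read phases.
    import os, time, random

    os.makedirs("bench", exist_ok=True)
    payload = os.urandom(4096)          # 4 KB per write
    count = 25_000                      # ~100 MB total

    # one large file, written sequentially
    with open("bench/big.bin", "wb") as f:
        for _ in range(count):
            f.write(payload)

    # the same data spread across many tiny files
    for i in range(count):
        with open(f"bench/small_{i}.bin", "wb") as f:
            f.write(payload)

    def timed(label, fn):
        start = time.perf_counter()
        fn()
        print(f"{label}: {time.perf_counter() - start:.2f}s")

    timed("1 x ~100 MB sequential read",
          lambda: open("bench/big.bin", "rb").read())

    names = [f"bench/small_{i}.bin" for i in range(count)]
    random.shuffle(names)               # defeat readahead / allocation order
    timed(f"{count} x 4 KB reads in random order",
          lambda: [open(n, "rb").read() for n in names])
    ```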



  • douglasg14b@lemmy.world to Selfhosted@lemmy.world · Jellyfin over the internet

    These are all holes in the Swiss cheese model.

    Just because you and I cannot immediately think of ways to exploit these vulnerabilities doesn't mean they don't exist or are not already in use (including other endpoints or vulnerabilities not listed here).


    This is one of the biggest mindset gaps in technology, and it tends to result in an internet full of exploitable services and devices, which are more often than not used as proxies for crime or malicious traffic rather than being exploited directly.

    Meaning that unless you have incredibly robust network traffic analysis, you won’t notice a thing.

    There are so many sonarr and similar instances out there with minor vulnerabilities being exploited in the wild because of the same "Well, what can someone do with these vulnerabilities anyways?" mindset. Turns out all it takes is a common deployment misconfiguration in several seedbox providers to turn one into an RCE, which wouldn't have been possible if the vulnerability had been patched.

    Which is just holes in the Swiss cheese model lining up. Something as simple as allowing an admin user access to their own password when they are logged in enables an entirely separate class of attacks. Excused because "If they're already logged in, they know the password". Well, not if there's another vulnerability in authentication…

    See how that works?
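
    As a hedged illustration of that last point (nothing here is taken from Jellyfin, sonarr, or any real project; the store and function names are made up): a credential-changing action that demands re-authentication stays safe even if some other bug hands an attacker a valid session.

    ```python
    # Toy sketch of defense in depth around credentials.
    import hashlib, hmac, secrets

    _users = {}  # user -> (salt, password hash)

    def set_password(user, password):
        salt = secrets.token_bytes(16)
        _users[user] = (salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000))

    def verify_password(user, password):
        salt, digest = _users[user]
        return hmac.compare_digest(digest, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000))

    def rotate_api_key(user, session_valid, current_password):
        # A valid session alone is NOT enough for credential-changing actions.
        # If another hole in the cheese hands an attacker a session, they still
        # hit this re-authentication check instead of getting a full takeover.
        if not session_valid or not verify_password(user, current_password):
            raise PermissionError("re-authentication required")
        return secrets.token_urlsafe(32)

    set_password("admin", "hunter2")
    print(rotate_api_key("admin", session_valid=True, current_password="hunter2"))
    ```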





  • You can’t really host your own AWS. You can self-host various amalgamations of services that imitate some of AWS’s features, but you can’t self-host your own AWS by any stretch of the imagination.

    And if you’re thinking of something like LocalStack, that’s not what it’s for, and it has huge gaps that make it unfit for live deployment (it is, after all, meant for test and local environments).
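
    For context, what LocalStack is actually for looks roughly like this (a sketch assuming LocalStack is running on its default edge port; the bucket name and dummy credentials are placeholders): pointing an AWS SDK at the local emulator in tests, not serving production traffic.

    ```python
    # Test-only sketch: boto3 pointed at a local LocalStack emulator, not at AWS.
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:4566",  # LocalStack's default edge port
        aws_access_key_id="test",              # dummy credentials
        aws_secret_access_key="test",
        region_name="us-east-1",
    )

    s3.create_bucket(Bucket="scratch-bucket")
    s3.put_object(Bucket="scratch-bucket", Key="hello.txt", Body=b"hi")
    print(s3.get_object(Bucket="scratch-bucket", Key="hello.txt")["Body"].read())
    ```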









  • The sad part is that you’re right.

    And the reason that it’s sad is that most of the individual developers on proprietary projects care deeply about the project itself and have the same goals as they do with open source software, which is just to make something that’s useful and do cool shit.

    Yet the business itself can force them not to take care of problems, or force them to go in directions that are counter to their core motivations.





  • Oh, you get the benefit of explicit scanning?

    We get the beauty of every file that’s modified being scanned before the write “completes”. It’s an absolute joy starting a build and watching ~80% of the available compute be consumed by antivirus software.

    Or, you know, normal filesystem caching as part of your tool’s workflow.

    Or installing and unpacking dependencies…

    Or anything, really, that touches a lot of files.