  • KubeRoot@discuss.tchncs.de to linuxmemes@lemmy.world · IT Department's Plan · 2 months ago

    Does Windows come preinstalled and preconfigured with more potentially vulnerable software listening on open ports?

    I personally don’t value an antivirus that much, since it can only protect you from known threats, and even then, it only matters once you’re already getting compromised - but fair point for Windows: I suspect most distros come without any antivirus preinstalled and preconfigured.

    A firewall, on the other hand, only has value if you already have insecure services listening on externally reachable interfaces - and I’m pretty sure Windows’ default firewall settings aren’t going to block those services anyway (quick sketch of that idea at the end of this comment). All that said though… most Linux distros do ship firewall tooling, whether iptables/nftables or a frontend like firewalld, though I’m not sure which ones preconfigure it to block incoming connections by default.

    So while I would dispute both of those points as not being that notable, I feel like other arguments in favor of Linux still stand, like a reduced attack surface, simpler kernel code, and open, auditable source.

    One big issue with Linux security for consumers (which I have to assume is what you’re talking about, since on the server side a sysadmin will want to configure any antivirus and firewall anyways) could be that different distributions will have different configurations - both for security and for preference-based things like desktop environments. This does unfortunately mean that users could find themselves installing less secure distros without realizing it, choosing them for their looks/usage patterns.
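
    To illustrate the firewall point above, here’s a minimal Python sketch (the port numbers are made up and nothing here is distro-specific): a service bound to 127.0.0.1 isn’t reachable from the network at all, while one bound to 0.0.0.0 is exactly the kind of thing default-deny firewall rules exist to cover.

```python
import socket

# Hypothetical port, purely for illustration.
PORT = 8080

# Bound to loopback only: not reachable from other machines,
# so a host firewall adds nothing for this service.
local_only = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
local_only.bind(("127.0.0.1", PORT))
local_only.listen()

# Bound to all interfaces: reachable from the network,
# which is the case where default-deny firewall rules actually matter.
exposed = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
exposed.bind(("0.0.0.0", PORT + 1))
exposed.listen()

print("listening on 127.0.0.1:%d (loopback) and 0.0.0.0:%d (all interfaces)"
      % (PORT, PORT + 1))

local_only.close()
exposed.close()
```

    Whether a firewall buys you anything depends on which of those two situations your preinstalled services are in.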



  • I think you’re actually agreeing with me here. I was disputing the claim that software should be made available in “a native package format”, and my counterpoint is that devs shouldn’t be packaging things for distros, and should instead provide source code with build instructions, alongside whatever builds they can comfortably provide - primarily Flatpak and AppImage, in my example.

    I don’t use Flatpak myself, and I prefer packages from my distro’s package manager, but I definitely can’t expect every piece of software to be packaged for my distro. Flatpak and AppImage, to my knowledge, are designed to be distro-agnostic and easy for the software developer to distribute, so they’re probably the best options - Flatpak better for long-term use, AppImage handy for quickly trying out software or one-off utilities.

    As for tar.gz, these days software tends to be made available on GitHub and similar platforms, where you can fetch the source from git by commit, and releases also get autogenerated source downloads (quick example of grabbing one of those below). Makefiles/autotools aren’t a reasonable expectation these days, with the plethora of languages and build toolchains out there, but good, clear build instructions are definitely something to include.
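
    To make the “autogenerated source downloads” bit concrete, a small Python sketch - the owner/repo/tag are placeholders, and the only thing assumed is GitHub’s /archive/ URL pattern for tags. It just fetches the release tarball and unpacks it; building is then whatever the project’s own instructions say.

```python
import tarfile
import urllib.request

# Placeholder project - substitute a real owner/repo/tag.
OWNER, REPO, TAG = "example-owner", "example-project", "v1.2.3"

# GitHub autogenerates source archives for tags (and commits) at this path.
url = f"https://github.com/{OWNER}/{REPO}/archive/refs/tags/{TAG}.tar.gz"

archive_path = f"{REPO}-{TAG}.tar.gz"
urllib.request.urlretrieve(url, archive_path)

# Unpack into the current directory; from here the build steps are
# whatever the project documents (make, cargo, meson, ...).
with tarfile.open(archive_path, "r:gz") as tar:
    tar.extractall()
```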



  • Those are fair points - I actually bought the Switch pretty early on after seeing praise for Odyssey and BotW. I play on PC otherwise, but I enjoyed the experience, playing docked with Joy-Cons and motion controls.

    I’m not personally frustrated - while the games definitely seem overpriced, I’ve always felt like Nintendo is just sitting in their niche doing their own thing, not trying to one-up others but instead offering various gimmicks with their devices. They’re selling consoles and games for a certain price, and it feels like if you think the deal is bad or unfair, you can just pass on it.

    I don’t think I really have a point here, just saying my thoughts. I have my issues with Nintendo, but I do feel like their consoles and games provide value that is hard to get elsewhere.




  • And reinstalling the packages, moving over all the configs, setting up the partitions and moving the data over? (Not in this order, of course)

    Cloning a drive would just require you to plug both the old and new drives into the same machine, boot up (probably from a live image to avoid issues), run a command, and wait until it finishes (rough sketch of what that command boils down to at the end of this comment). Then maybe fix up the fstab and reinstall the bootloader, but those are things you’d need to do when installing the system anyway.

    I think the reason you’d want to reinstall is to save time, or to get a clean slate without any past config mistakes you’ve already forgotten about - which I’ve done for exactly that reason, especially since my first install was made with a lot less experience.
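
    For the “running a command” part, the usual tool is something like dd or ddrescue, but the core of a clone is nothing more than a raw block-for-block copy. A rough Python sketch of that idea - the device paths are placeholders, and getting them wrong destroys data, so treat this purely as illustration:

```python
# Illustration of what a whole-drive clone boils down to:
# read the source block device and write every byte to the target.
# /dev/sdX and /dev/sdY are placeholders - double-check with lsblk first.
SRC = "/dev/sdX"   # old drive (read only)
DST = "/dev/sdY"   # new drive (will be overwritten!)
CHUNK = 4 * 1024 * 1024  # copy 4 MiB at a time

with open(SRC, "rb") as src, open(DST, "wb") as dst:
    while True:
        chunk = src.read(CHUNK)
        if not chunk:
            break
        dst.write(chunk)
```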


  • Well, some games that come to mind are Stellaris, RimWorld, Oxygen Not Included, and I think the upcoming Factorio expansion. And from those, I think it might be possible to buy the RimWorld DLC off-Steam and install it into a Steam copy.

    Fun fact: you can check - on SteamDB, you can look at a game’s depots and see whether there’s a separate one for a DLC. If there is, then Steam is downloading extra files for that DLC.

    All that said, I wouldn’t say it’s 100% a developer issue. The way I see the accusation, Valve is very comfortable providing convenient libraries for various things, including working with DLC, that only work on their platform, making it hard to release the game elsewhere in the future.

    I’m generally fine with that for a simple reason - Steam really does have great features that just work. However, if somebody forced Valve to make features like Steam Input available independent of Steam, it could be a great boon for gaming.


  • I think the DLC point is the one valid argument, although nontrivial to implement.

    How do you think DLC works for DRM-free games, like on GOG? The game is just gonna check if you have the DLC installed, without any real DRM (tiny sketch of that at the end of this comment).

    The main issue is that this is entirely possible for games to do right now, but it won’t be integrated with Steam and needs to be done by the developers themselves. I don’t know how feasible it would be for Steam to realistically do something about it, but it’d definitely be nice if you could buy a game on Steam, and later decide you want to buy DLC on another platform and install it onto your Steam copy.
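
    As a rough sketch of that DRM-free approach - the directory and file names here are made up, since every game lays this out its own way - the game simply looks for the DLC’s content on disk and unlocks features if it’s present, with nothing cryptographic involved:

```python
from pathlib import Path

# Hypothetical layout: the DLC ships as an extra content pack dropped
# next to the base game's data, e.g. by a GOG-style installer.
GAME_DIR = Path("game")                              # placeholder install directory
DLC_PACK = GAME_DIR / "dlc" / "expansion_one.pak"    # made-up file name

def dlc_installed() -> bool:
    # No DRM: owning the DLC just means its files are present.
    return DLC_PACK.is_file()

if dlc_installed():
    print("Expansion content enabled")
else:
    print("Base game only")
```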