• 0 Posts
  • 30 Comments
Joined 1 year ago
Cake day: August 19th, 2023


  • It’s always been bad practice to just blindly update software. That’s why we have different distros.

    Ubuntu and Mint hold your hand and make it easy for newcomers; they’re a great way to dive into Linux. I completely agree they’re ideal for “it just works” use with no fuss. I’ve never had one break on me.

    Arch and Gentoo expect you to have experience and know what you’re doing. You build it up how you want it. That’s what makes these so great. But you need the experience and knowledge.

    I’ve personally tried openSUSE, and in my opinion it’s a good middle ground between the two extremes. In the past I’ve recommended Mint to get started, openSUSE once you’ve gained some experience, and Arch for when you want total control.


  • Arch is not meant to be a daily driver if you expect “shit just works” stability long term while blindly running updates. You have to understand what you’re updating, and sometimes why.

    It is targeted at the proficient GNU/Linux user, or anyone with a do-it-yourself attitude who is willing to read the documentation, and solve their own problems.

    If you want to use Arch, you need to invest in snapshots using rsync or dd. Given that it’s a rolling release, you should do this weekly. If something fucks up, grab all your logs and put them somewhere safe. Roll back, then go through the logs to see what broke, and apply updates as needed. You can ignore (hold back) packages for quite a while. If you can’t make sense of it now, you may be able to in the future; it takes time and practice.
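
    A minimal sketch of what that weekly routine could look like with rsync and pacman (the backup mount point and the ignored package name are placeholders, not anything from a real setup):

    ```bash
    # Weekly full-system snapshot with rsync (run as root).
    # /mnt/backup is a placeholder mount point for your backup drive.
    rsync -aAXH --delete \
        --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"} \
        / "/mnt/backup/snapshot-$(date +%F)/"

    # Hold back a package you know is currently broken while still
    # updating everything else (the package name is a placeholder).
    pacman -Syu --ignore some-broken-package
    ```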

    Debian-based distros are only “out of date” feature-wise because they do a package freeze to ensure stability before release. Updates afterwards are largely security related.


  • Your use of the Platform is licensed, not sold, to you, and you hereby acknowledge that no title or ownership with respect to the Platform or the Games is being transferred or assigned and this Agreement should not be construed as a sale of any rights.

    From the Blizz terms.

    WoW has always revolved around the server handling everything; your client is just a texture/model viewer where you tell the server what to do, and I’ve been fine with this. But I do agree it should say something else on the button. Games that aren’t MMOs shouldn’t be a “license” to play: if you buy one, you should be able to play it whenever and wherever, and features that aren’t multiplayer should work regardless. Some things just shouldn’t be tied to a server. I really despise modern gaming because of this.

    Anecdotal experience: Gran Turismo Sport recently lost its servers. When they went down, the Mileage Exchange shop went with them, meaning all the car cosmetics, and a few unique cars, are now unobtainable for future players. PD could have patched the shop to be a complete list of everything, purchasable with the plethora of points you collect as you race. But no, they didn’t.



  • It’s not odd at all. It’s well known that this is the truth; ask any professional video editor, or search the Internet yourself. Better yet, do a test run with ffmpeg, the software that does the actual encoding and decoding. It’s open source and available for anyone to download.

    Hardware-accelerated processing is faster because it takes shortcuts. It’s handled by dedicated hardware found in GPUs, and the parameters that make it fast are defined at the firmware level of the GPU, outside of your control. This comes at the cost of quality and file size (larger) in exchange for faster processing and lower power consumption. If quality is your concern, you never use a GPU. No matter which one you use (AMD AMF, Intel QSV, or Nvidia NVENC/NVDEC/CUDA), you’re going to end up with a video that appears more blocky or grainy at the same bitrate. These flaws are called “artifacts” and they make videos look bad.

    Software processing uses the CPU entirely, and you have granular control over the entire process. There are preset parameters that apply if you don’t define your own, but every single one of them can be overridden. Because it’s inherently limited by the power of your CPU, it’s slower and consumes more power.
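
    If you want to verify this yourself, a rough test run might look like the following (the input file is a placeholder, and the NVENC line assumes an Nvidia GPU; compare the two outputs at similar file sizes):

    ```bash
    # Software (CPU) encode: slower, fully tunable, better quality per bit.
    ffmpeg -i input.mp4 -c:v libx264 -preset slow -crf 20 -c:a copy cpu_encode.mp4

    # Hardware (GPU) encode via NVENC: much faster, but the fixed-function
    # encoder typically shows more artifacts at a comparable bitrate.
    ffmpeg -i input.mp4 -c:v h264_nvenc -preset p7 -cq 20 -c:a copy gpu_encode.mp4
    ```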

    I could go a lot more in depth, but I’m choosing to stop here because this comment could get absurdly long.


  • Linux uses half the RAM Windows does on a fresh install. 8GB can absolutely work on a Linux system without worry. To help systems with 4-8GB of RAM, Windows compresses memory, which has allowed OEMs to ship systems with 8GB as a minimum. That just isn’t enough for multitasking: the CPU is constantly compressing and decompressing if you’re attempting to multitask, which can make an already cheap laptop feel even more sluggish. 16GB has always been the minimum for gaming systems, and these days it’s becoming apparent that 32GB is needed. 8GB is just pitiful for a computer these days.


    Addressing the OP: mobile devices only needed 2-4GB for the longest time. The OS wasn’t that heavy because the ARM CPU could only do so much. As CPUs improved, higher resolutions were used, prettier animations and more features got added, and all of this needs more RAM. Android’s developer options will tell you how much RAM you’re using. A feature of Android is to keep recently used processes cached in RAM to aid battery life: even if you swipe an app away from the recents list, a portion stays cached so the CPU doesn’t have to work as hard the next time you start it. You can see this under Running services > Cached processes. This means it’s more beneficial for a mobile device to have more RAM.
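
    If you’re curious, you can inspect this yourself over adb with developer options and USB debugging enabled (the package name below is just an example):

    ```bash
    # System-wide RAM breakdown, including cached processes.
    adb shell dumpsys meminfo

    # Per-app breakdown for a single package.
    adb shell dumpsys meminfo com.android.chrome
    ```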



  • You always will. Welcome to the Internet. The difference is whether or not you’ve taken steps to secure your stuff. You need to understand what this malware is looking for: it’s explicitly probing for unsecured services such as WordPress, SQL, etc. There are inexperienced users out there who inadvertently expose themselves. I see this type of probing at work and at home. Don’t stress over it too much. My home server has been running for a decade without issues. Just keep it updated, and read up before you make any changes whose implications you don’t fully understand.

    My home server sits behind a pfSense firewall and runs Arch. Everything is in a non-root Docker container, SELinux is enforcing, and all domains are routed through Cloudflare (some use Cloudflare Zero Trust).
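
    As a rough illustration of the non-root container part, this is the general shape of it (the image name, port, and paths are placeholders, not my actual setup):

    ```bash
    # Run the service as an unprivileged user instead of root, drop all
    # Linux capabilities, and keep the container filesystem read-only.
    docker run -d \
        --name myservice \
        --user 1000:1000 \
        --cap-drop ALL \
        --read-only \
        -p 127.0.0.1:8080:8080 \
        -v /srv/myservice/data:/data \
        myservice-image:latest
    ```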


  • Oh my. You’re doing it wrong. Exposing the unencrypted connection without proper security measures puts you at risk: no matter how strong you set the password, the connection can still be abused in all manner of ways. If you read the Jellyfin documentation, you’d see the developers clearly state you should never do this. You need to put Jellyfin behind server software, specifically a reverse proxy; I use NGINX. You can set up your connection to be secure this way, and you can then also use Cloudflare if you have caching turned off. If you really want to go the extra mile, route it behind a VPN, though that makes access harder for the people you share with and for devices that don’t support a VPN.
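
    For reference, a bare-bones sketch of an NGINX reverse proxy in front of Jellyfin (the domain and certificate paths are placeholders; Jellyfin’s default port is 8096, and the official docs cover the fully hardened version):

    ```nginx
    server {
        listen 443 ssl;
        server_name jellyfin.example.com;  # placeholder domain

        # TLS certificate, e.g. from Let's Encrypt (paths are placeholders).
        ssl_certificate     /etc/letsencrypt/live/jellyfin.example.com/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/jellyfin.example.com/privkey.pem;

        location / {
            # Jellyfin listens on localhost only and is never exposed directly.
            proxy_pass http://127.0.0.1:8096;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;

            # WebSocket upgrade headers; Jellyfin uses WebSockets for live updates.
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
        }
    }
    ```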

    Please revise your connection. If you need help, feel free to reach out.



  • Jellyfin gives you 100% control. You’re responsible for setting up remote access, which actually isn’t that hard: several IT and network admins in the community (myself included) hand out documentation on how to do it without completely ruining your security.

    With Plex, some of the application’s communication is routed through their network. It requires an active internet connection, and you must create an account with them. They have third-party analytics embedded and use tracking pixels, beacons, and device fingerprinting. Whatever personal data you’ve supplied is used to serve ads, namely their promoted content that isn’t part of your library.