Someone using Linux Mint would be a good guess as I don’t think they default to Google.
Then you use DuckDuckGo like I do. Not every search engine has gone to complete shit. Google was just an example. Obviously it’s not the current meta in terms of search engines.
Are you honestly telling me there aren’t people asking basic questions that could be solved with a Google search? Don’t get me wrong, the kind of question you are talking about does exist, but that’s not what I am discussing here.
I don’t think you have interpreted that correctly. People tend to reinstall when changing versions, for example from Ubuntu 22.04 to 24.04. That isn’t the same as doing updates.
Honestly if you are that worried about updates breaking stuff, you might be better off using an immutable distro. These work using images and/or snapshots, so it’s easy to roll back if something goes wrong. It’s also just less likely to go wrong, as you aren’t upgrading individual packages as much but rather the base system as a whole. Both Fedora and openSUSE have atomic/immutable variants, with derivatives like Universal Blue providing ready-to-go setups for specific use cases like gaming and workstation use.
Alternatively, the likes of Debian rarely break because of updates, as everything is thoroughly tested before deployment. Gentoo and Void are the same deal but in rolling-release format, so they are at least somewhat up to date while still being quite well tested.
Yeah unfortunately this is a real issue. I also think it’s an issue that experienced users don’t really want to help newbies, especially those who can’t or won’t do research by themselves. Ideally experienced users would be more helpful, but at the same time that isn’t their job. There are many who learned Linux more or less on their own so it’s understandable they don’t want to help given they didn’t use any help when it was their turn. I think now that the community is growing this might start to change a bit, as the newcomers are more likely to have had help and be willing to help others.
I sometimes try to advocate for using Linux, and I don’t mind giving friends advice from time to time. That being said, I don’t want to be stuck answering stupid questions all the time that could have been solved with a Google search or a YouTube video. I have my own stuff to worry about, both technical and otherwise.
That’s why I think teaching new users how to access resources like man pages, gnu info pages, google, and so on is the correct approach to take. It is empowering having the skills to work through your own issues. That being said I also think it’s important for experienced people to give advice on more complex questions.
People see AI and immediately think of ChatGPT. This is despite the fact that AI has been around far longer and does way more things including OCR and data mining. It’s never been AI that’s the problem, but rather certain uses of AI.
Yes, Blink is the engine Chromium uses. Since KHTML was released under a copyleft license, any project derived from it has to stay open source, unless of course it’s just used as a library. Even in that case, though, Blink the engine itself is forced to remain open source even if the browser as a whole isn’t. GPL-family licenses are considered infectious because any derivative work containing that code legally has to be released under the same open source terms. So KHTML being unmaintained is irrelevant.
If I remember correctly it’s under a copyleft license, which makes sense given it’s ultimately a derivative of KHTML.
Yeah, I also use CachyOS on a couple of machines, and one of them also runs Cachy Browser.
Don’t Firefox and Chromium already have that?
I’ve seen teachers use this stuff and get actually decent results. I’ve also seen papers where people use LLMs to hack into a computer, which is a damn sophisticated task. So you are either badly informed or just lying. While LLMs aren’t perfect and aren’t a replacement for humans, they are still very much useful. To believe otherwise is folly and shows your personal bias.
I am not talking about things like ChatGPT that rely more on raw compute and scaling than some other approaches and are hosted at massive data centers. I actually find their approach wasteful as well. I am talking about some of the open weights models that use a fraction of the resources for similar quality of output. According to some industry experts that will be the way forward anyway as purely making models bigger has limits and is hella expensive.
Another thing to bear in mind is that training a model is more resource intensive than using it, though that’s also been worked on.
Bruh you have no idea about the costs. Doubt you have even tried running AI models on your own hardware. There are literally some models that will run on a decent smartphone. Not every LLM is ChatGPT, enormous in size and resource consumption and hidden behind a veil of closed source technology.
Also that trick isn’t going to work just looking at a comment. Lemmy compresses whitespace because it uses Markdown. It only shows the extra lines when replying.
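For anyone wondering what’s actually happening there: Markdown renderers treat any run of blank lines as a single paragraph break, so padding a comment with empty lines does nothing once it’s rendered. A rough sketch of that effect (not Lemmy’s actual code, just an illustration):

```python
import re

def collapse_blank_runs(md: str) -> str:
    # Markdown rendering effectively turns any run of two or more
    # newlines into a single paragraph break, so "spacer" lines vanish.
    return re.sub(r"\n{2,}", "\n\n", md)

comment = "top line" + "\n" * 20 + "text hidden way down here"
# All 20 blank lines shrink to one paragraph break:
print(collapse_blank_runs(comment))
```

The extra whitespace only survives in the raw source, which is why it shows up when you hit reply and see the unrendered text.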
Can I ask you something? What did Machine Learning do to you? Did a robot kill your wife?
Even if it didn’t improve further there are still uses for LLMs we have today. That’s only one kind of AI as well, the kind that makes all the images and videos is completely separate. That has come on a long way too.
How did you end up with that lack of emotional capacity?
From what I heard they do actually put a lot of effort into simulating airplane aerodynamics at least for the smaller planes. So the flying part is kind of important.
I don’t think this is strictly true. They do tweak parts of the kernel, such as the CPU scheduler, to deal with new CPU designs that have special scheduling requirements. That’s actually happened quite a bit recently, with AMD and Intel both offering asymmetric CPUs: big and little cores, different clock speeds, different cache sizes, sometimes even different instruction support on different cores. They also added ReFS not long ago, which may have required some kernel work.
I can understand though if they have few experienced people and way more junior devs. It would probably explain a lot to be honest. A lot of Microsoft stuff is bloated and/or unreliable.
I mean, for one it supports a lot less hardware. Second, it’s significantly less reliable. Third, it has things like Copilot built in. I don’t know how people aren’t criticizing it more, frankly.