Well I didn’t say they were perfect.
While the 9000 series looks decent, I honestly think Intel has a really interesting platform to build off of with the Core Ultra chips. It feels like Intel course-correcting after the poor decisions made with the 13th and 14th gen chips. Wendell from Level1Techs made a really good video covering the good things Intel put into these chips while also highlighting some of the bad: things like the built-in NPU and how they plan to use ML to pull in profiles for applications and games, or the fact that performance varies between chipset makers more often with Core Ultra. It’s basically a step forward in tech but a step backward in price/performance.
Not sure you know what swap is… I looked at my M1 after a night of gaming on GeForce Now and filling out forms in Google Chrome. My swap was at 0, my used RAM was at 4GB out of the 8GB, and the machine didn’t slow down at all. I’m sorry you have had a terrible experience with your Mac; I love my Mac mini and will enjoy it as a really cool piece of tech.
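If anyone wants to check this on their own machine, here’s a minimal sketch (assumes macOS, where sysctl exposes the vm.swapusage key; Activity Monitor shows the same numbers):

```python
import subprocess

# Ask macOS for its swap usage; sysctl's vm.swapusage key reports
# total / used / free swap in a single line.
result = subprocess.run(
    ["sysctl", "vm.swapusage"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
# e.g. vm.swapusage: total = 2048.00M  used = 0.00M  free = 2048.00M  (encrypted)
```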
This is some real doom and gloom shit right here. You really think it’s a single-vendor market? I guess if you want to buy a 4090 then yes it is, but everything else is a three-horse race. This doesn’t mean AMD is saying “buy consoles”; all it means is they’re focusing on the midrange market, which I’m 100% confident Nvidia will completely ditch sometime in the future, telling most customers to use GeForce Now if they want midrange prices. To be completely real here, Nvidia only has ray tracing holding them up right now; as soon as the competition catches up, they won’t have anything left to gouge on and will be kicking themselves for not innovating any further.
This is how it’s always been for the past 30 years: Nvidia makes a good card and prices it high across 4 or 5 generational updates, Radeon makes a good price/performance midrange card that undercuts Nvidia, and everyone wins. The only difference now is that Intel has created a very compelling product with their GPUs, and I’m pretty confident Battlemage will be a big improvement over the current Arc cards and give AMD a run for their money.
Intel is learning from AMD and playing the long game with their hardware: the latest Core Ultra CPUs already run great on Linux versus Windows and will hopefully get better over time, and the Battlemage cards will hopefully have day-one support on Linux and good support on Windows. You simply have to change your expectations here; the market is shifting. Nvidia makes money hand over fist with AI/ML chips and GFN, far more than with their consumer graphics cards, so they don’t have to price their cards cheap to compete anymore; they can price them however they want because the server market offsets the cost. I personally find the midrange market way more compelling these days than overpriced high-end Nvidia chips; maybe you should rethink your position as well.
macOS, no matter what anyone says, has extremely efficient memory management. It’s seriously impressive how efficient that OS truly is, and it’s no surprise they stuck with 8GB for so long. The thing these clickbait articles don’t really bring to light is that the 16GB bump is really for Apple Intelligence. If that weren’t a thing, these Macs would have stuck with 8GB.
Still running an M1 Mac mini right now, and it’s a damn good machine; the performance gains on the M-series chips over the years haven’t really forced me to upgrade yet. As for gaming, I just use GeForce Now to play my Steam library, and it’s awesome, a really great combo. The 8GB of RAM is lacking, but I’m using GFN and not pushing it too hard, so I don’t notice any meaningful performance problems. I’m also not editing photos or videos, so that probably helps.
Thank fuck I use Linux as a daily driver. I won’t touch the AI-infested bullshit that Windows and Google are becoming. This is just an IT security nightmare.
They did: AMD holds the x86-64 (AMD64) license, Intel holds the original x86, and the two cross-license each other.
We’ve reached the power limits of what AI and LLMs are capable of; that’s why Google, Microsoft, and Amazon are investing in nuclear power and funding projects like reopening Three Mile Island. They need a good clean source of energy to fuel the data centers running Copilot and Gemini. The thing is, they don’t want us to know they’re at their limits right now, because once they admit that, the AI bubble will burst and the investment money will dry up. That’s where we are right now: humanity has created something that requires so much energy to run that nuclear fuel is the only option to keep up with power demands. At least it’s clean and efficient energy.
If there was ever a time for Valve to push advertising for the Steam Deck and SteamOS, it’s now. The final piece of the gaming puzzle is anticheat. If Valve gets the proprietary anticheat makers on board, then it’s all over; every major hurdle would’ve been overcome. But games like Valorant and Call of Duty still don’t work because of Vanguard and Ricochet.
With how terrible Windows handhelds are, imagine how awesome it would be for those CoD players to play a round of Warzone on the toilet. I joke, but seriously, that’s the demographic that needs to adopt a platform like the Steam Deck. That’s the barrier Valve has to overcome, and I’m worried they just don’t care, or that something even more legally gray is happening, like Microsoft giving game devs incentives to use proprietary anticheat or to just not flip that EAC flag in their code.
AI peaked a while ago IMO; the nail in the coffin for me was Microsoft making deals for nuclear power plants to power their data centers for ML and AI. It’s great they’re using nuclear power, since it’s at least a clean source of energy, but it’s also extremely telling about the limitations and power requirements of these language models. Without some kind of power-reduction breakthrough, AI will continue to stall while these companies think of new ways to sell snake oil and gimmicks.
It’s irresponsible because making it sound like true AI, when it’s not, is going to make it difficult to pull the plug when things go wrong; you’ll get debates over whether it’s sentient and whether it’s humane to kill it like a pet or a criminal. It’s more akin to using rainbow tables to help crack passwords and then claiming your software is super powerful, when in reality it’s nothing without the tables. (A very rudimentary example that’s not supposed to be taken at face value.)
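To make that rainbow-table point concrete, here’s a toy Python sketch (just a plain precomputed lookup table; real rainbow tables chain reduction functions to save space, and the wordlist here is made up for illustration):

```python
import hashlib

# Precompute the "table" once: hash -> plaintext for a made-up wordlist.
# Building the table is the expensive part; the "cracker" is trivial.
wordlist = ["password", "letmein", "hunter2", "qwerty"]
table = {hashlib.sha256(w.encode()).hexdigest(): w for w in wordlist}

def crack(digest):
    # The supposedly "super powerful" step is just a dictionary lookup.
    return table.get(digest)

stolen_hash = hashlib.sha256(b"hunter2").hexdigest()
print(crack(stolen_hash))  # -> hunter2
```

The software looks smart, but strip away the precomputed table and it can do nothing, which is the analogy to LLMs and their training data.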
It’s dangerous because talking about AI like it’s a reasoning, thinking thing is just not true, and we’re already seeing the big AI overlords try to justify training on copyrighted material; we’ll soon see those companies claim it’s no different than a child looking something up on Google. It’s irresponsible because it screws over creative people and copyright holders who genuinely made a product, a piece of art, a book, whatever, in their own free time, and now it’s been ripped away and used to create something that will eventually push those same copyright holders out.
The AI market is moving faster than the world is capable of keeping up with, and that is a dangerous precedent to set for the future of this market. And for the record, I don’t think we’re dealing with early generations of Skynet or anything like that; we’re dealing with tools that have the capability to cause economic collapse on a scale we’ve never seen, and if we don’t lay the ground rules now, then we will be in trouble.
Edit: A great example of this is https://v0.dev/chat. It has the potential to put frontend developers out of work. It’s simple now, but give it time; it could create frontends that rival the best UX designs if the prompt is right.
OpenAI doesn’t want you to know that though; they want their work to show progress so they get more investor money. It’s pretty fucking disgusting and dangerous to call this tech any form of artificial intelligence. The anthropomorphic naming conventions that make this tech sound human are also dangerous and irresponsible.
Does it improve the bandwidth so higher-quality codecs can be used, without having to choose between good sound with a shitty mic and shitty sound with a good mic? I mean seriously, it’s 2024 and we still can’t have quality parity with a wired headset when using Bluetooth, because the bandwidth sucks so much ass that better codecs just can’t be used. Bluetooth can die in a fucking fire.
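For anyone who hasn’t hit this: the quality cliff is the profile switch, not just the codec. Rough sketch below with commonly cited bitrates (approximate figures, for illustration only):

```python
# Approximate Bluetooth audio bitrates (rough public figures, not exact).
# A2DP profiles are listen-only; turning the mic on forces the HFP/HSP
# headset profile and its low-bitrate voice codecs.
codecs_kbps = {
    "SBC  (A2DP, listen-only)": 345,  # high-quality bitpool setting
    "AAC  (A2DP, listen-only)": 256,
    "CVSD (HFP, mic active)":    64,  # 8 kHz narrowband voice
    "mSBC (HFP, mic active)":    64,  # 16 kHz wideband voice
}

for name, kbps in codecs_kbps.items():
    print(f"{name:26s} ~{kbps} kbps")
# The moment the mic goes live, ~256-345 kbps stereo music drops to a
# ~64 kbps voice channel, which is the cliff you hear.
```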
That’s odd, I installed the Gemini app on my 6a like a week ago. Unless you’re specifically talking about GrapheneOS.
So are they going to reassess the capability of kernel-level drivers like CrowdStrike and anticheat solutions like Vanguard? Because if they keep this capability open, they’re just asking for another fuck-up.
For those who want to escape this bullshit, Linux welcomes you with open arms and gives you control of your PC. Microsoft doesn’t respect you; ditch them and move to something that will.
I also would love to know why cachy
Linux Mint exists; switch and never look back. They just released version 22, and it’s probably the best version of Mint I’ve ever used. Switch to Mint and use Flatpaks instead.
At least Xbox made some sense: it was originally going to be called the DirectXbox, and thankfully they shortened the name to something catchier.