Yes…? All are except Microsoft, which is why most companies I work with aren’t looking that way.
I know several large companies looking to Microsoft, Xen, and Proxmox. Though the smart ones are more interested in the open source solutions to avoid future rug-pulls.
The 2009 era was also when Intel leveraged their position in the compiler market to cripple all non-Intel processors. Nearly every benchmarking tool used that compiler, and it put an enormous handicap on AMD processors by locking them to either no SSE or, later, back to SSE2.
My friends all thought I was crazy for buying AMD, but accusations had started circulating about the compiler heavily favoring Intel at least as early as 2005, and they were finally ordered to stop in 2010 by the FTC… Though of course they have been caught cheating in several other ways since.
Everyone has this picture in their heads of AMD being the scrappy underdog and Intel being the professional choice, but Intel hasn’t really worn the crown since the release of Athlon. Except during Bulldozer/Piledriver, but who can blame AMD for trying something crazy after 10 years of frustration?
I host my own to avoid running into timeouts, fairly easy
MRSA infection following hospital admission for pneumonia. That shit is serious and way more prevalent than people think; it’s just that it usually kills people who are already terminally ill.
Unlikely to be an assassination. But not impossible. Either way, looks very bad.
The recommendation to shareholders from Boeing’s independent proxy advisor is to vote out several board members who are responsible for safety and QA. Crazy to see at a Fortune 100.
I use FreshRSS. Can’t say I love the interface, but with the open and standardized API, there are dozens of beautiful front ends to choose on any device.
For real? Damn it that’s going to be painful.
Never ask a man his pay, a woman her weight, or a data hoarder the contents of their stash.
Jk. Mostly.
I have a similar-ish setup to @Davel23, with a couple of cool use cases:
- I seed the last 5 Arch and openSUSE (a few different flavors) ISOs at all times
- I run an ArchiveBot for archive.org
- I scan nontrivial mail (the paper kind) and store it in Docspell for later OCR searches, tax purposes, etc.
- I help keep Sci-Hub healthy
- I host several services for de-Googling, including Nextcloud, Blocky, Immich, and SearXNG
- I run Navidrome, which has mostly replaced Spotify for my family (and hopefully soon will completely)
- I run Plex (hoping to move to Jellyfin sometime, but there’s inertial resistance to that), which has completely replaced Disney streaming, Netflix streaming, etc. for me and my extended family
- I host backups for my family and close friends with an S3 and WebDAV backup target
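The comment doesn’t say how these services are deployed; assuming Docker, hosting one of them (Navidrome) might look roughly like this. The host paths here are made up for illustration.

```shell
# Hypothetical sketch: assumes Docker; /srv/* host paths are placeholders.
# /music and /data are Navidrome's default container paths.
docker run -d \
  --name navidrome \
  --restart unless-stopped \
  -p 4533:4533 \
  -v /srv/music:/music:ro \
  -v /srv/navidrome/data:/data \
  deluan/navidrome:latest
```

The music library is mounted read-only so the media server can never modify the collection; app data and the database live in a separate writable volume for easy backup.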
I run 4x14TB, 2x8TB, 2x4TB, all from serverpartsdeals, in a ZFS RAID10 with two 1TB cache drives, so roughly half of the spinning rust is usable at ~35TB, and right now I’m at 62% utilization. I usually expand at about 85%.
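A striped-mirror (“RAID10-style”) pool like the one described could be sketched as below. The pool name, device names, and using the 1TB drives as L2ARC read cache are all assumptions; the original only says “cache drives”.

```shell
# Sketch only: device paths are placeholders for the real disks.
# Four mirrored pairs, striped together: 2x14TB, 2x14TB, 2x8TB, 2x4TB.
zpool create tank \
  mirror /dev/sda /dev/sdb \
  mirror /dev/sdc /dev/sdd \
  mirror /dev/sde /dev/sdf \
  mirror /dev/sdg /dev/sdh

# Attach the two 1TB drives as L2ARC read cache (one way to use "cache drives").
zpool add tank cache /dev/sdi /dev/sdj
```

With mixed-size mirrors, usable capacity is the sum of one disk from each pair (14+14+8+4 = 40TB raw), which lands near the ~35TB usable figure after ZFS overhead.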
You found one video supporting your viewpoint. Kaspersky’s role in Russian intelligence has been an open secret since the mid 2010s. This is Facebook Anti-Vaxxer “research” methodology.
No, I’m not conflating “a” with “b”. I’m using stability exactly as it’s used in physics.
https://phys.libretexts.org/Bookshelves/College_Physics/College_Physics_1e_(OpenStax)/09%3A_Statics_and_Torque/9.03%3A_Stability
My point is, it’s a completely valid use of the word. And yes, so is reliable, though I think “reliable” fails to capture the essence of the system changing but maintaining its state, hence why we don’t study “reliable systems” in physics.
I recommend picking something else to be pedantic about.
Amazingly, for someone so eager to give a lesson in linguistics, you managed to ignore literal definitions of the words in question and entirely skip relevant information in my (quite short) reply.
Both are widely used in that context. Language is like that.
Further, the textbook definition of stability:
the quality, state, or degree of being stable: such as
a: the strength to stand or endure : firmness
b: the property of a body that causes it when disturbed from a condition of equilibrium or steady motion to develop forces or moments that restore the original condition
c: resistance to chemical change or to physical disintegration
Pay particular attention to “b”.
The state of my system is “running”. Something changes. If the system doesn’t remain in the “running” state, the system is unstable BY TEXTBOOK DEFINITION.
I think the confusion comes from the meaning of stable. In software there are two relevant meanings:
1. Unchanging, or changing the least possible amount.
2. Not crashing / requiring intervention to keep running.
Debian, for example, focuses on #1, with the assumption that #2 will follow. And it generally does, until you have to update and the changes are truly massive and the upgrade is brittle, or you have to run software with newer requirements and your hacks to get it working are brittle.
Arch, for example, instead focuses on the second definition, by attempting to ensure that every change, while frequent, is small, with a handful of notable exceptions.
Honestly, both strategies work well. I’ve had Debian systems running for 15 years and Arch systems running for 12+ years (and that limitation is really only due to the system I run Arch on, rather than their update strategy).
It really depends on the user’s needs and maintenance frequency.
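The two strategies above show up directly in the day-to-day update commands each distro expects you to run:

```shell
# Debian stable: small security patches between releases,
# then one large (and riskier) jump every couple of years.
apt update && apt upgrade     # routine patching
apt full-upgrade              # the big release upgrade

# Arch: one rolling command, run frequently so each
# individual change stays small.
pacman -Syu
```

The failure modes differ accordingly: Debian concentrates its risk into rare large upgrades, while Arch spreads it across many small ones that occasionally need manual intervention.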
Author doesn’t seem to understand that executives everywhere are full of bullshit and marketing and journalism everywhere is perversely incentivized to inflate claims.
But that doesn’t mean the technology behind that executive, marketing, and journalism isn’t game changing.
Full disclosure, I’m both well informed and undoubtedly biased as someone in the industry, but I’ll share my perspective. Also, I’ll use AI here the way the author does, to represent the cutting edge of Machine Learning, Generative Self-Reinforcement Learning Algorithms, and Large Language Models. Yes, AI is a marketing catch-all. But most people better understand what “AI” means, so I’ll use it.
AI is capable of revolutionizing important niches in nearly every industry. This isn’t really in question. There have been dozens of scientific papers and case studies proving this in healthcare, fraud prevention, physics, mathematics, and many many more.
The problem right now is one of transparency, maturity, and economics.
The biggest companies are either notoriously tight-lipped about anything they think might give them a market advantage, or notoriously slow to adopt new technologies. We know AI has been deeply integrated in the Google Search stack and in other core lines of business, for example. But with pressure to resell this AI investment to their customers via the Gemini offering, we’re very unlikely to see them publicly examine ROI anytime soon. The same story is playing out at nearly every company with the technical chops and cash to invest.
As far as maturity, AI is growing by astronomical leaps each year, as mathematicians and computer scientists discover better ways to do even the simplest steps in an AI. Hell, the groundbreaking papers that are literally the cornerstone of every single commercial AI right now are “Attention is All You Need” (2017) and
“Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks” (2020). Moving from a scientific paper to production generally takes more than a decade in most industries. The fact that we’re publishing new techniques today and pushing them to prod a scant few months later should give you an idea of the breakneck pace the industry is moving at right now.
And finally, economically, building, training, and running a new AI oriented towards either specific or general tasks is horrendously expensive. One of the biggest breakthroughs we’ve had with AI is realizing the accuracy plateau we hit in the early 2000s was largely limited by data scale and quality. Fixing these issues at a scale large enough to make a useful model uses insane amounts of hardware and energy, and if you find a better way to do things next week, you have to start all over. Further, you need specialized programmers, mathematicians, and operations folks to build and run the code.
Long story short, start-ups are struggling to come to market with AI outside of basic applications, and of course cut-throat Silicon Valley does its thing, and most of these companies are either priced out, acquired, or otherwise forced out of business before bringing something to the general market.
Call the tech industry out for the slime it generally is, but the AI technology itself is extremely promising.
Yeah honestly no idea regarding moderation. But the codebase is maintained by a team.
There is a team, not a sole dev.
I’m not saying everything is roses and rainbows, but this is FUD messaging being spread openly by the mbin dev team.
I’ve had great experiences with exactly one vendor of second hand disks.
Currently running 8x14TB in a striped & mirrored zpool.
Really all I do is set up fail2ban on my very few external services, and then put all other access behind WireGuard.
Logs are clean, I’m happy.
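A minimal sketch of that setup, assuming a Debian-like distro and an already-written WireGuard config (the jail thresholds here are illustrative, not the commenter’s actual values):

```shell
# Install the two pieces (package names assume a Debian-like distro).
apt install fail2ban wireguard

# 1) fail2ban guarding the few exposed services, e.g. SSH:
cat >/etc/fail2ban/jail.local <<'EOF'
[sshd]
enabled  = true
maxretry = 5
bantime  = 1h
EOF
systemctl enable --now fail2ban

# 2) Everything else reachable only over the VPN:
wg-quick up wg0   # assumes a configured /etc/wireguard/wg0.conf
```

The point of the split: only hardened, rate-limited services face the internet at all, so the logs stay quiet and everything else never sees a hostile packet.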
Over the years of using Vim both professionally and for my own uses, I’ve learned to just install LunarVim and only add a handful of packages/overrides. Otherwise I just waste too much time tinkering and not doing the things I need to.