• 0 Posts
  • 46 Comments
Joined 1 year ago
Cake day: June 1st, 2023




  • Kata1yst@kbin.social to Technology@lemmy.world · The decline of Intel.. · 6 months ago (edited)

    The 2009 era was also when Intel leveraged their position in the compiler market to cripple all non-Intel processors. Nearly every benchmarking tool used that compiler, which put an enormous handicap on AMD processors by locking them to either no SSE or, later, back to SSE2.

    My friends all thought I was crazy for buying AMD, but accusations had started circulating about the compiler heavily favoring Intel at least as early as 2005, and Intel was finally ordered to stop in 2010 by the FTC… Though of course they have been caught cheating in several other ways since.

    Everyone has this picture in their heads of AMD being the scrappy underdog and Intel being the professional choice, but Intel hasn’t really worn the crown since the release of the Athlon. Except during the Bulldozer/Piledriver era, but who can blame AMD for trying something crazy after 10 years of frustration?







  • Never ask a man his pay, a woman her weight, or a data hoarder the contents of their stash.

    Jk. Mostly.

    I have a similar-ish setup to @Davel23, and a couple of cool use cases:

    • I seed the last five Arch and openSUSE (a few different flavors) ISOs at all times

    • I run an ArchiveBot for archive.org

    • I scan nontrivial mail (the paper kind) and store it in docspell for later OCR searches, tax purposes, etc.

    • I help keep Sci-Hub healthy

    • I host several services for de-googling, including Nextcloud, Blocky, Immich, and Searxng

    • I run Navidrome, which has mostly replaced Spotify for my family (and hopefully will soon replace it completely)

    • I run Plex (hoping to move to Jellyfin sometime, but there’s inertial resistance to that), which has completely replaced Disney and Netflix streaming, etc., for me and my extended family

    • I host backups for my family and close friends with an S3 and WebDAV backup target

    I run 4x14TB, 2x8TB, and 2x4TB drives, all from serverpartsdeals, in a ZFS RAID10 with two 1TB cache drives, so half of the spinning rust is usable, at ~35TB. Right now I’m at 62% utilization, and I usually expand at about 85%.
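    A pool like that is built as a stripe across mirrored pairs; here’s a minimal sketch of the layout (pool name and device paths are placeholders, not my actual disks):

    ```sh
    # ZFS "RAID10" = a stripe across mirror vdevs: 4x14TB + 2x8TB + 2x4TB
    # = 80TB raw, half of which (~40TB, ~35TB usable after overhead) is addressable.
    zpool create tank \
      mirror /dev/sda /dev/sdb \
      mirror /dev/sdc /dev/sdd \
      mirror /dev/sde /dev/sdf \
      mirror /dev/sdg /dev/sdh \
      cache /dev/nvme0n1 /dev/nvme1n1
    ```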




  • Kata1yst@kbin.social to linuxmemes@lemmy.world · Arch with XZ · 8 months ago

    Amazingly, for someone so eager to give a lesson in linguistics, you managed to ignore the literal definitions of the words in question and to entirely skip relevant information in my (quite short) reply.

    Both are widely used in that context. Language is like that.

    Further, the textbook definition of stability:

    the quality, state, or degree of being stable: such as

    a: the strength to stand or endure : firmness

    b: the property of a body that causes it when disturbed from a condition of equilibrium or steady motion to develop forces or moments that restore the original condition

    c: resistance to chemical change or to physical disintegration

    Pay particular attention to “b”.

    The state of my system is “running”. Something changes. If the system doesn’t return to the “running” state, the system is unstable BY TEXTBOOK DEFINITION.



  • I think the confusion comes from the meaning of stable. In software there are two relevant meanings:

    1. Unchanging, or changing the least possible amount.

    2. Not crashing / requiring intervention to keep running.

    Debian, for example, focuses on #1, with the assumption that #2 will follow. And it generally does, until you have to upgrade between releases and the accumulated changes are truly massive and the upgrade is brittle, or you have to run software with newer requirements and the hacks you use to get it working are brittle.

    Arch, for example, instead focuses on the second definition by attempting to ensure that every change, while frequent, is small, with a handful of notable exceptions.
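    Concretely, the day-to-day difference looks something like this (a rough sketch using each distro’s stock tooling):

    ```sh
    # Debian: hold changes back, then take one big jump every ~2 years.
    sudo apt update && sudo apt upgrade   # small security/bug fixes within a release
    # At release time: repoint /etc/apt/sources.list to the new codename, then:
    sudo apt full-upgrade                 # one large, riskier upgrade

    # Arch: take every change as it lands, so each individual step stays small.
    sudo pacman -Syu                      # full rolling upgrade; partial upgrades unsupported
    ```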

    Honestly, both strategies work well. I’ve had Debian systems running for 15 years and Arch systems running for 12+ years (and that limitation is really only due to the system I run Arch on, rather than the update strategy).

    It really depends on the user’s needs and maintenance frequency.


  • Author doesn’t seem to understand that executives everywhere are full of bullshit, and that marketing and journalism everywhere are perversely incentivized to inflate claims.

    But that doesn’t mean the technology behind all that executive bluster, marketing, and journalism isn’t game-changing.

    Full disclosure: I’m both well informed and undoubtedly biased as someone in the industry, but I’ll share my perspective. Also, I’ll use “AI” here the way the author does, to represent the cutting edge of Machine Learning, Generative Self-Reinforcement Learning Algorithms, and Large Language Models. Yes, AI is a marketing catch-all. But most people better understand what “AI” means, so I’ll use it.

    AI is capable of revolutionizing important niches in nearly every industry. This isn’t really in question. There have been dozens of scientific papers and case studies proving this in healthcare, fraud prevention, physics, mathematics, and many, many more.

    The problem right now is one of transparency, maturity, and economics.

    The biggest companies are either notoriously tight-lipped about anything they think might give them a market advantage, or notoriously slow to adopt new technologies. We know AI has been deeply integrated in the Google Search stack and in other core lines of business, for example. But with pressure to resell this AI investment to their customers via the Gemini offering, we’re very unlikely to see them publicly examine ROI anytime soon. The same story is playing out at nearly every company with the technical chops and cash to invest.

    As far as maturity, AI is growing by astronomical leaps each year, as mathematicians and computer scientists discover better ways to do even the simplest steps in an AI. Hell, the groundbreaking papers that are literally the cornerstone of every single commercial AI right now are “Attention is All You Need” (2017) and “Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks” (2020). Moving from a scientific paper to production generally takes more than a decade in most industries. The fact that we’re publishing new techniques today and pushing them to prod a scant few months later should give you an idea of the breakneck speed at which the industry is moving right now.
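    (For a sense of scale: the core of that 2017 paper is a single equation, scaled dot-product attention, shown here in its standard textbook notation:

    ```latex
    % Q, K, V are the query/key/value matrices; d_k is the key dimension.
    % Dividing by sqrt(d_k) keeps the softmax from saturating.
    \mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
    ```

    Nearly everything commercial built on that paper is, at bottom, stacked variations on that one line.)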

    And finally, economically, building, training, and running a new AI oriented towards either specific or general tasks is horrendously expensive. One of the biggest breakthroughs we’ve had with AI is realizing that the accuracy plateau we hit in the early 2000s was largely due to limits of data scale and quality. Fixing these issues at a scale large enough to make a useful model takes insane amounts of hardware and energy, and if you find a better way to do things next week, you have to start all over. Further, you need specialized programmers, mathematicians, and operations folks to build and run the code.
    Long story short, start-ups are struggling to come to market with AI outside of basic applications, and of course cut-throat Silicon Valley does its thing, and most of these companies are priced out, acquired, or otherwise forced out of business before bringing something to the general market.

    Call the tech industry out for the slime it generally is, but the AI technology itself is extremely promising.