cross-posted from: https://lemmy.world/post/13805928

It’s a long vid. I suggest prepping your fav drink before viewing.

It’s about Nvidia’s new GPU architecture for AI, the NVLink Switch, RAS diagnostics, and other Nvidia announcements.

Nvidia knows it’s the backbone of the current AI boom and seems to be going full steam ahead. I’m hoping for more innovation in tools for AI and gaming in the future.

    • Bobby Turkalino@lemmy.yachts

      Happily playing modern games and developing shaders on my AMD GPU. 5120×1440 at 120 Hz, issue-free.

      I wish people would get their shit together and realize they’ve fallen victim to marketing.

      • Even_Adder@lemmy.dbzer0.com

        It’s not marketing; AMD sucks for ML stuff. I don’t just play games. Everything is harder, has fewer features, and performs worse on AMD.

        • deadbeef@lemmy.nz

          The situation is mostly reversed on Linux. Nvidia has fewer features, more bugs, and stuff that plain won’t work at all. Even onboard Intel graphics is going to be less buggy than a pretty expensive Nvidia card.

          I mention that because language model work is pretty niche, and so is Linux (maybe similar-sized niches?).

        • ichbinjasokreativ@lemmy.world

          Really? I’ve only dabbled with locally run AI for a bit, but performance in something like Ollama or Stable Diffusion has been really great on my 6900 XT.
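
          For what it’s worth, a minimal sketch (assuming a ROCm build of PyTorch, which exposes AMD cards through the torch.cuda API) of the kind of sanity check that shows the card is actually doing the work:

          ```python
          # Verify a ROCm build of PyTorch can see the AMD GPU, then time a matmul on it.
          import time
          import torch

          print("HIP/ROCm version:", torch.version.hip)         # None on CUDA-only builds
          print("GPU available:", torch.cuda.is_available())
          print("Device name:", torch.cuda.get_device_name(0))  # e.g. the RX 6900 XT

          # Quick sanity benchmark: a large matrix multiply on the GPU.
          x = torch.randn(4096, 4096, device="cuda")
          y = torch.randn(4096, 4096, device="cuda")
          torch.cuda.synchronize()

          start = time.time()
          z = x @ y
          torch.cuda.synchronize()
          print(f"4096x4096 matmul took {time.time() - start:.4f} s")
          ```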

          • accideath@lemmy.world

            The problem isn’t that it isn’t great, but that Nvidia cards are just better at a given price point, partly thanks to CUDA.

            And for gaming and general use, my experience over the last few years has been that Nvidia still has a leg up when it comes to drivers on Windows. I’ve never had an Nvidia card cause any problems. AMD, not so much.

            I would still happily trade my GTX 1650 for an RX 6400, though, because I recently switched to Linux and it’s a whole different world there…

      • IsThisAnAI@lemmy.world

        AMD successful at the mid tier?! I’m shocked!

        Nvidia prints money in the enterprise, where businesses will literally lose money to get the extra compute, and to a lesser extent in high-end gaming with the details turned up. AMD simply can’t compete; it’s not marketing, it’s a better product.

        • anyhow2503@lemmy.world

          AMD has never gotten more than 50% of the market, even in the years when their entire product lineup offered better performance and features for less money. I’m talking about the “good old days” here, when software features weren’t a big factor for consumers and ML was nonexistent. You have to be delusional to think that Nvidia doesn’t hold a very clear mindshare and marketing advantage.

          • IsThisAnAI@lemmy.world

            Oh, you mean because of the shitty, buggy drivers they had? Nvidia has nearly always had the most compelling product at most upper price points. Even when ATI’s product line was briefly straight-up faster (for years, lololol 👌), folks chose Nvidia because ATI’s drivers were a god-awful buggy mess.

            But yeah, everyone is just a little brainwashed lemming 🙄.

            • mb_@lemm.ee

              They still do. I replaced my 3070 with a 7900 XTX, and the 7900 is constantly freezing with GPU ring errors and drivers completely effing up the system. I have already replaced it twice, and I am using workarounds to avoid the bugs, but they still happen every few days…

                • mb_@lemm.ee

                  PowerColor Red Devil.

                  Under kernel 6.7 I was able to find a combination that was usable for a few days.

                  With 6.8, timeouts would happen within 30 minutes.

                  I fiddled with the sched_job module option and the system seems stable now.
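
                  If it helps anyone else debugging this: a small sketch (assuming a Linux box with the amdgpu driver loaded) that just dumps /sys/module/amdgpu/parameters, so you can confirm which module option values are actually in effect after fiddling:

                  ```python
                  # Print every amdgpu module parameter currently exposed via sysfs,
                  # e.g. to confirm a scheduler-related workaround really got applied.
                  from pathlib import Path

                  params_dir = Path("/sys/module/amdgpu/parameters")
                  for p in sorted(params_dir.iterdir()):
                      try:
                          value = p.read_text().strip()
                      except OSError:
                          value = "<unreadable>"
                      print(f"{p.name} = {value}")
                  ```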

    • micka190@lemmy.world

      Seriously. AI aside, if you’re doing anything 3D-related you’re basically shooting yourself in the foot by not going Team Green. The difference in render time and quality is dramatic. I’d kill to see AMD or Intel pull a Ryzen in the GPU market.

      • nexusband@lemmy.world

        Yeah, though the quality of Nvidia’s pro drivers is a crapshoot. We’ve had so many issues, especially with OpenGL, that it just isn’t funny anymore. Granted, AMD isn’t really that much better, but at least the cards cost a fraction of the price, and I have more confidence in AMD fixing problems than I have in Nvidia.

    • alihan_banan@lemmy.world

      Nvidia’s success now is pretty much like Intel’s success back when CISC was losing to RISC (so they became partially RISC, lol) and Intel got developers’ attention because of the IBM PC and its clones. In the same way, Nvidia was the one chosen by OpenAI. Intel’s competitors were quite on par with the blue giant (and not only AMD), but one big player decided the winner, and it’s the same with Nvidia, since AMD, Intel, and Google all have AI accelerators that are not really worse and are even better from time to time. And now, after Blackwell, Intel presents Gaudi 2, AMD prepares a new CDNA generation, Chinese companies create more cost-effective solutions, and Tenstorrent is starting to catch up too.

      Nvidia’s and Intel’s wins in their respective fields were not really products of their innovations, but rather of good partnerships.

  • kromem@lemmy.world

    Nvidia has the short term covered, but I’m skeptical they’ll still be leading the AI chip market in five years or so without acquisitions.

    Recent research has shown not only efficiency gains but also actual performance gains with binary or ternary weights instead of floats.

    This means you don’t need FP calculations or matrix multiplication.
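
    To illustrate the idea (a toy sketch of the ternary case, not taken from the video or any particular paper): once weights are restricted to {-1, 0, +1}, a “matrix multiply” collapses into additions and subtractions of activations, so dedicated floating-point multiply hardware stops being the bottleneck.

    ```python
    import numpy as np

    # Toy ternary-weight "matmul": with weights in {-1, 0, +1}, each dot product
    # w . x is just sums and differences of activations, with no multiplications.
    def ternary_matvec(W, x):
        out = np.zeros(W.shape[0], dtype=x.dtype)
        for i in range(W.shape[0]):
            row = W[i]
            out[i] = x[row == 1].sum() - x[row == -1].sum()  # zero weights are skipped
        return out

    rng = np.random.default_rng(0)
    W = rng.integers(-1, 2, size=(4, 8))           # ternary weight matrix
    x = rng.standard_normal(8).astype(np.float32)  # float activations

    print(ternary_matvec(W, x))                    # multiplication-free result
    print(W.astype(np.float32) @ x)                # matches the ordinary matmul
    ```

    Real implementations pack each ternary weight into a couple of bits instead of a 16-bit float, which is where the memory and energy savings come from.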

    It requires being trained from scratch with that architecture in mind, so it will probably be 12 to 18 months before we see leading models with light weights, but once we do the market may go more towards faster and more energy efficient options that don’t need to rely on Nvidia’s legacy of IP for FP ops.

    So while an unmatched king in how things are currently done, the magic phrase that brings any monarch to tears is “this too shall pass.”

    • test_profile1@api.clubsall.com

      While I agree with everything you said, I thought the same thing after the crypto boom, that Nvidia was done. They did have a dip before the AI jackpot hit.

      Between new models hitting and new chips being bought and installed, it will be another year or two, so they are good for 2-3 years. They are also aware of this issue; I think they are working on CPUs and other businesses, so it won’t be a total loss. But I agree, sustaining the stock at this price level seems unrealistic as of now.