cross-posted from: https://lemmy.world/post/13805928

It’s a long vid. I suggest prepping your fav drink before viewing.

It covers Nvidia’s new GPU architecture for AI, the NVLink Switch, RAS diagnostics, and other Nvidia announcements.

Nvidia knows it’s the backbone of the current AI boom and seems to be going full steam ahead. I’m hoping for more innovation in AI and gaming tools in the future.

    • Bobby Turkalino@lemmy.yachts · 6 months ago

      Happily playing modern games and developing shaders on my AMD GPU. 5120x1440 at 120 Hz, issue-free.

      I wish people would get their shit together and realize they’ve fallen victim to marketing

      • Even_Adder@lemmy.dbzer0.com · 6 months ago

        It’s not marketing; AMD sucks for ML stuff. I don’t just play games. With AMD, everything is harder, has fewer features, and performs worse.

        • deadbeef@lemmy.nz · 6 months ago

          The situation is mostly reversed on Linux. Nvidia has fewer features, more bugs, and stuff that plain won’t work at all. Even onboard Intel graphics is going to be less buggy than a pretty expensive Nvidia card.

          I mention that because language-model work is a pretty niche use case, and so is Linux (maybe similarly sized niches?).

        • ichbinjasokreativ@lemmy.world · 6 months ago

          Really? I’ve only dabbled with locally run AI for a bit, but performance in something like Ollama or Stable Diffusion has been really great on my 6900 XT.
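
          For anyone trying the same on an RDNA2 card, here is a minimal sketch of a systemd drop-in that forces a ROCm GPU target (assuming Ollama runs as a systemd service; the file path is illustrative). `HSA_OVERRIDE_GFX_VERSION` is a real ROCm environment variable, but it is usually only needed on cards ROCm doesn’t officially list; a 6900 XT (gfx1030) is typically supported without it.

```
# /etc/systemd/system/ollama.service.d/rocm.conf (illustrative path)
# Tell ROCm to treat the GPU as the gfx1030 (RDNA2) target.
# Mostly relevant for 6600/6700-class cards; a 6900 XT usually
# works without this override.
[Service]
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
```

          After editing, run `systemctl daemon-reload` and restart the service to pick up the change.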

      • IsThisAnAI@lemmy.world · 6 months ago (edited)

        AMD successful at the mid tier?! I’m shocked!

        Nvidia prints money in the enterprise, where businesses will literally lose money to get the extra compute, and to a lesser extent in high-end gaming with details turned up. AMD simply can’t compete; it’s not marketing, it’s a better product.

        • anyhow2503@lemmy.world · 6 months ago

          AMD has never gotten more than 50% of the market, even in the years where their entire product lineup offered better performance/features for less money. I’m talking about the “good old days” here, where software features weren’t a big factor for consumers and ML was nonexistent. You have to be delusional to think that Nvidia doesn’t hold a very clear mindshare and marketing advantage.

          • IsThisAnAI@lemmy.world · 6 months ago (edited)

            Oh, you mean because of the shit buggy drivers they had? Nvidia has nearly always had the most compelling product at the upper price points. Even when ATI’s lineup was briefly straight-up faster (for years, lololol 👌), folks still chose Nvidia, because ATI’s drivers were a god-awful buggy mess.

            But yeah, everyone is just a little brainwashed lemming 🙄.

            • mb_@lemm.ee · 6 months ago

              They still do. I replaced my 3070 with a 7900 XTX, and the 7900 constantly freezes with GPU ring errors, with drivers completely effing up the system. I have already replaced the card twice, and I’m using workarounds to avoid known bugs, but they still hit every few days…

                • mb_@lemm.ee · 6 months ago

                  PowerColor Red Devil.

                  Under 6.7 I was able to find a combination of settings that was usable for a few days.

                  With 6.8, timeouts would happen within 30 minutes.

                  I fiddled with the sched_job module option and the system seems stable now.
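
                  For reference, a persistent amdgpu module option is normally set via a modprobe.d fragment. A sketch, assuming the option meant above is the kernel’s documented `amdgpu.sched_jobs` parameter; the value here is illustrative:

```
# /etc/modprobe.d/amdgpu.conf (illustrative)
# sched_jobs caps the number of jobs in amdgpu's software scheduler
# queue (kernel default 32). Takes effect after the module is
# reloaded, so regenerate the initramfs and reboot.
options amdgpu sched_jobs=64
```

                  On most systems `/sys/module/amdgpu/parameters/sched_jobs` shows the value currently in effect.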

    • micka190@lemmy.world · 6 months ago

      Seriously. AI aside, if you’re doing anything 3D-related, you’re basically shooting yourself in the foot by not going Team Green. The difference in render time and quality is massive. I’d kill to see AMD or Intel pull a Ryzen in the GPU market.