Blender 4.1 To Support Cycles Renderer On AMD RDNA3 APUs

  • Blender 4.1 To Support Cycles Renderer On AMD RDNA3 APUs

    Phoronix: Blender 4.1 To Support Cycles Renderer On AMD RDNA3 APUs

    Blender 4.0 will be releasing in early November, but there is already something to look forward to with Blender 4.1 next year...
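
    For those who want to try it once the support lands: Cycles' GPU backend can be selected from Blender's Python API, roughly as in the sketch below (run with something like blender -b -P enable_hip.py; the filename is just an example). Whether an RDNA3 iGPU actually shows up in the device list will depend on the Blender and ROCm versions in use.

    ```python
    # Sketch: enable Cycles GPU rendering on the HIP (AMD) backend from a script.
    # Run inside Blender, e.g.: blender -b -P enable_hip.py
    import bpy

    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = "HIP"   # AMD backend ("CUDA"/"OPTIX" are the NVIDIA ones)
    prefs.get_devices()                 # refresh the device list

    for dev in prefs.devices:
        dev.use = (dev.type == "HIP")   # enable only HIP devices (dGPU or iGPU)
        print(dev.name, dev.type, dev.use)

    bpy.context.scene.cycles.device = "GPU"
    ```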


  • #2
    I wish APUs would be used more often to support certain calculations (including ML and AI stuff).



    • #3
      Let me guess - it will run so badly that it will essentially be less of a handout to AMD users and more of a commercial ad for NVIDIA?



      • #4
        Somewhat off-topic:
        Are consumer-grade quantum computers due out in what, 20-30 years? Will it mean that the current CPU/GPU split becomes irrelevant, along with a lot of software?



        • #5
          Originally posted by CochainComplex View Post
          I wish APUs would be used more often to support certain calculations (including ML and AI stuff).
          I completely agree. Even the smallest of iGPUs aren't small; they might not be powerful compared to a full-blown desktop GPU, but to put things into perspective: the 780M is about as powerful as the R9 290, which I am currently using. My 290 can play many 10-year-old games in 4K. That's not insignificant. It's a shame to let all this processing power go to waste just because a dGPU is present.

          Originally posted by ddriver View Post
          Let me guess - it will run so badly that it will essentially be less of a handout to AMD users and more of a commercial ad for NVIDIA?
          Are you saying this as a trend of Blender or because of the performance of such APUs? Because even though iGPUs are measly compared to modern dGPUs, it's still free performance. Think of it like overclocking your GPU with the stock heatsink - you probably won't get far, but it's better than nothing.

          Originally posted by cl333r View Post
          Somewhat off-topic:
          Are consumer-grade quantum computers due out in what, 20-30 years? Will it mean that the current CPU/GPU split becomes irrelevant, along with a lot of software?
          I don't see consumer-grade quantum computers ever existing, even if they can run at room temperature. Their use case, while incredibly important, is very niche.



          • #6
            Originally posted by schmidtbag View Post
            I completely agree. Even the smallest of iGPUs aren't small; they might not be powerful compared to a full-blown desktop GPU, but to put things into perspective: the 780M is about as powerful as the R9 290, which I am currently using. My 290 can play many 10-year-old games in 4K. That's not insignificant. It's a shame to let all this processing power go to waste just because a dGPU is present.


            Are you saying this as a trend of Blender or because of the performance of such APUs? Because even though iGPUs are measly compared to modern dGPUs, it's still free performance. Think of it like overclocking your GPU with the stock heatsink - you probably won't get far, but it's better than nothing.


            I don't see consumer-grade quantum computers ever existing, even if they can run at room temperature. Their use case, while incredibly important, is very niche.
            I have a NAS with an AMD APU. I wish the Recognize app in Nextcloud properly supported AMD APUs; it runs on CUDA and PyTorch, but my APU is not officially supported by ROCm.
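
            For what it's worth, a rough way to check whether a ROCm build of PyTorch can use the iGPU at all is a sketch like the one below; the HSA_OVERRIDE_GFX_VERSION trick mentioned in the comment is an unofficial, at-your-own-risk workaround for GPUs ROCm doesn't officially list, and the right value depends on the APU's architecture.

            ```python
            # Minimal sketch: check whether PyTorch's ROCm build sees the APU's iGPU.
            # Note: on ROCm builds, PyTorch reuses the torch.cuda namespace for AMD GPUs.
            import torch

            if torch.cuda.is_available():
                for i in range(torch.cuda.device_count()):
                    print(i, torch.cuda.get_device_name(i))
                # Tiny matmul on the GPU as a smoke test.
                x = torch.randn(1024, 1024, device="cuda")
                print((x @ x).sum().item())
            else:
                # Unofficial workaround some people use for unsupported APUs
                # (set in the environment before starting; the value depends on the GPU):
                #   HSA_OVERRIDE_GFX_VERSION=11.0.0
                print("No ROCm-visible GPU found")
            ```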



            • #7
              Originally posted by schmidtbag View Post

              Are you saying this as a trend of Blender or because of the performance of such APUs?

              It is almost certainly using a design heavily tuned for NVIDIA's uarch, and it will almost certainly be a dull translation of that for AMD. And those tend to perform way below the hardware's optimum.

              Thus there's a decent chance it will do more "making AMD look bad" than "making AMD faster".

              I'd be happy to be wrong on that one; those are merely the expectations that Blender devs have set for me.

              And I expect it to be in the ballpark of 5 to 10 times slower than it needs to be.

              We shall see soon enough.


              Originally posted by CochainComplex View Post
              I wish APUs would be used more often to support certain calculations (including ML and AI stuff).


              APUs support GPU compute, and can do all of that fairly well.

              Not as well as specialized hardware, but still a decent boost vs CPU / software-only mode.

              The problem is developers who are too lazy to leverage it.
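
              To illustrate how little code that takes: a minimal sketch using pyopencl (purely illustrative, and assuming an OpenCL runtime such as ROCm's or Mesa's rusticl is installed) can offload an element-wise computation to whatever GPU the runtime exposes, which on an APU is the integrated GPU.

              ```python
              # Minimal sketch: run an element-wise kernel on the GPU OpenCL exposes;
              # on an APU that is the integrated GPU. Assumes pyopencl + an OpenCL runtime.
              import numpy as np
              import pyopencl as cl

              ctx = cl.create_some_context()   # picks a device (interactive if several)
              queue = cl.CommandQueue(ctx)

              a = np.random.rand(1_000_000).astype(np.float32)
              mf = cl.mem_flags
              a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
              out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

              prog = cl.Program(ctx, """
              __kernel void square(__global const float *a, __global float *out) {
                  int i = get_global_id(0);
                  out[i] = a[i] * a[i];
              }
              """).build()

              prog.square(queue, a.shape, None, a_buf, out_buf)  # one work-item per element
              out = np.empty_like(a)
              cl.enqueue_copy(queue, out, out_buf)
              print(np.allclose(out, a * a))   # True if the GPU did the work
              ```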


              Originally posted by cl333r View Post
              Somewhat off-topic:
              Are consumer-grade quantum computers due out in what, 20-30 years? Will it mean that the current CPU/GPU split becomes irrelevant, along with a lot of software?
              Quantum computing is very domain-specific: quantum computers are great at things classical computers are terrible at, and terrible at things classical computers are commonly used for.

              You can expect quantum computers to replace classical computers at about the rate at which non-circular wheels replace circular wheels. It just doesn't get any better than a circle, and the few niche uses where the circle is not optimal are not universally applicable.
              Last edited by ddriver; 26 October 2023, 02:11 PM.



              • #8
                Originally posted by schmidtbag View Post
                Are you saying this as a trend of Blender or because of the performance of such APUs? Because even though iGPUs are measly compared to modern dGPUs, it's still free performance. Think of it like overclocking your GPU with the stock heatsink - you probably won't get far, but it's better than nothing.
                Yes, it's a trend of Blender. The devs came out years ago to admit that they only buy NVIDIA GPUs and will never buy an AMD GPU, so they never do any work to support AMD (because they'd never be able to test it). This leaves the work of integrating AMD hardware support into this incredibly complex program to third parties. AMD has done a lot of work to keep AMD support in Blender (until they eventually just dropped it entirely anyway), but they get very little help from the developers themselves, AKA the people who know the most about Blender and its intricacies.

                Blender is extremely optimized for an NVIDIA workflow, and no amount of work on AMD's part is going to change that, because the devs are fanboys. The sheer arrogance of saying you don't support AMD cards because you never bought one, years into developing an extremely popular open-source program, is wild. Not even one just to throw into a test bench, ffs.



                • #9
                  Blender is such a fitting oxymoron: they blend the spirit of open source with a hardcore dedication to exclusively supporting a proprietary standard from a predatory, anti-competitive corporation.

                  Committing to that goes against the very core motivation of FOSS.

                  But what do I know...



                  • #10
                    Originally posted by ddriver View Post
                    Blender is such a fitting oxymoron: they blend the spirit of open source with a hardcore dedication to exclusively supporting a proprietary standard from a predatory, anti-competitive corporation.

                    Committing to that goes against the very core motivation of FOSS.

                    But what do I know...
                    NVIDIA is sponsoring this work, and there's nothing wrong with that.

                    AMD could do the same... they simply sponsored work to get this NVIDIA-oriented code running over their CUDA<=>ROCm translator/bridge (HIP).

                    While I dislike NVIDIA these days, we can't blame them for everything, and the same goes for Blender (I've never heard of them rejecting AMD-related code, and accepting NVIDIA's code, or NVIDIA hiring devs to write it, is not unethical).

