Intel Continues Making Preparations For Ray-Tracing With Their Linux Graphics Driver


  • Intel Continues Making Preparations For Ray-Tracing With Their Linux Graphics Driver

    Phoronix: Intel Continues Making Preparations For Ray-Tracing With Their Linux Graphics Driver

    Intel's open-source Linux graphics driver developers continue making their driver preparations for being able to accommodate Vulkan ray-tracing with upcoming Xe HPG graphics having ray-tracing hardware capabilities...


  • #2
    Isn't Intel's hardware too weak to run ray tracing, and won't it still be for the next 5 years?
    In my opinion, if they want to bring something new, they should join Red Hat and help bring HDR support.

    • #3
      Originally posted by Danny3 View Post
      Isn't Intel's hardware too weak to run ray tracing, and won't it still be for the next 5 years?
      In my opinion, if they want to bring something new, they should join Red Hat and help bring HDR support.
      You should read up on the new dedicated Intel GPUs that are coming.

      • #4
        I'm hoping the current Intel Iris Xe gets some attention to fix the overwhelming issue listed here: https://gitlab.freedesktop.org/mesa/mesa/-/issues/5662#

        I'm pretty hyped for the gaming cards that are coming, having experienced what the 1195G7 with Iris Xe graphics has to offer.

        • #5
          Originally posted by dragonn View Post

          You should read up on the new dedicated Intel GPUs that are coming.
          He's right, though. It's wishful thinking to believe that the dedicated GPUs Intel is developing will handle ray tracing well when AMD and Nvidia still handle it only poorly, and there's also no reason to believe that, in their first real push into dedicated graphics, they will compete well against AMD and Nvidia at the high end.

          • #6
            Originally posted by Luke_Wolf View Post

            He's right, though. It's wishful thinking to believe that the dedicated GPUs Intel is developing will handle ray tracing well when AMD and Nvidia still handle it only poorly, and there's also no reason to believe that, in their first real push into dedicated graphics, they will compete well against AMD and Nvidia at the high end.
            I don't expect Intel's high-end card to be competitive with AMD/Nvidia, but they are making the right moves to get there one day.

            • #7
              Originally posted by Danny3 View Post
               Isn't Intel's hardware too weak to run ray tracing, and won't it still be for the next 5 years?
               "Weak" is not the right comparison. This is about hardware ray queries, so it has little to do with the general-purpose compute capabilities of the hardware. Besides, Intel's top-line performance has been increasing rapidly over the last few years, and their new desktop GPUs may well be worth running some ray-tracing workloads on. Even at laptop power budgets, if you're going to be doing ray tracing, and you can handle the programming complexity of doing it on the GPU, then hardware ray tracing on the GPU will almost always be more power efficient than doing it on the CPU, or doing it with general-purpose GPU compute.
              Last edited by microcode; 05 December 2021, 03:03 PM.
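
               To make "hardware ray queries" concrete: on the Vulkan side this functionality is exposed through the VK_KHR_ray_query extension, and a few lines of C are enough to check whether a driver advertises it. A minimal sketch, assuming the Vulkan SDK headers and loader are installed; the probe below is illustrative, not from the article or the driver work it covers:

               /* Minimal sketch (illustrative): list every Vulkan device and
                * report whether it advertises VK_KHR_ray_query, the extension
                * that exposes hardware ray queries to shaders.
                * Build: cc rq_probe.c -o rq_probe -lvulkan */
               #include <stdio.h>
               #include <stdlib.h>
               #include <string.h>
               #include <vulkan/vulkan.h>

               int main(void) {
                   VkApplicationInfo app = {
                       .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
                       .apiVersion = VK_API_VERSION_1_2,
                   };
                   VkInstanceCreateInfo info = {
                       .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
                       .pApplicationInfo = &app,
                   };
                   VkInstance instance;
                   if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
                       fprintf(stderr, "could not create a Vulkan instance\n");
                       return 1;
                   }

                   uint32_t dev_count = 0;
                   vkEnumeratePhysicalDevices(instance, &dev_count, NULL);
                   VkPhysicalDevice *devs = malloc(dev_count * sizeof *devs);
                   vkEnumeratePhysicalDevices(instance, &dev_count, devs);

                   for (uint32_t i = 0; i < dev_count; i++) {
                       VkPhysicalDeviceProperties props;
                       vkGetPhysicalDeviceProperties(devs[i], &props);

                       /* Query the device's extension list twice: once for
                        * the count, once for the actual entries. */
                       uint32_t ext_count = 0;
                       vkEnumerateDeviceExtensionProperties(devs[i], NULL, &ext_count, NULL);
                       VkExtensionProperties *exts = malloc(ext_count * sizeof *exts);
                       vkEnumerateDeviceExtensionProperties(devs[i], NULL, &ext_count, exts);

                       int has_ray_query = 0;
                       for (uint32_t j = 0; j < ext_count; j++)
                           if (strcmp(exts[j].extensionName,
                                      VK_KHR_RAY_QUERY_EXTENSION_NAME) == 0)
                               has_ray_query = 1;

                       printf("%s: ray queries %s\n", props.deviceName,
                              has_ray_query ? "supported" : "not supported");
                       free(exts);
                   }
                   free(devs);
                   vkDestroyInstance(instance, NULL);
                   return 0;
               }

               Running vulkaninfo and searching its output for VK_KHR_ray_query gives the same answer without writing any code; the driver preparations the article describes are what should eventually make that extension show up on Xe HPG hardware.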

              • #8
                Originally posted by castlefox View Post

                 I don't expect Intel's high-end card to be competitive with AMD/Nvidia, but they are making the right moves to get there one day.
                 I really like the YouTube channel Moore's Law is Dead. He said the highest-end Intel card will perform at least at the level of an RTX 3060 Ti to 3070 Ti. Source video. In the last few minutes before that, he explains how he got access to a render showing off the reference card.

                 Edit: If you go to 8:28 of the same video, the lowest-end Intel card will be pretty weak but will still support ray tracing.
                Last edited by CTown; 05 December 2021, 03:19 PM.

                • #9
                  Originally posted by microcode View Post

                  "weak" is not the comparison. This is about hardware ray queries, so it has little to do with the general purpose compute capabilities of the hardware. Also... I mean, Intel's top line performance has been increasing rapidly over the last few years, and their new desktop GPUs are possibly worth running some raytracing workload on. Even at laptop power budgets, if you're going to be doing raytracing, and you can handle the programming complexity of doing it on the GPU, then the GPU raytracing capability will almost always be more power efficient than doing it on the CPU, or doing it with general purpose GPU compute.
                  Currently AMD and Nvidia both are incredibly weak at Ray Tracing to the point where most of what turning Ray Tracing on is good for is tanking your FPS for limited usage of rays and visual returns that are barely noticeable in normal play. Given this is the case from the long term incumbents in the market why would you expect Intel's newcomer raytracing to be useful?

                  • #10
                    Originally posted by Luke_Wolf View Post

                     Currently, AMD and Nvidia are both incredibly weak at ray tracing, to the point where turning it on mostly just tanks your FPS in exchange for a limited number of rays and visual returns that are barely noticeable in normal play. Given that this is the case for the long-term incumbents in the market, why would you expect Intel's newcomer ray tracing to be useful?
                     It may be useless in games, but it helps a lot in physically based rendering. Don't take my word for it: pbrt-v4 (you can find it on GitHub) claims that using Nvidia's OptiX led to a huge performance boost for them.
