
Intel Continues Making Preparations For Ray-Tracing With Their Linux Graphics Driver


  • lacek
    replied
    Originally posted by Danny3 View Post
    Isn't Intel's hardware too weak to run ray tracing, and won't it still be for the next 5 years?
    In my opinion, if they want to bring something new, they should join Red Hat and help bring HDR support.
    Initially, most likely yes. The few demos that were available are not indicative of much. It is also important to start with compatibility, even if the first few generations won't be ironed out. Getting software ready for ARC will take some time, and this way developers will have cards to work with. As for the "next 5 years", we know too little to be sure.


  • TemplarGR
    replied
    Right now, ray tracing in video games is a gimmick. They are only using it for a few effects, and the performance hit is immense while the visual improvement is barely noticeable at best. Nothing that couldn't be achieved with traditional methods. I am sorry, but that's the truth. Until hardware becomes far more capable, it is better to focus on performance with ray tracing off.


  • microcode
    replied
    Originally posted by Luke_Wolf View Post

    Currently AMD and Nvidia are both incredibly weak at ray tracing, to the point where turning it on mostly just tanks your FPS in exchange for a limited number of rays and visual returns that are barely noticeable in normal play. Given that this is the case for the long-term incumbents in the market, why would you expect Intel's newcomer ray tracing to be useful?
    Both of the existing desktop hardware ray-tracing implementations are useful for professional applications right now; in real time there are obviously more trade-offs.


  • Jabberwocky
    replied
    Originally posted by piotrj3 View Post

    1st. Nvidia isn't weak at ray tracing. Look at the Marbles at Night demo, a fully path-traced demo running in real time; it is incredible. I managed to run it at a limited resolution on my 2060 Super, and it is truly impressive.
    Marbles at Night ran with so much lag that it was painful to watch. Where's the 60 or 144 FPS video? I agree with Luke_Wolf: Nvidia and AMD are incredibly weak at pure ray tracing.

    Originally posted by piotrj3 View Post

    2nd. It is only barely noticeable in cases of light ray-tracing work (like ray-traced shadows, since most engines have gotten really good at shader-based shadows, making ray-traced shadows only marginally better). Ray-traced reflections are a ton better than shader-based ones, and global illumination gives entirely new life to many places.

    3rd. It simplifies a lot of work for future game developers. Once ray tracing is mandatory, a lot of things may become redundant: no baking lights, for example, and many complicated reflection shaders can be replaced by quite simple ray-traced reflection queries, or engines can move entirely to path tracing. Path-tracing engines are quite simple to write, while engines supporting tons of special shaders just to produce particular effects are super complicated and often fake their results.
    Yes, ray tracing is finally here in 3D games and it's here to stay. The question is more about which part of the GPU will be responsible for ray tracing: will GPU companies invest in dedicated hardware to increase the number of rays per pixel? Currently, game engines need denoising to work around the weak ray-tracing performance of Nvidia and AMD GPUs, which obviously lowers the quality of the render after the denoising filters are applied. You cannot cast enough rays for every pixel in a 3D game at the moment; no GPU can do that using its ray-tracing hardware on its own.
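    The denoising trade-off described above can be sketched numerically. Below is a toy example (not from any real engine or the thread): each pixel is a 1-sample-per-pixel Monte Carlo estimate of the same true radiance, and a crude box filter trades a little blur for a large drop in variance. All names and constants are made up for illustration.

    ```python
    import random
    import statistics

    def noisy_row(rng, width=64, true_value=0.5):
        # 1-sample-per-pixel Monte Carlo estimates: each pixel is an
        # unbiased but noisy estimate of the same true radiance value
        return [rng.uniform(0.0, 2.0 * true_value) for _ in range(width)]

    def box_denoise(row, radius=2):
        # crude spatial denoiser: average each pixel with its neighbours,
        # trading a little blur for a large drop in variance
        out = []
        for i in range(len(row)):
            window = row[max(0, i - radius): i + radius + 1]
            out.append(sum(window) / len(window))
        return out

    rng = random.Random(42)
    noisy = noisy_row(rng)
    denoised = box_denoise(noisy)
    print(statistics.pstdev(noisy), statistics.pstdev(denoised))
    ```

    Real GPU denoisers are far smarter (temporal accumulation, edge-aware filters, learned kernels), but the principle is the same: spend fewer rays per pixel and recover smoothness spatially, at some cost in detail.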

    From my perspective (as a hobbyist noob), I agree games will start to depend much more on ray-tracing-based techniques, but the majority of the work is going to be done by compute (shaders), not dedicated ray-tracing hardware. We will have to wait and see what next-generation GPUs bring before we know for sure. For now, we can look at what game engines are already investing in: https://docs.unrealengine.com/5.0/en...eatures/Lumen/

    Originally posted by piotrj3 View Post

    4th. It is incredibly useful for rendering, such as in Blender. With OptiX, even a 3060 soundly beats a 6900 XT using HIP, while in CUDA a 3070 is around the same performance as a 6900 XT using HIP.
    So a tortoise just overtook a snail? Cool cool cool cool.


  • mangeek
    replied
    Originally posted by Danny3 View Post
    Isn't Intel's hardware too weak to run ray tracing
    Ray tracing has traditionally been done on the CPU, not the GPU; doing it in real time on GPUs is somewhat novel. I wouldn't be surprised if Intel quickly gains a lot of ground in this space, especially if they see demand for it and have good CPU/GPU integration.


  • smitty3268
    replied
    Originally posted by piotrj3 View Post

    1st. Nvidia isn't weak at ray tracing. Look at the Marbles at Night demo, a fully path-traced demo running in real time; it is incredible. I managed to run it at a limited resolution on my 2060 Super, and it is truly impressive.
    A 3090 can only hit 30 FPS at 1440p in that demo. I don't know how you can call that anything but weak overall, unless you are grading on a curve and giving them credit for being faster than everyone else. A flagship card should be able to run it at 4K.

    Anyway, Intel supposedly has decent RT performance according to the rumors. Probably not quite as good as Nvidia, but better than AMD.


  • piotrj3
    replied
    Originally posted by Luke_Wolf View Post

    Currently AMD and Nvidia are both incredibly weak at ray tracing, to the point where turning it on mostly just tanks your FPS in exchange for a limited number of rays and visual returns that are barely noticeable in normal play. Given that this is the case for the long-term incumbents in the market, why would you expect Intel's newcomer ray tracing to be useful?
    1st. Nvidia isn't weak at ray tracing. Look at the Marbles at Night demo, a fully path-traced demo running in real time; it is incredible. I managed to run it at a limited resolution on my 2060 Super, and it is truly impressive.

    2nd. It is only barely noticeable in cases of light ray-tracing work (like ray-traced shadows, since most engines have gotten really good at shader-based shadows, making ray-traced shadows only marginally better). Ray-traced reflections are a ton better than shader-based ones, and global illumination gives entirely new life to many places.

    3rd. It simplifies a lot of work for future game developers. Once ray tracing is mandatory, a lot of things may become redundant: no baking lights, for example, and many complicated reflection shaders can be replaced by quite simple ray-traced reflection queries, or engines can move entirely to path tracing. Path-tracing engines are quite simple to write, while engines supporting tons of special shaders just to produce particular effects are super complicated and often fake their results.

    4th. It is incredibly useful for rendering, such as in Blender. With OptiX, even a 3060 soundly beats a 6900 XT using HIP, while in CUDA a 3070 is around the same performance as a 6900 XT using HIP.
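    The claim in point 3 that path-tracing engines are quite simple to write is easy to back up with a toy sketch. The following is a deliberately minimal diffuse path tracer (not from any real engine; the scene, one grey sphere lit by a white sky, and all constants are made up for illustration). The entire renderer is the `radiance` function: trace a ray, bounce it in a random direction, repeat.

    ```python
    import random

    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
    def scale(a, s): return (a[0]*s, a[1]*s, a[2]*s)
    def norm(a):
        l = dot(a, a) ** 0.5
        return (a[0]/l, a[1]/l, a[2]/l)

    SPHERE_CENTER = (0.0, 0.0, -3.0)  # one grey diffuse sphere
    SPHERE_RADIUS = 1.0
    ALBEDO = 0.5                      # fraction of light the sphere reflects

    def hit_sphere(origin, direction):
        # nearest positive root of the ray/sphere quadratic, or None on a miss
        oc = sub(origin, SPHERE_CENTER)
        b = dot(oc, direction)
        disc = b*b - (dot(oc, oc) - SPHERE_RADIUS**2)
        if disc < 0:
            return None
        t = -b - disc ** 0.5
        return t if t > 1e-4 else None

    def random_hemisphere(normal, rng):
        # rejection-sample a unit vector, then flip it into the
        # hemisphere around `normal` (uniform hemisphere sampling)
        while True:
            v = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
            if 0 < dot(v, v) <= 1:
                break
        v = norm(v)
        return v if dot(v, normal) > 0 else scale(v, -1)

    def radiance(origin, direction, rng, depth=0):
        # the whole renderer: trace, bounce, repeat; the sky is the only light
        if depth > 4:
            return 0.0
        t = hit_sphere(origin, direction)
        if t is None:
            return 1.0  # ray escaped to the sky, which emits white light
        p = add(origin, scale(direction, t))
        n = norm(sub(p, SPHERE_CENTER))
        return ALBEDO * radiance(p, random_hemisphere(n, rng), rng, depth + 1)

    def render(width=8, height=8, spp=16, seed=1):
        rng = random.Random(seed)
        img = []
        for y in range(height):
            row = []
            for x in range(width):
                u = (x + 0.5) / width * 2 - 1    # pinhole camera at the origin
                v = 1 - (y + 0.5) / height * 2
                d = norm((u, v, -1.0))
                row.append(sum(radiance((0.0, 0.0, 0.0), d, rng)
                               for _ in range(spp)) / spp)
            img.append(row)
        return img
    ```

    The contrast with a rasterizer is the point: there are no shadow maps, reflection probes, or baked lightmaps here, because shadows, reflections, and indirect light all fall out of the same bounce loop. What real engines add on top is mostly performance work: acceleration structures, importance sampling, and denoising.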


  • er888kh
    replied
    Originally posted by Luke_Wolf View Post

    Currently AMD and Nvidia are both incredibly weak at ray tracing, to the point where turning it on mostly just tanks your FPS in exchange for a limited number of rays and visual returns that are barely noticeable in normal play. Given that this is the case for the long-term incumbents in the market, why would you expect Intel's newcomer ray tracing to be useful?
    It may be useless in games, but it helps a lot in physically based rendering. Don't take my word for it: pbrt-v4 (you can find it on GitHub) reports that using Nvidia's OptiX led to a huge performance boost.


  • Luke_Wolf
    replied
    Originally posted by microcode View Post

    "Weak" is not the comparison. This is about hardware ray queries, so it has little to do with the general-purpose compute capabilities of the hardware. Also, Intel's top-line performance has been increasing rapidly over the last few years, and their new desktop GPUs are possibly worth running some ray-tracing workloads on. Even at laptop power budgets, if you're going to be doing ray tracing and you can handle the programming complexity of doing it on the GPU, then the GPU ray-tracing capability will almost always be more power efficient than doing it on the CPU, or doing it with general-purpose GPU compute.
    Currently AMD and Nvidia are both incredibly weak at ray tracing, to the point where turning it on mostly just tanks your FPS in exchange for a limited number of rays and visual returns that are barely noticeable in normal play. Given that this is the case for the long-term incumbents in the market, why would you expect Intel's newcomer ray tracing to be useful?


  • CTown
    replied
    Originally posted by castlefox View Post

    I don't expect Intel's high-end card to be competitive with AMD/Nvidia, but they are making the right moves to get there one day.
    I really like the YouTube channel Moore's Law Is Dead. He said that the highest-end Intel card will perform at least at the level of a 3060 Ti to 3070 Ti. Source video. In the last few minutes before that point, he explains how he got access to a render showing off the reference card.

    edit: If you go to 8:28 of the same video, the lowest-end Intel card will be pretty weak but will still support ray tracing.
    Last edited by CTown; 05 December 2021, 03:19 PM.
