
DreamWorks' OpenMoonRay Renderer Code Published


  • #11
    Originally posted by JosiahBradley
    There have been RT engines since day 1. Pixar has their own and it's in my original IBM graphics manual.
    The original Renderman spec claimed to have a way for shaders to spawn rays, but it was unimplemented. I don't know when Pixar switched over, but at least their movies throughout the 1990's all used rasterization.

    Originally posted by bug77
    entertainment that amazes us has been employing RT for decades.
    I think the bulk of the film industry only switched over around 2010, but I don't have that on terribly good authority.
    Last edited by coder; 16 March 2023, 05:27 AM.



    • #12
      Originally posted by coder
      I think the bulk of the film industry only switched over around 2010, but I don't have that on terribly good authority.
      Probably even later than that, if you're talking about the bulk. Amazing as it may be, ray tracing is also expensive for movies.



      • #13
        Would be a great opportunity for a RVV vendor to port and demonstrate... Well this and Cycles, probably Cycles would be first.



        • #14
          coder REYES rendering, which RenderMan supported through v20 (it was dropped in favor of the RIS engine, which debuted in PRMan 19), did support ray tracing, but it was designed to use it as minimally as possible. Modern ray-tracing-native engines handle RT far better than was possible in the REYES era, particularly with the kind of hardware we have today.

          bug77 Virtually all production-grade render engines these days can, at a minimum, perform GPU-based denoising. To my knowledge, the GPU-accelerated options are exclusively CUDA-based, using OptiX for denoising; the alternatives rely on CPU-bound technologies.

          pegasus Premo, their in-house production animation toolset.

          Studio R&D director Paul Carmen DiLorenzo walks us through the studio’s groundbreaking and innovative production system being recognized at this Saturday’s MPAA Scientific and Technical Awards. 


          ---

          Offline GPU engines these days make use of the "RT" cores on graphics chips, and depending on scene complexity that can have a fairly significant impact on render times. Of course, how efficiently the paths can be built out is highly dependent on the assets being rendered, so it's not a free "turn on for #x performance!" switch; it varies.

          Cheers,
          Mike
          Last edited by mroche; 16 March 2023, 08:03 PM.



          • #15
            Originally posted by coder
            The original Renderman spec claimed to have a way for shaders to spawn rays, but it was unimplemented. I don't know when Pixar switched over, but at least their movies throughout the 1990's all used rasterization.


            I think the bulk of the film industry only switched over around 2010, but I don't have that on terribly good authority.
            A Bug's Life was the first RenderMan movie with RT, but only for minor lighting. There's an excellent history paper over on Pixar's site detailing why they adopted RT later. My comment was, I guess, disjointed; all I meant was that the technique is as old as computer graphics. Modern Pixar movies are path traced. POV-Ray, for instance, has been out for almost 32 years.



            • #16
              Originally posted by JosiahBradley
              A Bug's Life was the first RenderMan movie with RT, but only for minor lighting. There's an excellent history paper over on Pixar's site detailing why they adopted RT later. My comment was, I guess, disjointed; all I meant was that the technique is as old as computer graphics. Modern Pixar movies are path traced.
              Thanks for the follow-up!

              Originally posted by JosiahBradley
              POV-Ray, for instance, has been out for almost 32 years.
              Heh, I tried POV-ray back in its first few years of existence, when I saw it on a BBS. I modeled a couple scenes on graph paper and then typed in the scene file by hand. I'd let it render overnight or while I was at school.

              I didn't believe a home PC could be powerful enough to do ray tracing. Imagine my surprise when I put a light source in front of the camera and didn't see it! Or when I didn't get caustics from a translucent sphere! That's when I started to piece together what it was really doing.
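              A minimal backward ("eye") ray tracer makes that behavior concrete: rays start at the camera, and a point light is only an input to shading, with no geometry of its own, so a light placed in front of the camera never appears in the image, and the light paths that would produce caustics are never followed. This is an illustrative sketch, not code from POV-Ray or the thread; the scene, names, and values are all assumptions.

```python
# Minimal backward ("eye") ray tracer: one sphere, one point light,
# Lambertian shading only. The light is never intersected by any ray,
# which is why a classic Whitted-style tracer doesn't show light
# sources or produce caustics. All scene values are illustrative.
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance along a unit-length ray, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * c          # direction is unit-length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-6 else None

def shade(hit, center, light_pos):
    """Lambert term: the light contributes only through this formula."""
    normal = [h - c for h, c in zip(hit, center)]
    n_len = math.sqrt(sum(v * v for v in normal))
    normal = [v / n_len for v in normal]
    to_light = [l - h for l, h in zip(light_pos, hit)]
    l_len = math.sqrt(sum(v * v for v in to_light))
    to_light = [v / l_len for v in to_light]
    return max(0.0, sum(n, l) if False else sum(n * l for n, l in zip(normal, to_light)))

def render(width, height):
    center, radius = (0.0, 0.0, -3.0), 1.0
    light_pos = (0.0, 0.0, -1.0)   # sits right in front of the camera...
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Camera at the origin, rays through an image plane at z = -1.
            dx = 2 * (x + 0.5) / width - 1
            dy = 1 - 2 * (y + 0.5) / height
            d_len = math.sqrt(dx * dx + dy * dy + 1)
            direction = (dx / d_len, dy / d_len, -1 / d_len)
            t = intersect_sphere((0, 0, 0), direction, center, radius)
            if t is None:
                row.append(0.0)    # background: the light itself never renders
            else:
                hit = tuple(t * v for v in direction)
                row.append(shade(hit, center, light_pos))
        image.append(row)
    return image
```

              Nothing here ever tests a ray against the light, so the light can only brighten surfaces it shines on; seeing the source itself, or focused caustics, requires tracing paths from the light as well, which is what full path tracing adds.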

              Fast forward to my first job, where we actually implemented the first example of realtime raytracing I'd ever seen. It used a board with 8 DSPs, plugged into a workstation with 4 CPUs, and I used all of them! It managed maybe 5 fps at something like 160x120 resolution, on a simple scene. Still, it was enough to impress. It was just a tech demo, however.

