Proof Of Concept: Open-Source Multi-GPU Rendering!

  • #16
    How about we get Linux to support rendering with multiple GPUs across multiple monitors, not just a single monitor? The multi-monitor state of Linux is junk. It's been broken in Ubuntu since Hardy, and the only way to get compositing across multiple monitors on multiple video cards is a hack for Xserver-XGL, which was itself retired a long time ago.

    Whatever happened to XRandR 1.3+ supporting this? :S

    Sigh...

    Unless something has changed in the latest distributions, but as of Ubuntu Karmic this is all still broken and not working. Something that takes only a minute to configure on Windows after you download the latest drivers.

    - D2G



    • #17
      Am I the only one who feels some kind of déjà vu?

      From what I could make out of the news entry, as worded by our good friend Michael, what Airlie is doing (which is fantastic, by the way) bears much resemblance to what 3Dfx did back in their day with the original Voodoo Graphics 3D processors. If you recall, they had a 3D engine that required an already-present 2D card; my understanding at the time was that they used the framebuffer of the 2D-only card to attach the 3D rendering of their Voodoo graphics. This new technique, as described, launched me 14 years into the past of consumer 3D graphics rendering.

      I find it kind of ironic that we are coming basically to the same point as it all started back when 3Dfx originally introduced their Voodoo graphics 3D processors!

      I'd love to see these hybrid modes supported right from the underlying infrastructure of Linux, X and the others.



      • #18
        Well, with the early 3dfx cards you had to daisy-chain the VGA cable from the 2D card to one accelerator, then (optionally) to a second accelerator, and the accelerators would switch from pass-through to their own output based on the timing of the signal. This, by the sound of it, uses the new standard memory management interface to ship data between chipsets internally.

        Not *too* different a concept, but a bit more flexible, perhaps.



        • #19
          Originally posted by Thetargos View Post
          I find it kind of ironic that we are coming basically to the same point as it all started back when 3Dfx originally introduced their Voodoo graphics 3D processors!
          Well, in my opinion 3dfx was ahead of its time. They pioneered some great technologies and made the fastest cards of their day.



          • #20
            Originally posted by DuSTman View Post
            Well, with the early 3dfx cards you had to daisy-chain the VGA cable from the 2D card to one accelerator, then (optionally) to a second accelerator, and the accelerators would switch from pass-through to their own output based on the timing of the signal. This, by the sound of it, uses the new standard memory management interface to ship data between chipsets internally.

            Not *too* different a concept, but a bit more flexible, perhaps.
            This is more like the Matrox M3D, I think.



            • #21
              Originally posted by DuSTman View Post
              Well, with the early 3dfx cards you had to daisy-chain the VGA cable from the 2D card to one accelerator, then (optionally) to a second accelerator, and the accelerators would switch from pass-through to their own output based on the timing of the signal. This, by the sound of it, uses the new standard memory management interface to ship data between chipsets internally.

              Not *too* different a concept, but a bit more flexible, perhaps.
              The interfaces are also different this time around: since the AGP era it has become easier to share data between the bus and main memory. PCI-E, with more bandwidth and more lanes (and even AGP itself), is much better suited to this than chaining the cards together, since the joining point is now the system bus itself. I do agree that this implementation is much more refined; I guess 14 years haven't gone by in vain.



              • #22
                Originally posted by bridgman View Post
                This is more like the Matrox M3D, I think.
                I thought of it too, but the original concept of "borrowing" another display adapter's framebuffer was 3dfx's (IIRC), hence why I thought of them. At any rate, this kind of evolution, if implemented, will be awesome!



                • #23
                  I thought 3dfx used a pass-through cable and just switched between the 2D card's video out and the video from its 3D-only framebuffer.

                  Mac graphics (Radius TV among others) were doing this long before 3dfx or Matrox, BTW.



                  • #24
                    Sorry, missed the edit window. By "this" I mean "transferring framebuffer data from one card into the framebuffer of another card for display".



                    • #25
                      Originally posted by d2globalinc View Post
                      How about we get Linux to support rendering with multiple GPUs across multiple monitors, not just a single monitor? The multi-monitor state of Linux is junk. It's been broken in Ubuntu since Hardy, and the only way to get compositing across multiple monitors on multiple video cards is a hack for Xserver-XGL, which was itself retired a long time ago.

                      Whatever happened to XRandR 1.3+ supporting this? :S

                      Sigh...

                      Unless something has changed in the latest distributions, but as of Ubuntu Karmic this is all still broken and not working. Something that takes only a minute to configure on Windows after you download the latest drivers.

                      - D2G
                      Hmm, I must have missed the patch you sent to fix it, or the money you gave Red Hat to make me care; otherwise you used the word "we" incorrectly.

                      Dave.



                      • #26
                        Originally posted by bridgman View Post
                        I thought 3dfx used a pass-through cable and just switched between the 2D card's video out and the video from its 3d-only framebuffer.

                        Mac graphics (Radius TV among others) were doing this long before 3dfx or Matrox, BTW.
                        You are correct, of course! I was thinking of 3D data, though.



                        • #27
                          Originally posted by d2globalinc View Post
                          How about we get Linux to support rendering with multiple GPUs across multiple monitors, not just a single monitor? The multi-monitor state of Linux is junk. It's been broken in Ubuntu since Hardy, and the only way to get compositing across multiple monitors on multiple video cards is a hack for Xserver-XGL, which was itself retired a long time ago.

                          Whatever happened to XRandR 1.3+ supporting this? :S

                          Sigh...

                          Unless something has changed in the latest distributions, but as of Ubuntu Karmic this is all still broken and not working. Something that takes only a minute to configure on Windows after you download the latest drivers.

                          - D2G
                          The proprietary drivers allow you to use multiple graphics cards with a single GL desktop using Xinerama, which gives you multi-screen. The OSS drivers don't do that, though.

                          ATI's new cards support more than two screens with RANDR as well.

                          Unfortunately, it isn't so much that the drivers are fundamentally broken as the userland. When I was playing with the six-display cards, I would have a single desktop using RANDR. Yes, Compiz would work, but when you rotated the cube or similar, the assumptions about what the different display information means (RANDR, the Xinerama extension, etc.) would get misinterpreted and make the experience less than ideal. In theory, you could switch off RANDR and the XINERAMA extension and find that Compiz would treat it as one big screen, but you would then lose the ability to dynamically resize screens with RANDR, or get poor window placement with Xinerama.

                          It's just a use case that the desktop environments don't really consider.
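
                          For reference, a minimal xorg.conf sketch of the multi-card Xinerama setup described above. The driver name, bus IDs, and identifiers are placeholders for illustration, not a tested configuration:

                          ```
                          # Two devices, one per physical card (BusIDs are placeholders;
                          # check yours with `lspci | grep VGA`).
                          Section "Device"
                              Identifier "Card0"
                              Driver     "fglrx"
                              BusID      "PCI:1:0:0"
                          EndSection

                          Section "Device"
                              Identifier "Card1"
                              Driver     "fglrx"
                              BusID      "PCI:2:0:0"
                          EndSection

                          Section "Screen"
                              Identifier "Screen0"
                              Device     "Card0"
                          EndSection

                          Section "Screen"
                              Identifier "Screen1"
                              Device     "Card1"
                          EndSection

                          Section "ServerLayout"
                              Identifier "Layout0"
                              Screen  0 "Screen0" 0 0
                              Screen  1 "Screen1" RightOf "Screen0"
                              # Stitch both screens into one logical desktop;
                              # note this disables per-screen RANDR resizing.
                              Option  "Xinerama" "on"
                          EndSection
                          ```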

                          Matthew

