Intel Just Released A Crazy Fast Acceleration Architecture


  • #31
    Originally posted by bridgman View Post
    - implement support for all the performance-related features - tiling, HyperZ, pageflipping, etc. - and bug-fix to the point where they can be enabled by default, so the hardware will be running at full speed at least at a micro-level.
    Do you have an estimate for when these features will be enabled by default?



    • #32
      Bridgman: this is off-topic and certainly a bit naive to ask you, but I'll give it a try nevertheless.

      First off, I'm really happy about the decision to release documentation for major parts of the ATI/AMD GPU chipsets, and I really liked how things evolved. Then again, while one might not care much about the motivation now that "it" has happened, I've asked myself several times whether this "going open source" was solely related to the new lines of CPUs with integrated GPU parts (the out-of-the-box experience) and would not have happened otherwise.



      • #33
        Originally posted by bridgman View Post
        Stupid 30 minute edit limit
        Hahahahahahahahahaha!!!

        Sorry for the spam, I just had to!



        • #34
          Originally posted by tball View Post
          Do you have an estimate for when these features will be enabled by default?
          These features are all already enabled, if I remember my Phoronix stories correctly.



          • #35
            They are not, at least not generally. On my RV710, tiling is NOT enabled by default, and HyperZ is not even implemented: http://www.x.org/wiki/RadeonFeature
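
            For anyone who would rather experiment than wait, here is a minimal sketch of the relevant xorg.conf knobs for the radeon DDX of this era. Treat it as illustrative rather than definitive: which options are honored varies by chip generation and driver version, and HyperZ has no switch here since it isn't implemented yet.

            Section "Device"
                Identifier "Radeon"
                Driver     "radeon"
                Option     "ColorTiling"    "on"   # surface tiling
                Option     "EnablePageFlip" "on"   # pageflipping
            EndSection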



            • #36
              Originally posted by bbordwell View Post
              These features are all already enabled, if I remember my Phoronix stories correctly.
              AFAIK they are enabled for r3xx-5xx but not for r6xx-NI, although the initial code is written and has gone through some testing, particularly on 6xx/7xx.
              Last edited by bridgman; 05 June 2011, 07:26 PM.



              • #37
                Originally posted by entropy View Post
                Bridgman: this is off-topic and certainly a bit naive to ask you, but I'll give it a try nevertheless.

                First off, I'm really happy about the decision to release documentation for major parts of the ATI/AMD GPU chipsets, and I really liked how things evolved. Then again, while one might not care much about the motivation now that "it" has happened, I've asked myself several times whether this "going open source" was solely related to the new lines of CPUs with integrated GPU parts (the out-of-the-box experience) and would not have happened otherwise.
                It's off-topic even for an AMD thread, but what the heck...

                Definitely not "solely related" but support for Fusion parts was a consideration even back in 2007. Maybe 30-40% of the motivation, something like that...



                • #38
                  Does anyone know if any of these enhancements can be ported to the open source AMD drivers or Gallium3D in general if Intel ever decides to join the rest of us?



                  • #39
                    Originally posted by Prescience500 View Post
                    Does anyone know if any of these enhancements can be ported to the open source AMD drivers or Gallium3D in general if Intel ever decides to join the rest of us?
                    As was previously noted, the main change in this patch set is using the 3D engine for everything "2D"-related (let's call it X rendering related, since the X RENDER stuff never really worked on traditional 2D engines) rather than using a mix of the 2D and 3D engines and dealing with the latency involved in synchronizing between them. Since R6xx the open source drivers for AMD hardware already do this, as we have no 2D engine any more.

                    This particular patch isn't directly related to the 3D drivers, but the same synchronization issues are still relevant there. None of the radeon 3D drivers (neither Gallium nor classic) for hardware that has a 2D engine (r1xx-r5xx) use the 2D engine, so there are no synchronization issues in that respect.

                    This brings up a good point in general. Hardware often has features that it doesn't always make sense to use, as we see in the case of this patch. On the surface, having multiple asynchronous hardware engines may seem like a useful feature, but the overhead of the synchronization that comes with sharing buffers between the engines is often not worth the extra functionality. That's not to say multiple engines don't have their uses, but just because you can use them doesn't always mean you should.
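
                    To make the trade-off concrete, here is a toy model of the synchronization cost. Every name in it is made up for illustration; real drivers do this with fences between command rings, but the bookkeeping below is purely schematic:

                    #include <stdio.h>

                    /* one command ring per engine; fences are just sequence numbers */
                    struct ring { const char *name; unsigned last_submitted, last_retired; };

                    static unsigned emit_fence(struct ring *r) { return ++r->last_submitted; }

                    static int stalls;
                    static void wait_fence(struct ring *r, unsigned seq)
                    {
                        if (r->last_retired < seq) {   /* consumer idles while the */
                            stalls++;                  /* producer's ring drains   */
                            r->last_retired = seq;
                        }
                    }

                    int main(void)
                    {
                        struct ring eng2d = { "2D", 0, 0 }, eng3d = { "3D", 0, 0 };

                        /* mixed path: blit on the 2D engine, then texture from the
                         * result on the 3D engine -- one drain per buffer hand-off */
                        for (int i = 0; i < 4; i++)
                            wait_fence(&eng2d, emit_fence(&eng2d));
                        printf("mixed 2D/3D: %d stalls\n", stalls);

                        /* single-engine path: both operations are simply ordered
                         * within the 3D ring, so no cross-engine fence is needed */
                        stalls = 0;
                        for (int i = 0; i < 4; i++)
                            emit_fence(&eng3d);
                        printf("3D only:     %d stalls\n", stalls);
                        return 0;
                    }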



                    • #40
                      On a slightly unrelated topic... when is it planned to just retire the hardware-specific DDXs and use a generic interface exposed via Mesa, e.g. Gallium?

                      DDXs are the one part of the Linux graphics stack I've never even slightly looked into. I'm assuming they do some direct hardware programming through the DRI2 interfaces rather than passing through the Mesa/Gallium code, i.e. the DDX issues commands like "draw opaque rectangle here" to the DRI2 interfaces rather than going through some Gallium interface to draw primitives (wrapping OpenGL is probably too much overhead to justify, but surely a 2D acceleration state tracker in Gallium would not be). I'm aware that this is basically what a nested X.org server over Wayland would do, but why isn't X.org doing it that way internally already? Legacy support or something?

                      (This is a more AMD-specific topic, as I know Intel doesn't use Gallium yet.)
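
                      Just to make the "draw opaque rectangle here" part concrete, here is a rough sketch of the hook a DDX implements for that today. The hook names and signatures are from the EXA driver interface; the emit_*/flush_* helpers are hypothetical stand-ins for a driver's command-stream code, so this won't build on its own:

                      #include "exa.h"   /* EXA driver interface from the X server SDK */

                      /* hypothetical command-stream helpers, not a real driver API */
                      extern Bool emit_solid_state(PixmapPtr pix, int alu, Pixel planemask, Pixel fg);
                      extern void emit_rect(PixmapPtr pix, int x1, int y1, int x2, int y2);
                      extern void flush_command_stream(void);

                      /* EXA calls these when X wants a solid fill accelerated */
                      static Bool DriverPrepareSolid(PixmapPtr pix, int alu,
                                                     Pixel planemask, Pixel fg)
                      {
                          /* translate the X request into hardware state; returning
                           * FALSE makes EXA fall back to software rendering */
                          return emit_solid_state(pix, alu, planemask, fg);
                      }

                      static void DriverSolid(PixmapPtr pix, int x1, int y1, int x2, int y2)
                      {
                          emit_rect(pix, x1, y1, x2, y2);   /* queue one rectangle */
                      }

                      static void DriverDoneSolid(PixmapPtr pix)
                      {
                          flush_command_stream();           /* submit to the kernel */
                      }

                      As far as I know, a Gallium-based Xorg state tracker along the lines I'm asking about does exist in Mesa (st/xorg, used mainly by vmware's driver); it would express the same fill through the generic pipe_context interface instead of per-chip hooks like these, it just hasn't displaced the hand-written DDXs.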

