Radeon Driver Enables Full 2D Acceleration For HD 7000


  • #16
    Originally posted by crazycheese View Post
    If AMD really understood the situation, they would punch a million dollar development into open driver. But they did not.
    Do you mean a million on top of the >1.5 mil we've already spent ?

    Originally posted by crazycheese View Post
    This is what I HATE about AMD. They were dumb enough to ignore GPU opensource and they are now dumb enough to improve catalyst instead of opensource.
    We have had developers working full time on the open source stack since late 2007... perhaps your definition of "ignore" is different from mine ? What makes you think that we are improving catalyst instead of opensource ?
    Last edited by bridgman; 12-27-2012, 05:00 PM.



    • #17
      It's not a feature comparison. I mean, do you really believe that open-source drivers are somehow bad because of their openness? They are bad because more man-hours go into the other driver; whether it's open source or not doesn't matter, at least not much...

      That said, I don't understand your motivation for promoting nvidia here for its "better" driver. I can only think of two or three reasons: 1. you are an nvidia fanboy and want everybody to love nvidia; 2. you think the whole world should use the proprietary development model because it's the better model, since if you put more money into it you get better results in some areas; 3. you get paid for it...


      I mean, it's a different thing if you promote open or free drivers because you believe in something... you can't believe that closed source is better for the world, so why would you promote it? Do you think that if we flame against nvidia here they will stop supporting Linux? They don't release these drivers because they are good neighbors or a company that cares about its consumer Linux customers. They see money in it, and they will keep doing it as long as they earn money from it, even if every Linux installation showed a popup flaming nvidia as the worst company ever...

      If you don't care about the openness of drivers, it's pointless to compare their features... and it goes even deeper: you wouldn't even care if nvidia shipped you its own complete X server and its own complete kernel, as long as it worked well for you... you don't seem to care about freedom at all...

      So get it: freedom is a feature in itself for most people using such software (at least those who know what they are using and what the GPL stands for), so it's pointless to discuss further. Of course, if you care about fps, the closed-source nvidia driver is better than the open-source radeon driver, but nobody questions that...
      Last edited by blackiwid; 12-27-2012, 04:49 PM.



      • #18
        Originally posted by bridgman View Post
        We have had developers working full time on the open source stack since late 2007... perhaps your definition of "ignore" is different from mine ? What makes you think that we are improving catalyst instead of opensource ?
        Does that mean that AMD does not improve Catalyst? That is not likely, although I suppose that the shared code with Windows does help a bit there.

        Like I mentioned, it would be ideal if there were only one driver, with proprietary extensions if necessary. That would reduce the effort needed to maintain both. But given the size of fglrx, that's probably not realistic, as it would require a lot of restructuring and code review...



        • #19
          Originally posted by bridgman View Post
          Do you mean a million on top of the >1.5 mil we've already spent ?
          We have had developers working full time on the open source stack since late 2007... perhaps your definition of "ignore" is different from mine ? What makes you think that we are improving catalyst instead of opensource ?
          Please, for the love of deities, don't feed the trolls...

          On another note, something I never quite understood is how you can provide separate 2D acceleration
          when the OpenGL model is completely 3D oriented (i.e., you have to emulate 2D with orthographic projections etc.).
          Could someone explain please? =O



          • #20
            Originally posted by phoronix View Post
            Phoronix: Radeon Driver Enables Full 2D Acceleration For HD 7000

            A commit to the xf86-video-ati driver this morning by AMD's Michel Dänzer says it enables full 2D acceleration for the Radeon HD 7000 "Southern Islands" GPUs...

            http://www.phoronix.com/vr.php?view=MTI2MjY
            Speaking of Glamor, I'd like to see some power consumption numbers comparing the various 2D schemes. In particular, I'd like to see a W/frame value for each scheme. I'd suspect the intel driver blows everything else away.
            Hmm, has anyone else wondered why the intel driver is so much better than the amd driver? The intel team isn't that big (perhaps bigger than the amd team, but not bigger than the amd + Red Hat team + community contributors). Sure, the intel GPU is simpler, but it performs better than low-end discrete GPUs from either company, and they keep improving it pretty significantly. I wonder if the architectural path Intel has chosen is the smarter one, or if they are going to need major changes to reach higher performance levels...



            • #21
              Originally posted by Ancurio View Post
              Please, for the love of deities, don't feed the trolls...
              Oh, I am sorry!


              @bridgman
              That's good to know... but you should also understand that this is still inefficient. The plain, direct scenario from two years ago (a Linux user wants to buy a video card and is still confronted with more trouble when going the AMD route) is still the case today. Fix power management, improve 3D performance by 25%, take our money if you need to. How much do you get from each Windows installation, that you polish it so fanatically? If, say, every Linux user paid you out of their own pocket to reach nearly the same performance and feature set you offer *for free* under Windows, would you think about managing your open driver in a more efficient way? Just start a poll... anywhere... even on the Ubuntu forums, as an official AMD representative. See your potential buyers' reaction. Intel recently started investing in its open-source driver, and voila, they punched through the market share of both the red and green teams like through butter. I bet the hardware sales covered the expenses, as they are continuing.



              • #22
                Originally posted by liam View Post
                Speaking of Glamour, I'd like to see some power consumption numbers that compare the various 2d schemes. Particularly, I'd like to see, a W/frame value for the schemes. I'd suspect the intel driver blows everything else away.
                Well, mostly I would think whatever hardware has the lowest idle power usage would probably win. So yeah, Intel is likely the leader here, especially since they don't have to send data across the bus and can keep everything directly in main memory. 2D workloads just aren't demanding enough to push the hardware much.
                Hmm, has anyone else wondered why the intel driver is so much better than the amd driver? The intel team isn't that big (perhaps bigger than the amd team, but not bigger than the amd + red hat team + community contributors).
                It's a lot bigger. I'm not sure why you don't think so, but it is. Also, a lot of the Red Hat and community contributors only work on it part time, while the Intel devs work full time on their hardware only.
                I wonder if the architectural path Intel has chosen is the smarter one, or if they are going to require major changes if they want to reach higher performance levels...
                Well, they don't have discrete cards at all. So yeah, that's a simpler architecture that will obviously be less powerful.



                • #23
                  Originally posted by Ancurio View Post
                  Please, for the love of deities, don't feed the trolls...

                  On another note, something I never quite understood is how you can provide separate 2D acceleration
                  when the OpenGL model is completely 3D oriented (i.e., you have to emulate 2D with orthographic projections etc.).
                  Could someone explain please? =O
                  It depends on what you mean by "2D". Classic 2D engines in old chips basically did three things:
                  1. draw solid filled rectangles
                  2. copy rectangles
                  3. draw lines

                  All of those are simple enough to perform on a 3D engine: 1. is just drawing a solid filled quad, 2. is just drawing a textured quad, and 3. is just drawing a line.

                  Where it gets complicated is where "2D" APIs start to diverge from what can be done on classic 2D hardware. Unfortunately, just emulating classic 2D engine functionality on the 3D engine does not perform well for modern "2D" APIs like RENDER. RENDER supports things like transforms (scaling, rotation, keystone, etc.) and alpha blending, which require a 3D engine to implement. Modern 3D hardware requires a compiler to properly generate the shaders needed for all the RENDER options, and before you know it, you end up needing a driver stack almost as complicated as the GL driver. Additionally, RENDER semantics were developed for software rendering, so in many cases they do not map easily to a 3D engine designed for other APIs (GL or DX).

                  So you can either build and maintain two separate driver stacks for RENDER and GL, or you can layer RENDER on top of GL. It's not perfect, but there is no hardware designed for RENDER. Applications should be using APIs the hardware is designed for, namely GL, and over time we've seen more and more applications move to GL. It's not worth the effort to maintain an ultra-tuned, device-specific RENDER stack if you have limited time and resources; it's never going to perform as well as GL.
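
                  The three classic operations can be emulated on a 3D engine largely through an orthographic projection that maps pixel coordinates straight to clip space. Here is a minimal, hypothetical sketch of that mapping (plain Python, not actual driver code; all names are made up for illustration):

```python
# Hypothetical illustration: how "2D" pixel coordinates are mapped into
# a 3D pipeline's clip space via an orthographic projection.

def ortho_2d(width, height):
    """4x4 row-major orthographic matrix mapping pixel coordinates
    (0,0)..(width,height) to clip space (-1..1), with Y flipped so the
    pixel origin stays at the top-left."""
    return [
        [2.0 / width, 0.0,           0.0, -1.0],
        [0.0,         -2.0 / height, 0.0,  1.0],
        [0.0,         0.0,           1.0,  0.0],
        [0.0,         0.0,           0.0,  1.0],
    ]

def project(m, x, y):
    """Transform a 2D point (implicitly z=0, w=1) by the matrix."""
    return (m[0][0] * x + m[0][1] * y + m[0][3],
            m[1][0] * x + m[1][1] * y + m[1][3])

def rect_to_quad(m, x, y, w, h):
    """A 'solid filled rectangle' becomes a quad (two triangles)."""
    return [project(m, px, py)
            for px, py in [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]]

m = ortho_2d(800, 600)
print(project(m, 0, 0))  # top-left pixel lands at clip-space corner (-1.0, 1.0)
print(rect_to_quad(m, 100, 100, 200, 150))
```

                  A solid fill is then just this quad drawn as two triangles; a copy is the same quad with a texture sampled from the source rectangle.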



                  • #24
                    Originally posted by crazycheese View Post
                    ...
                    You can still be a troll even if you use the oss drivers. For proof, look at your own posts.



                    • #25
                      Originally posted by liam View Post
                      Hmm, has anyone else wondered why the intel driver is so much better than the amd driver? The intel team isn't that big (perhaps bigger than the amd team, but not bigger than the amd + red hat team + community contributors).
                      It is definitely much bigger. Intel pays over 20 developers to work on graphics drivers. With AMD it's maybe 4 or 5, I don't remember. Intel's developers seem to be much more active as well.



                      • #26
                        Originally posted by smitty3268 View Post
                        You can still be a troll even if you use the oss drivers. For proof, look at your own posts.
                        Yes, they are becoming less and less demanding. Still... we are far from there yet. http://openbenchmarking.org/result/1...RA-1212216CR57
                        No, I don't regret selling it.



                        • #27
                          Originally posted by crazycheese View Post
                          + The Nvidia driver works, it is the best-supported proprietary driver, and it brought 3D to Linux.
                          - But it's closed source and conflicts with the GPL libre license, written ONLY for corporate customers with minor adaptations to fit a general audience; it lacks features compared to the Windows driver, it doesn't use the advantages of the Linux kernel, and it is a pain in the butt to integrate (and Linux is designed this way for a reason). And when nvidia says it's over, it's over; when nvidia says you don't get this functionality, you don't get it.
                          Basically, your quarrel is with the license. I wonder how often you actually have to deal with it that you cry rivers over it.
                          Besides Optimus, I couldn't name any missing features compared to Windows, and Optimus support is blown way out of proportion. One feature I particularly like is getting simultaneous driver releases for both Linux and Windows.



                          • #28
                            Originally posted by smitty3268 View Post
                            Well, mostly I would think whatever hardware has the lowest idle power usage would probably win. So yeah, Intel is likely the leader here. Especially since they don't have to send info across the wire and can have everything reside directly in main memory. 2D workloads just aren't difficult enough to push the hardware much.
                            Idle power wouldn't be an issue with benchmarks, which usually have a defined endpoint. Besides, I want the W/frame numbers.
                            As for 2D loads not being demanding enough, complicated vector images with filters applied will bring most PCs to their knees.
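
                            (As an aside, "W/frame" over a fixed run is really an energy-per-frame figure. A hedged sketch of how one might compute it from periodic power samples; all numbers below are made up for illustration:)

```python
# Hypothetical sketch: deriving an energy-per-frame figure from power
# samples taken during a benchmark with a defined endpoint.

def energy_per_frame(power_samples_w, sample_interval_s, frames_rendered):
    """Integrate sampled power (rectangle rule) to get total joules,
    then divide by the number of frames rendered."""
    total_joules = sum(power_samples_w) * sample_interval_s
    return total_joules / frames_rendered

# e.g. ten one-second samples averaging 30 W while rendering 1800 frames
samples = [28.0, 31.0, 30.0, 29.5, 30.5, 30.0, 29.0, 31.5, 30.0, 30.5]
print(energy_per_frame(samples, 1.0, 1800))  # roughly 0.167 J per frame
```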



                            Originally posted by smitty3268 View Post
                            It's a lot bigger. I'm not sure why you don't think so, but it is. Also, a lot of the Red Hat and community contributers only work on it part time, while the Intel devs are full time only working on their hardware.
                            Looking over the recent commit history for the xorg driver, Chris Wilson is the only name I see.
                            Intel employs 22 devs (not including release managers and QA), but they do tons of X work (I'd imagine they are, by far, the biggest X contributor).
                            For their 3D driver over the last six months, I count 12 contributors to the i915 driver, but not all of them are from Intel (Marek, for instance, unless he was hired recently).
                            For xorg radeon I count 7 devs over the last 5 months.
                            For mesa radeon I count 10 devs over the last two months.

                            Originally posted by smitty3268 View Post
                            Well, they don't have discrete cards at all. So yeah, that's a simpler architecture that will obviously have less power.
                            I don't understand your point here. AMD also makes non-discrete GPUs, but the architecture should be pretty similar to their discrete GPUs (obviously memory management would be quite different, though).



                            • #29
                              Originally posted by brent View Post
                              It is definitely much bigger. Intel pays over 20 developers to work on graphics drivers. With AMD it's maybe 4 or 5, I don't remember. Intel's developers seem to be much more active as well.
                              A number of those Intel engineers, like Keith Packard, seem to be oriented toward the whole stack rather than a particular driver. I gave a simple accounting of commits by author in my post to smitty above.
                              Of course, that says nothing about activity.



                              • #30
                                Originally posted by liam View Post
                                For their 3d driver over the last six months I count 12 contributors for the i915 driver but not all of the contributors are from intel (Marek, for instance, unless he was hired recently).
                                Heh, I don't even have an Intel IGP. I have recently made small changes in all gallium drivers while improving the gallium interface. I think I made about the same number of changes in each driver.

