Considering a new GPU soon. How's the 7700 series on Linux?


  • #61
    Originally posted by Paul Frederick
    What tans my hide is AMD has said some things, then done others. I'd be more on their side if either they didn't promise the Moon, yet fail to deliver, or just were more consistent. I'll admit Nvidia's policy isn't ideal, but at least they're not all over the map. So I have to respect Nvidia more. Nvidia just seems more honorable to me than AMD. They're not lying to me, or insulting my intelligence by saying one thing, then doing another.

    Furthermore it is my understanding that in order to acquire all of the technology that Nvidia uses they have had to enter agreements with other entities themselves. Elements of these contracts constrain Nvidia from disclosing proprietary information they use. What is Nvidia supposed to do? They have 3 choices here, either don't use the best tech, throw their partners under the bus to make the FOSS zealots happy, or do what they're doing. I imagine they're doing the best they can.

    I hate to break it to people but we don't live in a perfect world where Unicorns run around crapping out rainbows. Those who get along the best compromise when they need to. The rest cut off their own noses to spite their faces. That is too ugly for me.

    If AMD is the good guys then fit me for an Imperial Stormtrooper suit of armor. Because I'm not comfortable living the life of a rag tag rebel if it means I have to sacrifice. Ethics don't increase frames per second for me.
    Actually, I'm Luke Skywalker flying a captured X-wing fighter.
    AMD did something very good with the open-source move; the problem was, and is, how they did it. A unicorn without its horn is a weak unicorn.



    • #62
      I know where my Cyrix chip is

      Originally posted by crazycheese
      Incorrect. Please check who Cyrix were and what happened to them.
      So I obviously know who they were, and what happened to them too. The chip was fast, but it identified itself as a 486, so that hampered its performance with 586 applications.



      • #63
        Originally posted by Paul Frederick
        So I obviously know who they were, and what happened to them too. The chip was fast, but it identified itself as a 486, so that hampered its performance with 586 applications.
        I didn't ask what Cyrix processors were, but who Cyrix themselves were. They were an elite company, much better than Intel, that designed processors, some of which were more capable than Intel's, all with such a small team. Yes, they essentially failed at marketing, yet Intel was not as omnipotent as you suggested.

        Also, if you track back through the history around the K7, it was essentially better than both the Pentium 3 and 4 (at launch, and for several years).

        So Intel is not the only one capable of producing good solutions.



        • #64
          The K6(-2/3) were very bad compared to the Pentium 2/3, but the Athlon and Athlon XP were good CPUs at that time. The problems for AMD began with the Intel Core 2, and especially with the Core 2 Quad. Intel just put two duals together to create one quad core; AMD needed too long to develop a "real" quad. The Phenom was one year too late and had one stupid TLB error; the Phenom II fixed that, but then came the Intel i series with an additional performance boost. You can forget the new FX series: the speed per core even degraded, because compared to six real cores you get a curious 8/4 core combo. Now AMD wants to sell CPU+GPU combinations, but Intel does the same. AMD has the "better" GPU part when you look at the Windows drivers for the mainstream, but I don't think the target user group really wants to play games which would barely run with Intel graphics and run 25% faster with AMD graphics. All that matters is the price in that market segment; raw power is not important for office/internet use.



          • #65
            Originally posted by Paul Frederick
            What tans my hide is AMD has said some things, then done others. I'd be more on their side if either they didn't promise the Moon, yet fail to deliver, or just were more consistent. I'll admit Nvidia's policy isn't ideal, but at least they're not all over the map. So I have to respect Nvidia more. Nvidia just seems more honorable to me than AMD. They're not lying to me, or insulting my intelligence by saying one thing, then doing another.
            Hi Paul;

            Do you have any specific examples of this on the graphics side? I've heard the same statement multiple times over the years but have never been able to pin down an actual example. There are a lot of cases where random people on the internet have said things which we haven't done, but that's obviously not the same.

            Originally posted by Paul Frederick
            Furthermore it is my understanding that in order to acquire all of the technology that Nvidia uses they have had to enter agreements with other entities themselves. Elements of these contracts constrain Nvidia from disclosing proprietary information they use. What is Nvidia supposed to do? They have 3 choices here, either don't use the best tech, throw their partners under the bus to make the FOSS zealots happy, or do what they're doing. I imagine they're doing the best they can.
            We have all the same issues. There's a 4th solution -- spend the time and effort to cut very carefully between what you can release and what you can't release, try to negotiate agreements to release the remaining bits, and/or find other technical solutions then release what you can. It's a lot of work, it was a tough sell internally, and it goes more slowly than anyone would like, but it works.



            • #66
              The field is wide open

              Originally posted by crazycheese
              I didn't ask what Cyrix processors were, but who Cyrix themselves were. They were an elite company, much better than Intel, that designed processors, some of which were more capable than Intel's, all with such a small team. Yes, they essentially failed at marketing, yet Intel was not as omnipotent as you suggested.

              Also, if you track back through the history around the K7, it was essentially better than both the Pentium 3 and 4 (at launch, and for several years).

              So Intel is not the only one capable of producing good solutions.
              Perhaps what you say is true. If it is then players should step up to the plate and give it their best shots. Right now the undeniable heavy hitter is Intel, and fans are not finding it difficult to cheer for their performance. I never suggested, or mentioned that Intel was omnipotent, that simply is not a word I use. I will say this though, Intel invented microprocessors. That is something no one will ever be able to take away from them.

              So it is fair for me to say that Intel was the first, and is still the best. Maybe Intel is omnipotent after all?

              By the by, my Cyrix chip ran like a hot plate, and it could not execute x586 code, although Cyrix did in fact bill it as an x586 CPU. Something I found to be rather deceptive while I ran the hardware. Needless to say I have regretted my experience with Cyrix, but I will mention it none the less, for the benefit of the less astute who may read this.

              But I'm sure you will claim that all of Cyrix's shortcomings were Intel's fault. That seems to be a common theme around here. Is there a South Park episode being written here? Blame Intel! What I'd really like to see is responsibility placed where it belongs.



              • #67
                @bridgman

                Basically it should be clear that your statements about the speed of the OSS drivers for legacy hardware are far away from fglrx reality, even for the "older" legacy devices. The power management is very bad as well, and therefore many laptop users have heat problems with the OSS drivers. Btw, the Intel OSS drivers provide video hardware acceleration and AMD's do not.



                • #68
                  Yep, power management needs more work, although a lot of that work can be done without anything more from AMD, and we said at the start not to assume open source video decode acceleration support (although we are working hard to see if we can release it anyways).

                  Not sure I agree with your statement about performance of open drivers being "far away" from catalyst on legacy hardware but obviously that depends on what you call "far away". Michael's last benchmarks suggested a range from 50-130% (on an X1950 IIRC), averaging maybe 70-80%, which I would not call "far away" but others might.

                  It probably is fair to say that we underestimated how complex power management was going to become a couple of years after launching the open source effort, and if I had had access to a time machine (or even enough time to dig through every internal development program people were considering in 2007) I probably would have included some disclaimers about power management alongside the UVD statement and definitely would have started looking at power management sooner.

                  That said, none of the above has anything to do with AMD making promises we did not keep.
                  Last edited by bridgman; 07-01-2012, 12:19 PM.



                  • #69
                    And do you know that the HD 4000 series is legacy now as well?



                    • #70
                      OK, guess we'd better get definitions straight. When you say "legacy" do you mean "hardware that isn't getting any more fglrx updates" (5xx and earlier, rs6xx/rs740 and earlier) or "hardware which has been moved to a separate branch and update schedule" (6xx/7xx/rs7xx/rs8xx)? We have used the term both ways at various times over the years, which doesn't help things, but if you're talking about dependence on the open drivers then I assume you mean the first definition.
                      Last edited by bridgman; 07-01-2012, 01:00 PM.



                      • #71
                        As I see it

                        Originally posted by Kano
                        The K6(-2/3) were very bad compared to the Pentium 2/3, but the Athlon and Athlon XP were good CPUs at that time. The problems for AMD began with the Intel Core 2, and especially with the Core 2 Quad. Intel just put two duals together to create one quad core; AMD needed too long to develop a "real" quad. The Phenom was one year too late and had one stupid TLB error; the Phenom II fixed that, but then came the Intel i series with an additional performance boost. You can forget the new FX series: the speed per core even degraded, because compared to six real cores you get a curious 8/4 core combo. Now AMD wants to sell CPU+GPU combinations, but Intel does the same. AMD has the "better" GPU part when you look at the Windows drivers for the mainstream, but I don't think the target user group really wants to play games which would barely run with Intel graphics and run 25% faster with AMD graphics. All that matters is the price in that market segment; raw power is not important for office/internet use.
                        Right now I'm getting a feeling we may all be rooting for our favorite dinosaur. The days of the big CPU may be drawing to a close, or perhaps the days of the small CPU being big enough might just be dawning. Keep your eye on ARM. It could be the wave of the future none here are seeing yet.



                        • #72
                          @bridgman

                          The legacy definition is the same as the AMD one: every chip which is not supported by a current binary driver. Every other definition would be useless.



                          • #73
                            Does your definition of "current" still mean "within the last month"? That doesn't really hold any more now that we've changed release scheduling.

                            I think it's fair to complain about deficiencies in the open source drivers for 5xx/rs6xx/rs740 and earlier since there are no plans to provide fglrx updates for those parts, but lumping 6xx/7xx in just confuses things IMO.
                            Last edited by bridgman; 07-01-2012, 01:46 PM.



                            • #74
                              There is no other driver out there with support for those chips, did you see them? I do NOT count announcements, I only count REAL drivers. For the newer chips there is also no driver with X server 1.12 support on Debian 64-bit, and no support for kernels > 3.3, while 3.3 is already marked as EOL on kernel.org...



                              • #75
                                Ahh, OK, so you're just saying "between driver releases I assume that you're lying and that the driver I have is the last one I will ever see". I can accept that.

                                It's also true that less frequent driver releases will make it more difficult to use the newest Linux distros with older hardware, but I think you will find things work out better than your worst-case assumptions.
                                Last edited by bridgman; 07-01-2012, 01:57 PM.

