
10.4/fglrx in Lucid w/ ATI HD 4xxx or 5xxx


  • #21
    You only get kernel modesetting with HD 5xxx cards now; there's definitely no power-save mode!
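
    For reference, kernel modesetting for radeon can be forced on at boot. A minimal sketch, assuming a Lucid-era GRUB 2 setup (the file location and your existing options may differ):

    ```
    # /etc/default/grub -- hypothetical example forcing KMS on for radeon
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.modeset=1"
    # then run: sudo update-grub
    ```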


    • #22
      But that's stupid, to run the cards with the OSS driver; why should they run at maximum clock speed all the time? Only complete idiots would propose that!


      • #23
        a) That's something for kernel .35+.
        b) If it's not enabled by default, it's buggy by definition, as it would be on otherwise.
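
        For context, the power-management controls that arrived in later kernels are exposed through sysfs when the radeon KMS driver is in use. A hedged sketch (the card index and available profiles vary by system and kernel version):

        ```
        # Assumes kernel 2.6.35+ with the radeon KMS driver; run as root.
        echo profile > /sys/class/drm/card0/device/power_method
        echo low     > /sys/class/drm/card0/device/power_profile
        ```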


        • #24
          Originally posted by Qaridarium
          If you use the multicore option in mplayer, there is no need for GPU video acceleration.

          For 3D workloads, the HD 5xxx is much better in performance per watt than a modern NVIDIA "Thermi/Fermi" card.

          So why would you want to buy 'bad'/'worse' hardware?

          A GTX 480 uses up to 100 W at idle!

          An HD 5870 idles at 28 W in the best case!

          NVIDIA just builds very bad hardware.
          If you'd care to peruse ,2585-15.html : the GTX 470 is what I'd be comparing with the 5870, and the power-use difference is not that great (about 12 to ~30 W); similarly with the GTX 480 vs. the 5890...

          Bottom line for me: after my experience with AMD "drivers", I'm going with nVidia on the desktop build, but I'll likely be waiting for Sandy Bridge now, as it's not that far off and comes along with a new socket, LGA2011. (Another good point about waiting: hardware supporting a major new API usually isn't so hot in its first iteration, and I expect that by the time Sandy Bridge, new motherboards, etc. are out, both nVidia and ATI will be on their second generation of DX11/OGL4-supporting GPUs. Since the latest cards are fairly expensive for decent specs, I'll definitely be going with nVidia unless they manage another FX 5XXX cluster ----.)

          nVidia & OSS: they don't need it. Their drivers usually work VERY well, and any major problems are usually quickly fixed or such has been my experience in the past.

          Ubuntu 10.4: I'll be upgrading, but this time I plan to do a fresh install, so I need to do some data backup first, and then I'll still probably wait until June to do it. (Let others guinea-pig it for a while; maybe with 10.10 I'll try the betas, IFF we don't need another early-release driver for it...)
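
          On the multicore mplayer option mentioned in the quote above: multithreaded software decoding can be enabled per user in mplayer's config file. A minimal sketch, assuming an mplayer build with multithreaded ffmpeg decoding; the thread count here is just an example value:

          ```
          # ~/.mplayer/config -- request 4 decoder threads
          lavdopts=threads=4
          ```

          Equivalently on the command line: mplayer -lavdopts threads=4 file.mkv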


          • #25
            Originally posted by Qaridarium
            45 W is bad hardware! NVIDIA just doesn't have any chip-making skill!

            Is a 320 W TDP for the GTX 480 a good chip? Most of the time it's only 5% faster than a 5870 with a 220 W TDP!

            100 W for only 5%... WOW, NVIDIA, you are the best ever!

            Buying NVIDIA is like killing trees, or maybe kids!
            Actually, considering that nVidia's GTX 4XX-series GPUs are MUCH larger than ATI's 5XXX GPUs, nVidia MUST have the better hardware if the increase in power usage is that slight, iff power usage is any criterion for judging a GPU's architecture.

            IMO they're fairly equivalent in hardware-architecture terms, but nVidia just beats the pants off AMD as far as drivers and extras go. Sure, Eyefinity's a nice gimmick, but one that I'm unlikely to ever use...


            • #26
              Originally posted by Panix View Post
              Sorry, one last part to the other post: I was wondering how temps are in Linux. My current card, as I said, a 7950 GT, gets about 47 to 50 degrees at idle, give or take. I think it's around the same in Windows; I haven't really compared lately. I don't know if this is typical for this card, but it seems acceptable.
              My 7600 GT would get up to 55C under FULL load.

              I'm wondering how an HD 4850 or HD 4770 would do for temps in Linux, and whether there's a big change depending on whether you use the FOSS driver or the FGLRX driver. So: 1) what are the temps? 2) what are the temps using a) the FOSS driver; b) the binary fglrx driver?
              My Mobility 4850 pegs at 80C under FULL load, and generally sits at 65 to low-70s C under most apps, which matches the desktop 4850 temps seen in various reviews very well.
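
              For anyone wanting to check these numbers themselves, both drivers expose a temperature readout, though through different interfaces. A hedged sketch (tool availability and sensor support vary by driver and kernel version):

              ```
              # Proprietary fglrx (assumes aticonfig is installed):
              $ aticonfig --odgt

              # Open-source radeon driver, via lm-sensors (needs a kernel
              # new enough to expose the on-die sensor through hwmon):
              $ sensors
              ```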


              • #27
                If something is not enabled by default, there are usually three possible explanations:

                - there are known problems which are sufficiently serious to keep it disabled for now

                - there are no known serious problems but the devs don't feel there is sufficient user/test coverage to enable by default yet

                - the feature is ready to go, but nobody got around to enabling it by default (the devs normally have WIP features enabled on their own systems anyway, so it's not always obvious when a default needs to change)


                • #28
                  Well, the result is the same for the normal user. By the time I could recommend the OSS drivers to somebody for ATI cards, those cards will already be legacy.


                  • #29
                    Sure, but one requires developer effort to resolve, one primarily needs user feedback, and the third one just needs people to ask "so, like, why isn't xxx enabled by default?".


                    • #30
                      Hopefully next time new cards go legacy, they'll work right away, not 1+ years after the last fglrx driver for them.