You only get modesetting with HD 5xxx cards now, and definitely no powersave mode!
10.4/fglrx in Lucid w/ ATI HD 4xxx or 5xxx
-
Originally posted by Qaridarium: if you use the multicore option in mplayer there is no need for video acceleration on the GPU.
on 3D workloads the HD 5xxx is much better in performance per watt than a modern nvidia "thermi"/Fermi card.
so why do you want to buy 'bad'/'worse' hardware?
a GTX 480 uses up to 100 watts at idle!
a HD 5870 in the best case 28 watts at idle!
nvidia just builds very bad hardware.
Bottom line for me is that after experiencing AMD "drivers" I'm going with nVidia on the desktop build, but I'll likely be waiting for Sandy Bridge now, since it's not that far off and comes along with a new socket, LGA2011. (Another good point about waiting: hardware support for a major new API usually isn't so hot in the first iteration of the hardware, and I expect that by the time Sandy Bridge, new motherboards, etc. are out, both nVidia and ATI will be on their second generation of DX11/OGL4-supporting GPUs. Since the latest cards are fairly expensive for decent specs, I'll definitely be going with nVidia unless they manage another FX 5xxx cluster----.)
nVidia & OSS: they don't need it. Their drivers usually work VERY well, and any major problems are usually quickly fixed, or such has been my experience in the past.
Ubuntu 10.4: I'll be upgrading, but this time I plan to do a fresh install, so I need to do some data backing up, and then I'll still probably wait until June to do it. (Let others guinea pig it for a while; maybe for 10.10 I'll try the betas, IFF we don't need another early-release driver for it...)
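For reference, the multicore decoding Qaridarium mentions above is just software decoding spread across CPU cores. Assuming an MPlayer build with multithreaded libavcodec decoding (ffmpeg-mt), a rough sketch looks like this (the filename is only a placeholder):

mplayer -lavdopts threads=4 somefile.mkv

threads=4 should roughly match the number of cores; whether that really removes the need for GPU acceleration depends on the codec, the clip and the CPU.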
-
Originally posted by Qaridarium: 45 watts is bad hardware! nvidia just does not have any chip-making skill!
is a 320 watt TDP for the GTX 480 a good chip? most of the time it's only 5% faster than a 5870 with a 220 watt TDP!
100 watts more for only 5%... WOW nvidia, you are the best ever!
buying nvidia is like killing trees or maybe kids!
IMO they're both fairly equivalent in hardware architecture terms; however, nVidia just beats the pants off of AMD as far as drivers and extras go. Sure, Eyefinity's a nice gimmick, but one that I'm unlikely to ever use...
-
Originally posted by Panix: Sorry, one last part to the other post: I was wondering about temps in Linux. My current card, as I said, a 7950 GT, gets about 47 to 50 degrees, give or take, at idle. I think it's around the same in Windows; I haven't really compared it lately. I don't know if this is typical for this card, but it seems acceptable.
I'm wondering how an HD 4850 or HD 4770 would do for temps in Linux, and whether there's a big difference depending on whether you use the FOSS driver or the fglrx driver. So: 1) what are the temps? 2) what are the temps using a) the FOSS driver; b) the binary fglrx driver?
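For anyone wanting to check this themselves, a rough sketch (the tools are the usual ones, but exact paths and sensor availability vary by card, kernel and driver, so treat it as an assumption rather than a guarantee): fglrx reports the core temperature through its own tool, while the open-source radeon driver exposes a sensor through lm-sensors/hwmon on kernels that support it.

aticonfig --odgt
sensors
cat /sys/class/drm/card0/device/hwmon/hwmon*/temp1_input

The first command is the fglrx/Catalyst OverDrive "get temperature" query; the last one is the raw hwmon reading (in millidegrees Celsius), if the kernel exposes one for the card.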
-
If something is not enabled by default there are usually three possible explanations:
- there are known problems which are sufficiently serious to keep it disabled for now
- there are no known serious problems, but the devs don't feel there is sufficient user/test coverage yet to enable it by default
- the feature is ready to go but nobody has gotten around to enabling it by default (the devs normally have the WIP features enabled on their own systems anyway, so it's not actually obvious to them when a default needs to change)