nVidia likely to remain accelerated video king?
Originally posted by mirv: Typically "firmware" in most circles refers to the burnt/flashed/whatever software on the component itself.
Open source firmwares do exist (e.g. coreboot), although I'll admit I'm not very familiar with any of them.
We normally use the term "microcode" when talking about horizontal microcode running on an on-chip hardware state machine (as opposed to vertical microcode running on an on-chip general purpose processor) but there aren't really any industry-wide terminology standards there either. Again, sometimes that microcode is burned into the device and sometimes it has to be loaded by the driver.
The only safe thing is to ask lots of questions when someone talks about "firmware" or "microcode" so you can understand the impact, e.g.:
- does it run on the main CPU or on the device ?
- how big is it, i.e. does it represent a big chunk of the driver stack ?
- is it running on a hardware state machine or on a general purpose processor ?
- etc...
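On Linux, one quick way to answer the "does it run on the device, and does the driver have to load it?" questions is to ask modinfo which firmware files a kernel module declares. A minimal sketch (the module names here are just examples; the output depends entirely on your kernel, and the function simply returns nothing if modinfo or the module is unavailable):

```python
import shutil
import subprocess

def firmware_files(module):
    """Return the firmware file paths a kernel module declares via
    'modinfo -F firmware', or an empty list if modinfo is missing,
    the module is unknown, or it declares no firmware dependency."""
    if shutil.which("modinfo") is None:
        return []
    result = subprocess.run(
        ["modinfo", "-F", "firmware", module],
        capture_output=True, text=True)
    if result.returncode != 0:
        return []
    return [line for line in result.stdout.splitlines() if line]

# Example module names only -- substitute the driver you care about.
for mod in ("radeon", "nouveau"):
    print(mod, firmware_files(mod)[:5])
```

A long list here tells you the driver ships blobs that get uploaded to the device at load time; an empty list means either no runtime-loaded firmware or a module your kernel doesn't know about.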
-
Whether you call it microcode or firmware, and whether it's permanently burnt to the device or loaded at runtime, it's interesting to watch some individuals try to twist and turn while arguing that on one hand it's OK that it be completely closed, and on the other hand you must run open drivers or not be worthy of running Linux.
It appears to me to be nothing but a thinly veiled attempt to de-legitimise the nVidia drivers, since they make their preferred vendor's hardware look worse than it otherwise might.
It seems that the AMD/ATI fanbois will have much to be happy about soon enough without needing to appear so desperate. The 10.6 release of fglrx is looking to be a very good thing and gives hope for their superior hardware to be backed with perhaps a very solid binary driver going forwards.
The question of hardware decode of video (or lack thereof) via fglrx may mean that my HTPC will need to stay nVidia but for other uses, ATI on Linux looks better and better with each fglrx release.
-
Originally posted by mugginz: It appears to me to be nothing but a thinly veiled attempt to try to de-legitimise the nVidia drivers as they make their own preferred hardware vendor look less than they might otherwise.
Originally posted by mugginz: The question of hardware decode of video (or lack thereof) via fglrx may mean that my HTPC will need to stay nVidia but for other uses, ATI on Linux looks better and better with each fglrx release.
Do -any- of you honestly think a company will come out and say "let's add this feature" when their userbase is getting defensive by saying how much they don't need it? No, they release fixes and features when their userbase screams at them. I -guarantee you- this is why ATI released specs to open source developers, and even ships a binary driver at all.
NVIDIA has a good chunk of the Linux market for a good reason- they listened to their userbase. They're still wary of releasing a fully open source driver, but they've provided key features users want and can use. But I'm not going to nod and say yes when I find out VDPAU can only decode one video stream at once per card, I'm going to complain about it, and push for a bug to be filed. By doing so, I know the software will mature. (VDPAU added support for multiple streams shortly after).
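The early single-stream limit behaved like a per-card pool of exactly one hardware decode session: a second player simply couldn't get a decoder until the first released it. A toy model of that behaviour (this is an illustration only, not the actual VDPAU API; the class and error message are made up):

```python
class DecoderPool:
    """Toy model of a fixed number of hardware decode sessions per card."""

    def __init__(self, max_streams=1):
        self.max_streams = max_streams
        self.active = 0

    def open_stream(self):
        # Fails the way an app sees it: no free session, decode refused.
        if self.active >= self.max_streams:
            raise RuntimeError("no free hardware decoder session")
        self.active += 1

    def close_stream(self):
        self.active = max(0, self.active - 1)

# Early VDPAU behaved like max_streams=1:
pool = DecoderPool(max_streams=1)
pool.open_stream()          # first player: fine
try:
    pool.open_stream()      # second concurrent player: rejected
except RuntimeError as e:
    print(e)                # prints "no free hardware decoder session"
pool.close_stream()
pool.open_stream()          # slot freed, a new stream works again
```

Once multiple-stream support landed, the same applications worked unchanged; the "pool" just got bigger, which is why filing the bug rather than shrugging was the right call.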
-
Originally posted by Rahux: Hey guys - in the next few months I'll be looking into a new PC and my main priority is being able to watch 1080p movies and do a bit of gaming (but video is more important).
I invested in a nice large monitor as my new place will not have a TV. As far as I can see, ATI cards are performing much better overall but nVidia is the only one offering accelerated video (both local and flash).
Is this likely to remain the case in the next 6 months? Also I hear that the current crop of nVidia cards are very noisy - would I be getting nice video playback at the expense of being able to hear the movies I watch? What's the outlook on Blu-ray in Linux too? Is the card likely to make a difference there?
I guess the question is really whether waiting will be worthwhile.
Microsoft Windows = ATi could be a better deal, generally speaking.
Linux = nVidia could be a better deal (less hassle; ATi is still a nightmare for most when it comes to proprietary drivers, and as for the open source drivers, their progress is really slow).
-
Originally posted by Vii7: Microsoft Windows = ATi could be a better deal, generally speaking.
Linux = nVidia could be a better deal (less hassle, ATi is still nightmare for most when it comes to Proprietary drivers, as for OS drivers - their progress is really slow).
I'm trying to build a low-cost computer for a relative who needs a new machine.
I have an old Nvidia card that I may move over to it, or I might jump in and get an ATI card. I want to know what all the fuss is about.
-
AMD fanboys ATTACK. No, but seriously, ATI cards have better performance for the price. The only reason to get an Nvidia right now is for GPU video decode. If you get an ATI card, upgrading to the latest driver might be a pain on Ubuntu, but it is for Nvidia as well, so that is a bit moot. If you are just looking for a cheap video card, you can get a Radeon 4350 for a great price (if you don't need the absolute latest-gen card). Stay away from the Nvidia GeForce 210; its performance is abysmal.
-
Originally posted by LinuxID10T: AMD fanboys ATTACK. No but seriously, ATI cards have better performance for the price. The only reason to get an Nvidia right now is for GPU video decode. If you get an ATI card, upgrading to the latest driver might be a pain on Ubuntu, but it is for Nvidia as well, so that is a bit moot. If you are just looking for a cheap video card, you can get a Radeon 4350 for a great price (if you don't care to have the absolute latest gen card.) Stay away from the Nvidia GeForce 210, its performance is abysmal.
I've installed Nvidia drivers on more than one distro, including both Debian and Fedora. Yes, a major PITA, but I accomplished it. I guess you could say I improved on each attempt! I figured it was going with the devil you know... but now, current ATI cards are better for price/performance and deliver lower power/heat than comparable Nvidia cards, if that is of importance... it is to me.
But the video decoding and hardware acceleration part bothers me. Call it a pet peeve. Nvidia still rules the roost in this area, even with older cards.
Perhaps it shouldn't be given such significance, but since I tend to value based on POTENTIAL, I might want that feature even if some other user doesn't care. I've read of some users who seem to hold it in importance and I agree with them. The card is capable of the feature, so it should get SOME support. It's a PITA to have to boot up Windoze to get those features. It's one thing to boot it up for games. That is a sacrifice one has to make if they want an optimized setting for games...
-
Originally posted by Panix: I've installed Nvidia drivers on more than one distro and this includes both Debian and Fedora. Yes, a major PITA but I accomplished it. I guess you could say I improved on each attempt! I figured it was going with the devil you know... but, now, current ATI cards are better for price/performance and deliver on lower power/heat than comparable Nvidia cards, if that is of an importance... it is to me.
But, the video decoding and hardware acceleration part bothers me. Call it a pet peeve. Nvidia still rules the roost in this area even with older cards.
Perhaps, it shouldn't be given such significance but since I tend to value based on POTENTIAL, I might want that feature even if some other user doesn't care. I've read of some users who seem to hold it in importance and I agree with them. The card is capable of the feature so it should get SOME support on it. It's a PITA to have to boot up Windoze to get those features. It's one thing to boot it up for games. That is a sacrifice one has to make if they want an optimized setting for games...
That's true - even Nvidia fans can't argue that part about the performance/price comparison.
Also think of it this way... Getting an Nvidia card for GPU decode is purpose-defeating due to the MASSIVE amount of heat and power they use. In a day of sub-$100 quad cores, GPU decode is becoming increasingly moot.