I think the "free & open" fanatics sometimes confuse a project with a product. NVidia offers products, not projects. They manage to make their products work within the confines of a kernel that basically only supports in-kernel drivers. It's quite a challenge, but they come through with a product that works.
I don't offer products or projects. I offer cash. And since AMD made the move to at least try to play nice, I have stopped giving it to NVidia and give it exclusively to AMD. Yes, I could use the blob, if I wanted to use a 3-year-old kernel along with older, crappy drivers for everything else. And if I never, ever want to update again. My hope is that some shareholders catch wind of the fuss and start asking questions about the support. What a great experience for people trying out Linux: learning to make the blob work. Nothing like it to drive people back to Windows. Besides, NVidia has already said they won't support Wayland. Since my distro of choice has had Wayland binaries for months now, I highly suspect NVidia isn't going to be a choice at all soon.
1) We don't give a rat's ass about "consistent experience". In fact, we DON'T WANT a consistent experience, we want a proper LINUX experience, not some bloated bluescreen experience of crap not working.
2) There is no USE in a "consistent experience". Different OS, ***DIFFERENT NEEDS***.
3) If you want to use the same code, forgetting all about (1) and (2), releasing the code does NOT contradict this!!!! All releasing the code does is provide some FREE DEBUGGING AND PROGRAMMING. FREE LABOR!!! RELEASE THE FUCKING CODE NVIDIA DOUCHEBAGS!!!
1) Linux end users benefit from same-day support for new GPUs, OpenGL version and extension parity between NVIDIA Windows and NVIDIA Linux support, and OpenGL performance parity between NVIDIA Windows and NVIDIA Linux.
So if the drivers were open they would stop giving same-day support and Windows parity?
2) We support a wide variety of GPUs on Linux, including our latest GeForce, Quadro, and Tesla-class GPUs, for both desktop and notebook platforms. Our drivers for these platforms are updated regularly, with seven updates released so far this year for Linux alone. The latest Linux drivers can be downloaded from www.nvidia.com/object/unix.html.
So if the drivers were open people would start ripping support out?
3) We are a very active participant in the ARM Linux kernel. For the latest 3.4 ARM kernel – the next-gen kernel to be used on future Linux, Android, and Chrome distributions – NVIDIA ranks second in terms of total lines changed and fourth in terms of number of changesets for all employers or organizations.
Sure, the rank of #2 might suffer if they let other people work on their drivers.
nVidia is full of crap. Let's take a look at all of the Android based hardware out there on the Tegra 2 that has been forgotten about and left behind because we have NO DRIVERS for their hardware. O2X, G2X, Viewsonic GTab, etc... All we want is open drivers that can be used to implement hardware acceleration but NOOOOOO... nVidia is too engrossed in being ass hats to give us that.
I agree... There are a ton of issues on the open source kernel front. A true-blue case in point is how poorly ARM support is maintained. The Viewsonic GTablet does have the open source parts of the device released, but key features are lost in the transition from Android 2.2 to 4.x. It'd be excellent if these features were fixed to further remove the dependence on the original maker, but I doubt it.
On a more positive note, at least nvidia does release all the open source components at a somewhat regular pace, and increasingly their vendors are following suit. I can actually think of a few large companies that release Linux devices but fail to provide the kernel source at all (for example, Pandigital and most Chinese tablets).
Linus criticized from a Linux kernel developer's point of view. He didn't say: "Fu**in' artifacts and frame drops when I'm pwning noobs in ETQW"
The PR, as usual, gave a canned response that is only marginally related to the main issue.
I totally agree. From an end user point of view ATI might be a hassle at first (when dealing with the binary-only driver), but that's because nvidia has had a stranglehold on end-user graphics/gaming in the Linux market for too long. I'm getting tired of the argument that nvidia is just plain better on Linux. It's only "better" because too many developers have flocked to that one vendor, and not enough have set up a proper test environment on anything else. Anyway, I will still suggest AMD over Nvidia because AMD has much better open source driver support (even if that comes mainly through multiple documentation releases), and in the longer term the AMD product will be better for it.
Their answer was meant for clueless users who are being fed the "religious fanatics of FOSS" crap by astroturfers and who probably only want to play some games under Wine on Ubuntu.
The fact that kernel developers have a really hard time dealing with Nvidia hardware is something they don't want addressed, so they bring out the PR candy about features and products and corporate vision and progress and freedom and universal human rights. So the casual user, who blames SuSE when his blob fucks up, keeps buying the closed hardware and blaming Linus and the kernel and X devs every time he has a problem.
Yet again, half of this is caused by the nvidia stranglehold on the Linux end-user market. The only fix is to get more developers with AMD/ATI hardware to fix these bugs. Anyway, I believe the casual user is about the same in any case: they will blame the wrong party, one that has nothing to do with the issue. I recall that some major games released on Windows in the last couple of years would fail terribly if run on a system powered by anything but Intel and Nvidia.
I just removed the last Nvidia card from my systems; now running ATI and Intel with Fedora 17 out-of-the-box drivers. I get flawless 1080p video, 3D desktop and adequate light gaming. When Nvidia considers updating their attitude, I'll reconsider my opinion and recommendations.
Welcome to the Dark Side. I'm fairly certain things are going to get much better shortly, considering all the excellent news I have seen recently on the AMD/ATI open source driver front. I've been using ATI/AMD for almost a decade, and generally speaking I have no problem upgrading every 5 to 6 years when I see new features I want. The open source drivers I see now are hundreds of times better than the nearly non-existent drivers of 5 years ago.
Basically, they gave us the middle finger. That's why I'll never buy a laptop with a discrete graphics card. Intel all the way. At least their drivers are open source.
I would suggest re-examining the problem. Discrete graphics powered by AMD are getting much better, and are advancing to the point where a mostly usable driver is available on all fronts. I would personally recommend the AMD Fusion series of laptops. The discrete cards found in these laptops allow for reasonable battery life (4 to 6 hours) when in use, and most of the time you can disable the dedicated card until multi-GPU support improves to the point that it is usable. The open source radeon drivers are advancing rapidly, and it's key to remember that they generally see a lot more work done by outside developers using documentation provided by AMD.
I don't get why we have switchable graphics in the first place.
Did somebody really think "Our graphics cards drain the battery too much on idle so we should also put in another graphics chip from another company that can display stuff while we idle so we can power off our card..."?
Why not try to build a modular graphics card where you could actually electrically turn off many parts so that the remaining parts still can provide basic 3d acceleration and render 2d stuff?
As stated before: power savings. Personally I see 4 to 6 hours of battery life on a laptop powered by a Radeon HD 6720G2 (a CrossFire pairing of Radeon HD 6520 + Radeon HD 6650M). While I currently do not have CrossFire support for this under Linux, it is nice to be able to change which graphics card I am using depending on the task. I would like to be able to switch between all three modes, but that is not an option yet. However, the two main modes available now are more than enough: when I want power savings I use the Radeon HD 6520, and when I want a bit more performance I use the Radeon HD 6650M. The latest xrandr work might also prove useful for both blobs.
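For what it's worth, the switching described above can be driven from the kernel's vgaswitcheroo debugfs interface on setups where it's supported. A minimal sketch, assuming debugfs is mounted at the usual path and the driver exposes the switch file (the IGD/DIS/OFF commands are from the kernel's vgaswitcheroo documentation; the `active_gpu` helper is just an illustration):

```shell
# Sketch: GPU switching via vgaswitcheroo (requires root and a kernel
# built with CONFIG_VGA_SWITCHEROO; path per kernel documentation).
SWITCH=/sys/kernel/debug/vgaswitcheroo/switch

# The switch file lists one card per line, e.g.
#   0:IGD:+:Pwr:0000:00:01.0
#   1:DIS: :DynOff:0000:01:00.0
# where field 2 is the card type (IGD = integrated, DIS = discrete)
# and a "+" in field 3 marks the currently active card.
active_gpu() {
    awk -F: '$3 == "+" { print $2 }' "$1"
}

# On a real system (as root):
#   active_gpu "$SWITCH"    # prints IGD or DIS
#   echo DIS > "$SWITCH"    # switch to the discrete card
#   echo OFF > "$SWITCH"    # power down whichever card is inactive
```

Note that a log-out/log-in (or X restart) is typically needed after switching, which is exactly the awkwardness the newer xrandr offload work is meant to remove.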
I wish that were true, but no. I'm an HD4890 user, and from my experience there are many games on Wine that just won't work properly on AMD hardware while NVIDIA users report that everything is just fine. Most of the issues I've had were with Unreal Engine 3 games, such as the Mass Effect titles and Unreal Tournament 3, which don't even start properly. Older games work fine, though.
That's because for quite a few years, Wine was written specifically against the Nvidia drivers, bugs and all. Nvidia do have a good GL driver for the most part, but what you're seeing is a side effect of the Wine developers only targeting the Nvidia blob for a long time. That may have changed in the last few years, but it was true for a long time.
On my laptop, the Nvidia drivers STILL DON'T work. Jockey doesn't see any Nvidia drivers, and installing them manually doesn't work either. Nvidia GT 520MX, Ubuntu 12.04 64-bit.
FUCK YOU NVIDIA!!
When Optimus first came out, jockey would still offer nvidia blob drivers, and it would prevent X from starting (jockey finally got fixed in Precise and won't offer nvidia drivers on systems with intel driver loaded). Of course, some n00bs still insist on installing nvidia blob manually, and are shocked when it doesn't work...
How is this a response to Linus' argument? He said "Nvidia sucks at open source" and they respond with "our closed source blob is good" (when it works, I'd add). The two matters are completely unrelated.
They're not unrelated, since Nvidia's argument is that their driver strategy is for the Linux driver to share 90% of its code with their Windows drivers, and that while this might not please kernel developers or open-source fans, it's the only reason they can provide Linux drivers at all.
Sure, you might not agree with that reasoning, but I don't see why people are claiming it doesn't address Linus's argument. Because it does, if "reduce effort by sharing driver code" and "play nicely with kernel" are exclusive choices.