NVIDIA Doesn't Expect To Have Linux 5.9 Driver Support For Another Month


  • Grinness
    replied
    Originally posted by birdie View Post

    Dear Grinness,

    I don't understand what you're trying to say here. That certain not exactly power-efficient GPUs work OK'ish in relatively simple games with VSync turned on? But what about more complex titles? Should people buy their GPUs, play carefully selected games, enable VSync or even decrease image quality to achieve good thermals? Sorry, very few people do that (btw I also do it for certain non-demanding games like King's Bounty). Most people, however, just install their video cards, enable maximum image quality and play. Lastly, if you're frugal, you can buy the GTX 1650 with a 75 W TDP or the RX 570, which has roughly the same performance at 167 W. The choice is yours.

    And lastly, a thing which people choose not to discuss here: with NVIDIA I can choose which drivers to use. With AMD your drivers are part of your kernel and Mesa. You cannot easily shuffle these packages around when regressions occur: in fact most people have no idea how to downgrade the Mesa package or roll back the kernel.
    Birdie,

    what I mean is very simple: TDP (as with CPUs) and (max FPS)/(max power) do not mean a thing if you do not contextualise them (e.g. low/medium/high-grade GPU, type of usage).
    In fact, what you call 'not exactly power efficient' is perfectly in line with expectations for the majority of cards of the same era/price/grade (the RX 480 is from 2016, on a 14 nm process node).
    The article you link in fact states the following, w.r.t. the GTX 1650 OC:

    "Another great alternative is the Radeon RX 5500 XT, which starts at $180 with a similar performance uplift and much quieter coolers, especially the Sapphire Pulse is a good choice here. If you are willing to buy used, then the Radeon RX 570 or RX 580 are definitely worth a look as they are being sold off at bargain prices because people are upgrading. Yes, they don't have the latest tech and draw more power, but they are still compatible with all new games, which would help you pass the time until next-gen becomes available."
    https://www.techpowerup.com/review/g...-gddr6/35.html

    With the huge list of cards in the image you link, it is like arguing which car is better between a city car (any type/any brand) and a sports car (any type/any brand) without knowing the use you're going to make of it (city driving, highway, Sunday racing, ...).

    Finally, with AMD you have three drivers to choose from: Mesa, AMD-PRO, AMD-Open -- no need to recompile anything (and people even complain about 'too many options').
    With NVIDIA you have one driver (unfortunately nouveau is not a real option, until NVIDIA ....)



  • birdie
    replied
    Originally posted by Grinness View Post
    Dear Birdie,
    Dear Grinness,

    I don't understand what you're trying to say here. That certain not exactly power-efficient GPUs work OK'ish in relatively simple games with VSync turned on? But what about more complex titles? Should people buy their GPUs, play carefully selected games, enable VSync or even decrease image quality to achieve good thermals? Sorry, very few people do that (btw I also do it for certain non-demanding games like King's Bounty). Most people, however, just install their video cards, enable maximum image quality and play. Lastly, if you're frugal, you can buy the GTX 1650 with a 75 W TDP or the RX 570, which has roughly the same performance at 167 W. The choice is yours.

    And lastly, a thing which people choose not to discuss here: with NVIDIA I can choose which drivers to use. With AMD your drivers are part of your kernel and Mesa. You cannot easily shuffle these packages around when regressions occur: in fact most people have no idea how to downgrade the Mesa package or roll back the kernel.



  • tildearrow
    replied
    Originally posted by cyni
    In what sense is it not ready? In many ways it is simpler to use than the competition.
    Did you semi-quote my name on purpose so I would not find your post?

    How is it "simpler" to use if it requires you to read a thousand manuals, use a terminal and even potentially learn programming just to do a simple task?!

    Originally posted by cyni
    Outside of specialized proprietary software made for professionals with no open-source equivalents, it seems to have covered all the important ground
    Video editing?



  • Grinness
    replied
    Originally posted by birdie View Post
    Much touted RX 500 series cards are horrible in terms of power consumption, thermals, noise and effectiveness.
    Dear Birdie,

    those graphs do not mean much -- I would say they are completely misleading.
    What they show is the top end of the power-consumption-versus-FPS scale; they do not represent power consumption in typical usage (or rather, power consumption for a given scenario/usage).
    Are you familiar with the power-vs-rpm plots used to characterise engines?
    A GPU is more or less like an engine: it can scale its power usage down or up depending on the requirement.
    The important thing is to understand that characteristic and where your usage lands on the power curve.

    To give you an example, I ran Dota 2 on an RX 480 using Vulkan (RADV+ACO), 1080p, full screen, logging the output of 'sensors' every second for 30 seconds while changing the MAX FPS setting.
    I just left the game running while watching a live event.
    The results are as follows
    MAX FPS    Average power (W)
    60         56
    90         60
    240*       90
    *: my card does not reach 240 FPS -- here it effectively means FPS uncapped

    In my use case, Dota 2 maxed out with VSync on does not consume more than 60 W on average ...
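
    For anyone who wants to reproduce this kind of measurement, here is a minimal Python sketch of the idea -- it is not the exact logging used above (that was just 'sensors' output): it reads the amdgpu hwmon power sensor directly, and the sysfs path and the power1_average file name are assumptions that may differ per kernel/GPU:

    # Minimal sketch: sample the GPU power draw once per second and average it.
    # Assumes an amdgpu card exposing hwmon's power1_average (in microwatts).
    import glob
    import os
    import time

    def find_amdgpu_power_file():
        # Each hwmon device exposes a 'name' file; amdgpu reports its draw in power1_average.
        for name_path in glob.glob("/sys/class/hwmon/hwmon*/name"):
            with open(name_path) as f:
                if f.read().strip() == "amdgpu":
                    return os.path.join(os.path.dirname(name_path), "power1_average")
        raise RuntimeError("no amdgpu hwmon power sensor found")

    def average_power(seconds=30):
        # Sample once per second for 'seconds' seconds; return the mean draw in watts.
        power_file = find_amdgpu_power_file()
        samples = []
        for _ in range(seconds):
            with open(power_file) as f:
                samples.append(int(f.read()) / 1_000_000)  # microwatts -> watts
            time.sleep(1)
        return sum(samples) / len(samples)

    if __name__ == "__main__":
        print(f"average power over 30 s: {average_power():.1f} W")

    Run it once for each MAX FPS setting while the game is on screen and compare the averages.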

    Now, the above is just an exercise, but it would be useful to understand how the different GPUs (across vendors) scale their power at LOCKED FPS.
    I would rather buy a GPU that gives me 60 FPS locked (MAX/HIGH/ULTRA) at less than 60 W than a GPU that gives me 300+ FPS at 300+ W.

    Unfortunately, I do not know which GPU is which, as I have yet to come across such a report on any of the various 'I give you the numbers to buy your new GPU' websites.

    TBH, I would not be surprised (given what is shown in the article on FPS per watt) if, in the end, all the GPUs (red vs green, open vs closed) turn out to be equal at mainstream settings (1080p, 60 FPS) in terms of power (and thus thermals and, effectively, noise).
    Last edited by Grinness; 19 October 2020, 07:04 PM.



  • birdie
    replied
    Originally posted by omer666 View Post
    Where did you sense any sympathy in my words?

    You are telling us that hating a company is counterproductive, while writing hateful words directed at actual people doesn't seem to worry you the least. Check that with your therapist.
    I quoted your words about sympathy - make sure you at least remember what you said earlier. I don't have a therapist yet.

    Speaking of hateful words? Are you talking about me calling people "fanatics" on several occasions? But the people I've referred to hate the company with a passion for absolutely no reason. Again, I've repeated it three times already: does NVIDIA make anyone buy their GPUs? No? Do they impede Linux development? No? Do they do anything bad aside from providing closed source drivers for a God forsaken OS? No? Then why so much unwarranted hatred as if they've killed your children?

    Originally posted by omer666 View Post
    I am well aware of both Nvidia and AMD's power efficiency, thank you very much. It shows both RX 5600 and 5700 doing fine, by the way. You'd better pick the 4K performance if you really want to demonstrate RTX 3080's true efficiency. But then, we're not talking about TDP and price, so you're evading part of the problem. Oh, and everybody knows the RX 500 series was flawed, but do you want to talk about GTX 8000's? Or GTX 5000's? Or even GeForce 4 vs Radeon 9700? Who's counterproductive now?
    Look how you've carefully chosen good AMD products and carefully ignored all their fiascos. And you've carefully ignored all the great NVIDIA products starting with the GeForce4 Ti 4200. How nice of you. The RX 5000 series is the first one to truly feature a good performance/TDP ratio in what, eight years?

    Radeon HD 7000 series: decent, oh, wait it was in 2012
    Radeon 200 series: OC'ed rebrandeon with bad thermals
    Radeon 300 series: OC'ed rebrandeon with bad thermals
    Radeon RX 400 series: not good
    Radeon RX Vega series: not good, cost AMD a fortune to produce, deprecated fast because margins were horrible.
    Radeon RX 500 series: bad
    Radeon VII: not good, a top tier product released two years after the GTX 1080 Ti for the same price featuring the same performance and performance/TDP ratio, only the GTX 1080 Ti was produced using a 16 nm node, and the Radeon VII on a 7 nm node.

    Originally posted by omer666 View Post
    Does the fact that I pick my parts from whoever fits me the most make me a "fanatic?"
    Who is screaming in this thread?
    Why do you want people to exercise "modesty" when you brag about your many years using Linux and your contributions to open source software as a means to have people keep their mouth shut?

    I only exposed my views on the subject that don't need your approbation.
    It depends on how you choose these parts. If you go around screaming "NVIDIA fuck you" and buy anything by AMD just not to buy better NVIDIA products, that makes you nothing but a fanatic. Rational people buy products based only on their merits. Fanatics will buy everything from their company because ... they have tribalism in their genes instead of rationale. And this forum and lots of comments here are a prime example of that: "we love our Linux tribe and we hate everything which is not Open Source with a vengeance". Look where it's taken Linux in the past 30 years. Practically nowhere. 99% of people have never heard of it.

    There's one thing which Open Source fans will never see in Linux (in its current form and current development model): native AAA games. And Open Source AAA games will not happen ever. Good luck having tons of fun with Wine/Proton (and don't get me started on various anticheat solutions which just do not work under Linux).

    Lastly, I don't gag people. You're now making things up.
    Last edited by birdie; 19 October 2020, 08:44 PM.



  • omer666
    replied
    Originally posted by birdie View Post

    I don't need your sympathy or your nod. Even when surrounded by Open Source fans, even while I've been using Linux for more than two decades, I still have my wits about me and I don't blindly hate companies because they have their own PoV. The only thing I cannot forgive NVIDIA for is their reluctance (which is yet to be explained) to release firmware for their GPUs in order to let nouveau work properly. Hating is stupid and counterproductive anyways.
    You are telling us that hating a company is counterproductive, while writing hateful words directed at actual people doesn't seem to worry you the least. Check that with your therapist.

    I am well aware of both Nvidia and AMD's power efficiency, thank you very much. It shows both RX 5600 and 5700 doing fine, by the way. You'd better pick the 4K performance if you really want to demonstrate RTX 3080's true efficiency. But then, we're not talking about TDP and price, so you're evading part of the problem. Oh, and everybody knows the RX 500 series was flawed, but do you want to talk about GTX 8000's? Or GTX 5000's? Or even GeForce 4 vs Radeon 9700? Who's counterproductive now?

    Does the fact that I pick my parts from whoever fits me the most make me a "fanatic?"
    Who is screaming in this thread?
    Why do you want people to exercise "modesty" when you brag about your many years using Linux and your contributions to open source software as a means to have people keep their mouth shut?

    I only exposed my views on the subject that don't need your approbation.
    Last edited by omer666; 19 October 2020, 06:11 PM.



  • cynical
    replied
    Originally posted by bug77 View Post

    In other words, if you disregard anything that's about serious work, it's all peachy

    At the same time, I see where birdie is coming from. Compared to Windows or macOS, which are all about the desktop, Linux desktop is full of papercuts.
    I guess if you restrict serious work to that narrow range of activity, you have a point.

    Windows and MacOS have their own papercuts. If they didn't, I would be using them instead.



  • birdie
    replied
    Originally posted by omer666 View Post
    While this discussion seems to be ridiculous, some points are being made nonetheless. I don't really have any sympathy for birdie, nor do I agree with him, but I have to confess that some of the attacks being made against Nvidia are quite unfair.
    I don't need your sympathy or your nod. Even when surrounded by Open Source fans, even while I've been using Linux for more than two decades, I still have my wits about me and I don't blindly hate companies because they have their own PoV. The only thing I cannot forgive NVIDIA for is their reluctance (which is yet to be explained) to release firmware for their GPUs in order to let nouveau work properly. Hating is stupid and counterproductive anyways.

    Originally posted by omer666 View Post
    On the other hand, there is more than just licensing issues leading to hatred directed towards Nvidia. This company is not exactly consumer friendly and has never been ashamed of charging its clients a lot more, sometimes for features barely used at the time of their release, or of going overkill with cards that eat enough electricity to power a little house for a week in 2 hours of gaming.
    I am well aware there is an "enthusiast" market, and it's the same people who buy extreme-series Intel Core i9, but lately it seems it's become Nvidia's only target.
    Do not buy NVIDIA's products. End of story. Everyone's happy. Again, we have another disgruntled person who can buy AMD GPUs and who, instead of talking about GPL issues, is talking about something completely different. Speaking of "going overkill with cards that eat electricity" - AMD has released similar cards in the past, and at the same time NVIDIA does release cards with modest power consumption. In fact, their 12 nm cards are still more power efficient than AMD's 7 nm offerings, so your criticism is completely misplaced:


    Much touted RX 500 series cards are horrible in terms of power consumption, thermals, noise and effectiveness.

    Originally posted by omer666 View Post
    The other thing that is getting on people's nerves is that Nvidia has been refusing to comply with many open source projects, the latest being Wayland's compositing protocol. I don't mind having a closed-source driver; in fact, I think its quality is top-notch. I have been gaming on Linux for quite some time now, and it seems people are quick to forget that something like 5 years ago, gaming with an AMD card on Linux was not even thinkable. But when it slows the adoption of Wayland or when it takes years to correct a repaint bug, yeah, it gets tiresome.
    Wayland is still a toy; it's incomplete and doesn't offer benefits for most users out there while still having rough edges. I don't understand why NVIDIA has to support or "comply" with it. What if you created yet another brand-new graphics server tomorrow? Should NVIDIA also support it?

    Originally posted by omer666 View Post
    That's why I think my next rig is gonna be AMD for both CPU and GPU. I don't trust Nvidia anymore.
    That's called fanaticism and blind company allegiance. I buy products based only on their merits (for a GPU that's driver quality, performance, thermals, price and noise), not on how fans are pitching them. I had an RX 5600 XT for five months and I don't want it back.

    Again, remember that the Linux desktop market share is less than 2%, so maybe Open Source fans could exercise modesty instead of screaming, threatening and showing resentment.



  • omer666
    replied
    While this discussion seems to be ridiculous, some points are being made nonetheless. I don't really have any sympathy for birdie, nor do I agree with him, but I have to confess that some of the attacks being made against Nvidia are quite unfair.

    On the other hand, there is more than just licensing issues leading to hatred directed towards Nvidia. This company is not exactly consumer friendly and has never been ashamed of charging its clients a lot more, sometimes for features barely used at the time of their release, or of going overkill with cards that eat enough electricity to power a little house for a week in 2 hours of gaming.
    I am well aware there is an "enthusiast" market, and it's the same people who buy extreme-series Intel Core i9, but lately it seems it's become Nvidia's only target.

    The other thing that is getting on people's nerves is that Nvidia has been refusing to comply with many open source projects, the latest being Wayland's compositing protocol. I don't mind having a closed-source driver; in fact, I think its quality is top-notch. I have been gaming on Linux for quite some time now, and it seems people are quick to forget that something like 5 years ago, gaming with an AMD card on Linux was not even thinkable. But when it slows the adoption of Wayland or when it takes years to correct a repaint bug, yeah, it gets tiresome.

    That's why I think my next rig is gonna be AMD for both CPU and GPU. I don't trust Nvidia anymore.



  • Volta
    replied
    Originally posted by mdedetrich View Post

    I just had a quick look and this was due to Logitech not providing a driver for Windows, so you can blame Logitech here. Same problem as countless devices that don't work on Linux because they have no driver.

    If you modify the generic XInput driver to work with the logitech gamepad then Windows works with it fine.
    The same rule can be applied to problematic hardware on Linux. I wouldn't say countless in this case. Most hardware runs fine out of the box on Linux, while you have to spend some time to make it run on Windows. I made it work by modifying the Xbox driver.

    I am pretty sure I know who I am speaking to.
    Won't repeat myself.

