Asahi Linux On The Apple M1: "Usable As A Basic Linux Desktop" Sans GPU Acceleration
Last edited by akira128; 10 October 2021, 02:12 AM.
Originally posted by Developer12:
Good god, did you read anything I typed? Do you know anything about any of it?
I know perfectly well what RT is (what an insult that "you don't know what RT is" was) and what its use cases are. Hint: it has little to nothing to do with the desktop or desktop GPUs, since it applies to embedded systems, e.g. automotive vision and driving systems, which NVIDIA, surprise, has offered for many years now.
Almost the entirety of your reply concerns future projects or projects NVIDIA couldn't care less about. I especially love the OpenCL vs. CUDA part. CUDA is the de facto standard in HPC and AI: tons of AI Python libraries are built on it, and it's taught in dozens of universities, far more widely than OpenCL. It has an excellent toolkit, documentation, support, everything. People use CUDA for a reason, and the reason is simple: NVIDIA cares. AMD and Intel couldn't give two shits about the state of OpenCL. "You have a standard, go use it. No help or support from our side."
You've written a wall of text but given zero arguments as to why open-sourcing its drivers would benefit NVIDIA. You seem to be deeply in love with AMD for some reason, I get it, but as I've already shown it doesn't give AMD any immediate benefit, while they surely spend a ton of money maintaining three (four?) software stacks for their GPU drivers. NVIDIA is successful for a reason: they don't waste money showing off. It's not like NVIDIA is infallible, far from it: I was personally affected by bumpgate, and I remember Crysis 2, the HairWorks tessellation shenanigans, and the other "niceties" they've been implicated in.
Wayland in Fedora and Ubuntu? WTF are you talking about? Again, that's the 1% of desktop users who don't really game, and Wayland is only usable in GNOME. Oh, God.
Speaking of current projects: https://blogs.nvidia.com/blog/2020/1...fast-smart-ai/
So much for OpenCL.
Good luck and have fun.
Last edited by birdie; 09 October 2021, 07:13 PM.
All open source "experts" here should remember one extremely important thing: you are posting here under made-up nicknames, slaving away for other people, and you have no successful businesses to speak of; otherwise you wouldn't be here. Jensen Huang has created a company which currently has a valuation of over 500 billion fucking USD. If open-sourcing allowed NVIDIA to be more successful and earn even more money, they'd do it in an instant. Now please kindly hush and stop spreading the "wisdom" no one gives a damn about.
Actually I will save this paragraph and use it from now on in all future discussions with Phoronix Open Source "experts".
Originally posted by birdie: Almost the entirety of your reply contains future projects or projects which NVIDIA couldn't care less about.
Until a couple of years ago, we used to think that the universal symbol for "give me the keys" was purely human. Now we know that it is actually in the vocabulary of all great apes; humans, chimpanzees, bonobos, gorillas and orangutans all use this word. On a fundamental level, it really means "I am hungry and you are eating", but that principle is so deep that it can be used in all kinds of situations. It is not a question and it's not a request; you're simply letting someone else know that you are currently in need of something that they are currently able to provide. The rest is up to them, and it always will be, whether you're talking to a gorilla or an artificial intelligence.
This is very important knowledge, because it proves that we are born moral. It is not something you have to learn. It is something you have to learn to ignore.
Apple has never taken advantage of our community and also never stood in our way, so they are not at all bound by our moral principles, either as friends or enemies. There is nothing wrong with that. They choose to be different and they are not hypocritical about it. Laws are about power. Morality is a social contract. It is immoral to use power to enforce an idea that someone else has not agreed to, because they have not agreed to it. Microsoft stood in our way and that was immoral, because our fundamental cause is freedom of speech, and that offends us on a deeper level than knowledge and technology. It was never disagreement that made Microsoft my enemy, but only the fact that they were in my way. Hypocrisy is just a breach of pattern and I can't imagine that any successful life-form will ever enjoy that.
When Linus Torvalds raised his middle finger towards Nvidia, it did not mean "I will fight you in court". What he was saying in deep human language, was «You do not recognize my value as a friend». It is a very strong ape-word that loosely translates to «I see you as a sociopath». You exploit the system.
When we said this to AMD, they went out of their way to change their ways and become compatible with our ideology. We are not asking this of Apple, because they do not take advantage of us. Once they do, we will treat them as moral entities like any other, because that is what happens when you eat my food. Nvidia takes, but it doesn't give. Most human beings will understand why that is wrong by the age of four.
Originally posted by akira128:
Ok, I'm just going to state the obvious. You're here posting under a made-up nickname presumably slaving away for some people and have no successful businesses to speak of. Otherwise, you wouldn't be here....right? But...somehow you magically get to tell everyone what to do?! The one post I don't give "a damn about" is yours. Also, this notion that money == success or importance is total horses**t. Money can be made via anti-competitive practices that don't benefit the consumer, and big companies/corporations do this all the time. If you want to worship money that's your business...
Also, my name and whereabouts are fully known; I don't hide behind nicknames.
Also, unlike you, I don't worship any corporation. You're madly, deeply in love with AMD despite the company charging top dollar for its latest GPUs and selling the equivalent of an RX 480 at an almost two times higher price.
Last edited by birdie; 10 October 2021, 06:28 AM.
Originally posted by jo-erlend: I've never cared much for legalese.
As to your statement that NVIDIA "doesn't give": that's a really dangerous and insincere thing to say. Their products make people's lives brighter (through gaming), and their GPUs are used to design a ton of stuff (buildings, machines, airplanes, etc.), predict the weather, discover new molecules and drugs, and so on.
If NVIDIA actually didn't give anything, society wouldn't give money back to them. Sorry, that's logic 101.
Also, had Linux never existed, a certain group of people would never have thought of having open-source drivers for anything under the Sun, and the world would have churned on as if nothing had happened. Think about that. Maybe you'll finally realize the world doesn't revolve around open source; considering that 100% of triple-A games and the absolute majority of serious professional applications are closed source, Linux and open source look like quite a marginal idea.
Just to make things clear, I'm writing all of this on a desktop running Fedora 35. My laptop also runs Fedora. I started using Linux long before Michael Larabel or most people here had even heard of it, back at the end of the '90s.
Originally posted by birdie: Also, unlike you, I don't worship any corporation either. You're madly deeply in love with AMD despite the company charging top dollar for its latest GPUs and selling an equivalent of RX 480 at an almost two times higher price.
In fairness, that "equivalent of the RX 480" is also about twice as fast as the 480.
Saying "oh it's a 1080p card" is a bit misleading because games became a lot more demanding in the years between the RX 480 and the latest generation.
Originally posted by bridgman:
In fairness, that "equivalent of the RX 480" is also about twice as fast as the 480.
Saying "oh it's a 1080p card" is a bit misleading because games became a lot more demanding in the years between RX 480 and latest generation.
The RX 480 was released in June 2016, i.e. five years ago. Isn't it expected that GPUs could have become twice as fast over that span? In the past, graphics cards became twice as fast in under three years, if I remember correctly.
GeForce GTX 460 release date: July 12, 2010
GeForce GTX 760 release date: June 25, 2013, more than twice as fast (212%)
And let's be honest: the RX 480 was a top midrange card, while the RX 6600 XT is bottom midrange (you've got a midrange 6700, a top-range 6800, and the absolute top 6900), so not only is it a lot more expensive, it's in a different class, which further proves my point.
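As a back-of-the-envelope check on the doubling-time argument above, here is a short sketch. The GTX dates and the 212% figure come from the post itself; the RX 480 and RX 6600 XT launch dates used below are my own assumptions, so treat the exact numbers as illustrative:

```python
from datetime import date

def annual_growth(speedup: float, years: float) -> float:
    """Compound annual growth rate implied by a total speedup."""
    return speedup ** (1.0 / years) - 1.0

# GTX 460 (2010-07-12) -> GTX 760 (2013-06-25): ~2.12x, per the post.
span_460_760 = (date(2013, 6, 25) - date(2010, 7, 12)).days / 365.25
rate = annual_growth(2.12, span_460_760)

# Project that historical rate across RX 480 (assumed June 2016) ->
# RX 6600 XT (assumed August 2021), roughly five years.
span_480_6600 = (date(2021, 8, 11) - date(2016, 6, 29)).days / 365.25
projected = (1.0 + rate) ** span_480_6600

print(f"implied annual growth: {rate:.1%}")
print(f"projected speedup over {span_480_6600:.1f} years: {projected:.1f}x")
```

Under these assumptions the historical rate works out to roughly 29% per year, which over five years would predict well over a 2x gain, which is the point being made.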
I have no issues with you, bridgman; you're not the one setting the pricing policy. I'm just extremely disappointed with what we (as customers) now have to deal with.
Originally posted by birdie: But games have always become more demanding, and low-midrange graphics cards have always become faster while retaining their price spot. Suddenly, both NVIDIA and AMD have changed the rules of the game drastically, and those price increases had occurred long before the semiconductor shortage began.
The cost per transistor is not really dropping the way it used to, presumably as a consequence of the increasingly complex fab processes required for finer pitches, and so prices correlate fairly closely with transistor count these days, where they used to correlate more closely with die size.
One could argue that the rules haven't changed at all, and the current situation is really just the laws of physics catching up with us.
I guess the only bit of good news is that rather than hitting a hard wall and not being able to reduce feature pitch at all (which was what everyone expected a decade ago) the fab vendors have been able to keep shrinking feature size but at the cost of massive increases in process complexity and capital costs. I think that gives the industry a better chance of finding simplifications and cost reductions in the future compared with if everyone had actually hit a hard wall as expected.
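The pricing-regime point above can be sketched in a few lines of arithmetic. All die sizes, transistor counts, and dollar figures below are purely illustrative assumptions, not real fab data:

```python
# Old regime: process shrinks kept cutting cost per transistor, so a chip's
# silicon cost tracked its die area; doubling density on a same-size die
# was roughly free.
def old_regime_cost(die_area_mm2: float, dollars_per_mm2: float = 0.10) -> float:
    return die_area_mm2 * dollars_per_mm2

# New regime: cost per transistor is roughly flat, so silicon cost tracks
# transistor count; doubling the transistors roughly doubles the cost.
def new_regime_cost(transistors_billion: float,
                    dollars_per_billion: float = 12.0) -> float:
    return transistors_billion * dollars_per_billion

# A node shrink that doubles transistor count on an unchanged die:
before = new_regime_cost(5.7)       # hypothetical 5.7B-transistor GPU
after = new_regime_cost(2 * 5.7)    # same die area, twice the transistors
print(f"old regime: cost unchanged at ${old_regime_cost(230.0):.0f}")
print(f"new regime: ${before:.0f} -> ${after:.0f}")
```

The same generational jump that used to hold the silicon cost constant now roughly doubles it, which is one way to read "the rules haven't changed, the physics caught up".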
Originally posted by birdie: I have no issues with you, bridgman, you're not the one setting the pricing policy, I'm just extremely disappointed with what we (as customers) now have to deal with.
Last edited by bridgman; 10 October 2021, 09:27 PM.