NVIDIA GeForce RTX 2060 Linux Performance From Gaming To TensorFlow & Compute
Originally posted by debianxfce View Post
The Fnatic team gets new gaming PCs from AMD with the 3rd gen Ryzen CPU and the Radeon VII GPU card this year. Source: CES 2019.
Originally posted by vegabook View Post
Hilarious! The entire article gives you 9 pages of valid reasons to do exactly that. Spankin' AMD's finest with their budget card on basically all games, and an embarrassing absence of red bars on the machine learning charts altogether!
I've been an ATI/AMD fan since the 8514 Ultra (yeah, stone age)... and even I just cannot ignore the fact that under AMD's stewardship, ATI has been wrecked and Nvidia now owns the show. Let's HOPE AND PRAY that with AMD's newfound market capitalization, they can throw in the several billion dollars of investment that the RTG division desperately needs, just to survive. The way it's going now, and with time running out before Intel muscles in like a gorilla, RTG will be a console bit-player within 24 months and on the way to becoming the next Imagination Technologies. I.e., dead.
AMD's strategy today is targeting the lucrative console market. And they beat Intel and Nvidia every time. AMD has the PS4 and the Xbox One, and also some Chinese consoles not sold in the US. AMD wins consoles. AMD's peecee games strategy today is to win the middle ground, the price/performance segment, and as you can see in any Phoronix benchmark, AMD crushes the "performance per dollar" charts (see the quick sketch at the end of this comment). AMD wins again. And for those who care about open source, AMD's open-source drivers perform on par with their proprietary ones, while Nvidia gives Linux users the middle finger. AMD wins again.
If you had watched any of AMD's presentations, or listened to their earnings calls, you would know that AMD is not targeting the high end PC gaming market, not until ~2021 anyways.
Maybe next time do a little research, eh?
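For anyone curious what those "performance per dollar" charts actually compute, here is a minimal sketch of the metric; the card names, frame rates, and prices are made-up placeholders, not Phoronix results:
[CODE]
# Sketch of a performance-per-dollar ranking as used in value-oriented
# GPU comparisons. All numbers are hypothetical placeholders, NOT
# actual benchmark results.

cards = {
    # name: (average_fps, street_price_usd) -- illustrative values only
    "GPU A": (95.0, 349.0),
    "GPU B": (110.0, 699.0),
    "GPU C": (60.0, 279.0),
}

ranked = sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1],
                reverse=True)
for name, (fps, price) in ranked:
    print(f"{name}: {fps / price:.3f} FPS per dollar")
[/CODE]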
Originally posted by phoronix View Post
Phoronix: NVIDIA GeForce RTX 2060 Linux Performance From Gaming To TensorFlow & Compute
Yesterday NVIDIA kicked off their week at CES by announcing the GeForce RTX 2060, the lowest-cost Turing GPU to date at just $349 USD, which aims to deliver around the performance of the previous-generation GeForce GTX 1080. I only received my RTX 2060 yesterday for testing but have been putting it through its paces since, and have the initial benchmark results to deliver, ranging from OpenGL/Vulkan Linux gaming performance through various interesting GPU compute workloads. This round of testing also includes graphics cards going back to the Maxwell-era GeForce GTX 960 for an interesting look at how the NVIDIA Linux GPU performance has evolved.
http://www.phoronix.com/vr.php?view=27373
Originally posted by pracedru View Post
I am working on this:
Yes, and I am using older GLSL shaders (#version 130).
I take it that you are using Red Hat or SUSE then?
At my old work we had two Red Hat clusters and a SUSE cluster.
We tried Debian as an experiment, but since Ansys didn't support Debian and our IT admin wasn't able to convince management that he could keep that environment stable, they decided to switch away. I guess they didn't like the idea that the company would be dependent on him.
I try to stay away from CUDA since I don't like the idea that one company should own a framework in that manner.
When I get to implementing FEM/CFD solvers, I'm definitely going to look into PyViennaCL. I hope that OpenCL 1.1 will suffice since Nvidia won't support 1.2 (a rough sketch of that kind of OpenCL code follows this comment).
What happens if you download and install the drivers directly from the Nvidia website?
So do you use CUDA or not then?
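On the PyViennaCL / OpenCL point in the quote above, here is a minimal sketch of that style of GPU compute, written with pyopencl rather than PyViennaCL and sticking to OpenCL 1.1-level features; the SAXPY kernel and array size are illustrative, not taken from any project discussed here:
[CODE]
# Minimal pyopencl sketch restricted to OpenCL 1.1-level features.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()      # picks any available OpenCL device
queue = cl.CommandQueue(ctx)

# Toy kernel: y = a*x + y, the kind of building block FEM/CFD solvers
# chain together.
src = """
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y)
{
    int i = get_global_id(0);
    y[i] = a * x[i] + y[i];
}
"""
prg = cl.Program(ctx, src).build()

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

mf = cl.mem_flags
x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

prg.saxpy(queue, (n,), None, np.float32(2.0), x_buf, y_buf)

out = np.empty_like(y)
cl.enqueue_copy(queue, out, y_buf)
print("max error:", np.abs(out - (2.0 * x + y)).max())
[/CODE]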
Originally posted by AndyChow View Post
The TensorFlow results are impressive. I've been having so many problems with ROCm, I might just get one, just for the compute.
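For anyone weighing a card for TensorFlow compute, a quick GPU sanity check with the TensorFlow 1.x API of the time might look like the following; the matrix size, device string, and iteration count are arbitrary illustrative choices:
[CODE]
# TensorFlow 1.x GPU sanity check: time a large matrix multiply.
import time
import tensorflow as tf

n = 4096
with tf.device("/gpu:0"):           # errors out if no GPU is visible
    a = tf.random_normal((n, n))
    b = tf.random_normal((n, n))
    c = tf.matmul(a, b)

with tf.Session() as sess:
    sess.run(c)                     # warm-up run
    start = time.time()
    runs = 10
    for _ in range(runs):
        # each run re-executes the whole graph (random fill + matmul)
        sess.run(c)
    elapsed = (time.time() - start) / runs
    # one n x n matmul costs roughly 2*n^3 floating-point operations
    print("approx %.1f GFLOPS" % (2 * n ** 3 / elapsed / 1e9))
[/CODE]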
Originally posted by Dedale View Post
They say that for the additional connectors it is relatively easy because they can measure the voltage on the card and the amperage of the connector via a current clamp. How they did it for the PCIe connector they do not explain.
Originally posted by HenryM View Post
Pretty sure it involves a custom rewired PCIe riser that runs the power cables / +voltage connections through a hall-effect multimeter, possibly several multimeters.
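To make the measurement idea concrete: once per-rail voltages and clamp/hall-sensor currents are known, board power is just the sum of V times I across the rails. A toy sketch with entirely made-up readings:
[CODE]
# Toy board-power calculation: P = V * I per rail, summed.
# The readings below are made-up example numbers, not measurements.

rails = [
    # (label, volts, amps) -- e.g. PCIe slot rails plus an 8-pin cable
    ("PCIe slot 12V", 12.1, 4.8),
    ("PCIe slot 3.3V", 3.3, 0.9),
    ("8-pin PEG 12V", 12.0, 11.5),
]

total = 0.0
for label, volts, amps in rails:
    watts = volts * amps
    total += watts
    print(f"{label}: {watts:.1f} W")
print(f"total board power: {total:.1f} W")
[/CODE]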
Originally posted by torsionbar28 View Post
x2, there really is no valid reason for a Linux user to choose nvidia these days.
I've been an ATI/AMD fan since the 8514 Ultra (yeah, stone age)... and even I just cannot ignore the fact that under AMD's stewardship, ATI has been wrecked and Nvidia now owns the show. Let's HOPE AND PRAY that with AMD's newfound market capitalization, they can throw in the several billion dollars of investment that the RTG division desperately needs, just to survive. The way it's going now, and with time running out before Intel muscles in like a gorilla, RTG will be a console bit-player within 24 months and on the way to becoming the next Imagination Technologies. I.e., dead.
Originally posted by Dedale View Post
They say that for the additional connectors it is relatively easy because they can measure the voltage on the card and the amperage of the connector via a current clamp. How they did it for the PCIe connector they do not explain.
But their tone suggests it was not trivial.