Radeon ROCm Updates Documentation Reinforcing Focus On Headless, Non-GUI Workloads
Originally posted by AresMinos View Post
You think you don't. Install ROCm and go test Blender, DaVinci Resolve or Natron. Let me know if you still think you don't have a problem.
I have this one but 8GB, not 16:
https://geizhals.de/amd-radeon-vega-frontier-edition-liquid-100-506062-a1646318.html?hloc=at&hloc=de
The 8GB-VRAM version is called the Vega 64, and I do have the same card.
Phantom circuit Sequence Reducer Dyslexia
Originally posted by smitty3268 View Post
*sigh* Stuff like this is why I quit posting to Phoronix. I think I came back too soon.
AMD and others have tried to make chiplet RDNA designs and multi-GPU cards for years, and it all failed for the graphics parts.
CDNA is the first time AMD has been able to make a successful chiplet design.
Sure, you can make one very big GCN or RDNA chip that does the same, but the fact is that smaller silicon dies are cheaper for the same performance. Rest assured, if AMD builds a dual-GPU card or a chiplet-design card with one RDNA chip and one CDNA chip, people will buy it and it will be a big success.
AresMinos, tildearrow: the Blender fixes are in the 20.45 packaged driver and have not yet made it to the ROCm releases.
Originally posted by plasticbomb1986 View Post
When I woke up this morning, I read it again and realized you were talking about something somewhat different. I missed the chiplet part.
AMD 1700X, monolithic design = Intel won big
AMD 2700X, monolithic design = Intel won big
AMD 3950X, chiplet design = Intel lost all multicore benchmarks
AMD 5950X, chiplet design = Intel lost all multicore and singlecore benchmarks
And you can also see the price drop: the 16-core 3950X was at 790-950€, while all monolithic 16-core CPUs were at 1100-1400€.
That is why the Intel 10900K has 10 cores and the 11900K only 8 cores: Intel cannot compete in the multicore game with a monolithic chip design.
That is the history; let's see what the REAL market shows us:
Intel Xeon Silver 4216, 16C/32T, 2.10-3.20GHz, 100 watt
860,80€
Intel Core i9-9960X, 16C/32T, 3.10-4.40GHz, 165 watt
903€
AMD Ryzen 9 3950X, 16C/32T, 3.50-4.70GHz, 105 watt
chiplet design of 3 chips: 659€
AMD Ryzen Threadripper 1950X, 16C/32T, 3.40-4.00GHz, 180 watt
chiplet design of 2 chips: 449€
AMD Ryzen Threadripper 2950X, 16C/32T, 180 watt
chiplet design of 2 chips: 590€
AMD Ryzen 9 5950X, 16C/32T, 3.40-4.90GHz, 105 watt
chiplet design of 3 chips: 969€
As you can see, because of the chiplet design, AMD chips are much cheaper than Intel's while the multicore performance is higher.
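The price-per-core arithmetic behind that comparison can be checked with a quick sketch, using only the prices and core counts quoted in the list above:

```python
# Price per core for the 16-core CPUs listed above (prices in EUR, as quoted).
cpus = {
    "Intel Xeon Silver 4216":       (860.80, 16),
    "Intel Core i9-9960X":          (903.00, 16),
    "AMD Ryzen 9 3950X":            (659.00, 16),
    "AMD Threadripper 1950X":       (449.00, 16),
    "AMD Threadripper 2950X":       (590.00, 16),
    "AMD Ryzen 9 5950X":            (969.00, 16),
}

# Print each CPU's cost per core, cheapest first.
for name, (price, cores) in sorted(cpus.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name}: {price / cores:.2f} EUR/core")
```

At these list prices the 3950X and both Threadrippers come in well under the two Intel parts per core; only the then-new 5950X carries a launch-price premium.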
If you want the same chiplet-design success on the GPU side, you need to be very clever, because it is not possible to make an RDNA chiplet design: "real-time 3D graphics" does not like it.
To make an RDNA/CDNA hybrid chiplet design would be very smart.
Originally posted by Qaridarium View Post
you can see the chiplet effect in cpus...
Originally posted by Qaridarium View Post
to make an RDNA/CDNA hybrid chiplet design would be very smart.
I'm not saying your hybrid-chiplet idea can't happen, but I don't see it as a given (or even very likely).
Originally posted by Qaridarium View Post
??? You will see it in the future. An asymmetric chiplet design can be a huge success.
AMD and others have tried to make chiplet RDNA designs and multi-GPU cards for years, and it all failed for the graphics parts.
CDNA is the first time AMD has been able to make a successful chiplet design.
Sure, you can make one very big GCN or RDNA chip that does the same, but the fact is that smaller silicon dies are cheaper for the same performance. Rest assured, if AMD builds a dual-GPU card or a chiplet-design card with one RDNA chip and one CDNA chip, people will buy it and it will be a big success.
1. Chiplets for GPUs aren't easy. If it was possible, it already would have been done. Maybe in a year we'll start to see it, but moving the massive amounts of data around that a GPU is dealing with is a different type of problem than what a CPU chiplet architecture has to deal with. (And CDNA is not currently chiplet based. I think you misunderstand what all these technology terms mean - see point 3 below for more examples)
2. If CDNA clients wanted any kind of video output, they wouldn't have removed it in the first place. A CDNA chiplet + RDNA chiplet is still going to be more expensive than just a CDNA chiplet + nothing else. You are asking them to add more, so it's going to take more resources, more money, and more chips. More chiplets.
3. Finally, even if they wanted to add a video output to CDNA and decided to do that via chiplets, that has nothing to do with RDNA. They'd just add an I/O chiplet that talked to the display. RDNA refers to the actual graphics execution units, which is precisely what is replaced by CDNA - it would be pointless to have both present.
Last edited by smitty3268; 03 March 2021, 11:19 AM.
Originally posted by smitty3268 View Post
Ok, let me break down all the problems with this.
Or, would it appear as 2 devices that happen to share a package? In this case, what's even the point of putting them in the same package, instead of just putting the two existing packages on the same board? And if they share a package, presumably memory as well? How is that going to be partitioned (and, again, what was even the point)?
There are just so many issues with this idea, and it's still not clear there's a big market for it or that it solves any problems any better than just installing separate RDNA and CDNA cards in your machine.
Originally posted by coder View Post
CPU cores have different communication patterns and bandwidth requirements than GPUs. In GPUs, the communication patterns tend to be more global. I believe you know this, but I mention it to put your points about CPU chiplets in perspective.
And it is less true for CDNA...
That is why it is not possible to build a successful dual-GPU RDNA or chiplet RDNA GPU.
But a dual-GPU or chiplet design with RDNA + CDNA could be a very smart one.
Originally posted by coder View Post
I just think if there was such a big market for a mixed CDNA + RDNA GPU, then they'd have probably kept some graphics capabilities in CDNA.
And dual-GPU cards were a failure in the past if you used them for graphics...
This means it is pointless to add a second chip with graphics capabilities...
But to put an AI/compute chip inside a chiplet design, so you have both RDNA and CDNA, would make perfect sense.
Originally posted by coder View Post
I think AMD will simply expect the few people who do want both to install one card of each type in their machine.
I'm not saying your hybrid-chiplet idea can't happen, but I don't see it as a given (or even very likely).
Yes, there will also be CDNA-only chiplet designs, because with CDNA you can easily build a chiplet design.
You can also see it in this forum: many people want graphics and compute, but they also want higher performance and lower costs... and a chiplet design is the only way to get all of this.
Originally posted by smitty3268 View Post
Ok, let me break down all the problems with this.
1. Chiplets for GPUs aren't easy. If it was possible, it already would have been done. Maybe in a year we'll start to see it, but moving the massive amounts of data around that a GPU is dealing with is a different type of problem than what a CPU chiplet architecture has to deal with. (And CDNA is not currently chiplet based. I think you misunderstand what all these technology terms mean - see point 3 below for more examples)
2. If CDNA clients wanted any kind of video output, they wouldn't have removed it in the first place. A CDNA chiplet + RDNA chiplet is still going to be more expensive than just a CDNA chiplet + nothing else. You are asking them to add more, so it's going to take more resources, more money, and more chips. More chiplets.
3. Finally, even if they wanted to add a video output to CDNA and decided to do that via chiplets, that has nothing to do with RDNA. They'd just add an I/O chiplet that talked to the display. RDNA refers to the actual graphics execution units, which is precisely what is replaced by CDNA - it would be pointless to have both present.
You are right that it is very hard, or even impossible, to do an RDNA/graphics chiplet; that is the reason why all dual-GPU cards are a complete failure for 3D graphics.
"I think you misunderstand what all these technology terms mean"
I do this all day and all night... but yes, for sure I misunderstand it all...
"A CDNA chiplet + RDNA chiplet is still going to be more expensive than just a CDNA chiplet + nothing else."
Yes, you buy a CDNA chiplet + nothing else if you are an HPC server customer.
But for gamers who also do some mining, a CDNA + RDNA design could be a win.
"Finally, even if they wanted to add a video output to CDNA and decided to do that via chiplets, that has nothing to do with RDNA"
LOL, really, LOL. Rest assured, it is not about adding a video output to CDNA.
It is only about adding a co-processor for AI and other stuff to your RDNA graphics card, for desktop and workstation users.
"it would be pointless to have both present"
The point is: you save transistors on the second GPU, because you do not need the graphics part at all for the second chip.
And why? You said it yourself: "Chiplets for GPUs aren't easy. If it was possible, it already would have been done."
So it is super easy to say that a second chip for graphics makes no sense... and because of this, you don't do it.
But to have two cards in the PC, one RDNA for graphics and one CDNA for compute, makes sense.