Radeon ROCm Updates Documentation Reinforcing Focus On Headless, Non-GUI Workloads

  • plasticbomb1986
    Junior Member
    • Mar 2019
    • 19

    #61
    Originally posted by Qaridarium View Post

The point is this: making a chiplet design for graphics with RDNA is very, very hard,
but making a chiplet design with CDNA for compute is very easy.

But chiplets lower the costs compared to a bigger monolithic chip.

This means: if you make a hybrid chiplet design with one RDNA and one CDNA chip, you could run graphics on RDNA and the compute parts on CDNA... and games could use it too, because modern games like Cyberpunk 2077 use more and more AI, deep-learning features and complex physics... this means you can do that on the CDNA part and do only graphics on the RDNA part.
When I woke up this morning, I read it again and realized you were talking about something somewhat different. I missed the chiplet part.


    • qarium
      Senior Member
      • Nov 2008
      • 3435

      #62
      Originally posted by AresMinos View Post
      You think you don't. Install ROCm and go test Blender, DaVinci Resolve or Natron. Let me know if you still think you don't have a problem.
I have this one, but with 8GB, not 16:
✔ Price comparison for AMD Radeon Vega Frontier Edition Liquid ✔ Reviews ✔ Product info ⇒ Connectors: 1x HDMI 2.0b, 3x DisplayPort 1.4 • Graphics: AMD Radeon Vega Frontier Edition Liquid - 16GB HBM… ✔ PCIe ✔ Test reports ✔ Buy cheap
There was no AMD Radeon Vega Frontier Edition Liquid with 8GB of VRAM.

The 8GB VRAM card is called the Vega 64, and I have that same card.
      Phantom circuit Sequence Reducer Dyslexia


      • qarium
        Senior Member
        • Nov 2008
        • 3435

        #63
        Originally posted by smitty3268 View Post
        *sigh*
        Stuff like this is why i quit posting to Phoronix. I think i came back too soon.
??? You will see it in the future: an asymmetric chiplet design can be a huge success.
AMD and others have tried for years to make RDNA chiplet designs and multi-GPU cards, and it all failed for the graphics parts.
CDNA is the first time AMD has been able to make a successful chiplet design.

Sure, you can make one very big GCN or RDNA chip that does the same, but the fact is that smaller silicon dies are cheaper for the same performance. Be sure that if AMD builds a dual-GPU card, or a chiplet-design card with one RDNA chip and one CDNA chip, people will buy it and it will be a big success.


        • bridgman
          AMD Linux
          • Oct 2007
          • 13186

          #64
AresMinos, tildearrow: the Blender fixes are in the 20.45 packaged driver and have not yet made it into the ROCm releases.
          Test signature


          • qarium
            Senior Member
            • Nov 2008
            • 3435

            #65
            Originally posted by plasticbomb1986 View Post
When I woke up this morning, I read it again and realized you were talking about something somewhat different. I missed the chiplet part.
You can see the chiplet effect in CPUs...

AMD 1700X, monolithic design = Intel won big
AMD 2700X, monolithic design = Intel won big

AMD 3950X, chiplet design = Intel lost all the multicore benchmarks
AMD 5950X, chiplet design = Intel lost all the multicore and single-core benchmarks

And you can also see the price drop: the 16-core 3950X was at 790-950€, while all the monolithic 16-core CPUs were at 1100-1400€.

That's why the Intel 10900K has 10 cores and the 11900K only 8 cores: Intel cannot compete in the multicore game with a monolithic chip design.

That is the history; let's see what the REAL market shows us:

Intel Xeon Silver 4216, 16C/32T, 2.10-3.20GHz, 100 watt
✔ Price comparison for Intel Xeon Silver 4216, 16C/32T, 2.10-3.20GHz, tray ✔ Product info ⇒ Cores: 16 • Threads: 32 • Turbo clock: 3.20GHz (Turbo Boost 2.0) • Base clock: 2.10GHz… ✔ Intel ✔ Test reports ✔ Buy cheap

860,80€

Intel Core i9-9960X, 16C/32T, 3.10-4.40GHz, 165 watt

903€

AMD Ryzen 9 3950X, 16C/32T, 3.50-4.70GHz, 105 watt

chiplet design of 3 dies: 659€

AMD Ryzen Threadripper 1950X, 16C/32T, 3.40-4.00GHz, 180 watt

chiplet design of 2 dies: 449€

AMD Ryzen Threadripper 2950X, 180 watt

chiplet design of 2 dies: 590€

AMD Ryzen 9 5950X, 16C/32T, 3.40-4.90GHz, 105 watt

chiplet design of 3 dies: 969€

As you can see, because of the chiplet design, AMD's chips are much cheaper than Intel's and the multicore performance is higher.

If you want the same chiplet-design success on the GPU side, you need to be very clever, because it is not possible to make an RDNA chiplet design; "real-time 3D graphics" does not tolerate it.

Making an RDNA/CDNA hybrid chiplet design would be very smart.


            • coder
              Senior Member
              • Nov 2014
              • 8920

              #66
              Originally posted by Qaridarium View Post
You can see the chiplet effect in CPUs...
              CPU cores have different communication patterns and bandwidth requirements than GPUs. In GPUs, the communication patterns tend to be more global. I believe you know this, but I mention it to put your points about CPU chiplets in perspective.

              Originally posted by Qaridarium View Post
              to make a RDNA/CDNA hybride chiplet design would be very smart.
              I just think if there was such a big market for a mixed CDNA + RDNA GPU, then they'd have probably kept some graphics capabilities in CDNA. I think AMD will simply expect the few people who do want both to install one card of each type in their machine.

              I'm not saying your hybrid-chiplet idea can't happen, but I don't see it as a given (or even very likely).


              • smitty3268
                Senior Member
                • Oct 2008
                • 6954

                #67
                Originally posted by Qaridarium View Post

??? You will see it in the future: an asymmetric chiplet design can be a huge success.
AMD and others have tried for years to make RDNA chiplet designs and multi-GPU cards, and it all failed for the graphics parts.
CDNA is the first time AMD has been able to make a successful chiplet design.

Sure, you can make one very big GCN or RDNA chip that does the same, but the fact is that smaller silicon dies are cheaper for the same performance. Be sure that if AMD builds a dual-GPU card, or a chiplet-design card with one RDNA chip and one CDNA chip, people will buy it and it will be a big success.
                Ok, let me break down all the problems with this.

                1. Chiplets for GPUs aren't easy. If it was possible, it already would have been done. Maybe in a year we'll start to see it, but moving the massive amounts of data around that a GPU is dealing with is a different type of problem than what a CPU chiplet architecture has to deal with. (And CDNA is not currently chiplet based. I think you misunderstand what all these technology terms mean - see point 3 below for more examples)

                2. If CDNA clients wanted any kind of video output, they wouldn't have removed it in the first place. A CDNA chiplet + RDNA chiplet is still going to be more expensive than just a CDNA chiplet + nothing else. You are asking them to add more, so it's going to take more resources, more money, and more chips. More chiplets.

                3. Finally, even if they wanted to add a video output to CDNA and decided to do that via chiplets, that has nothing to do with RDNA. They'd just add an I/O chiplet that talked to the display. RDNA refers to the actual graphics execution units, which is precisely what is replaced by CDNA - it would be pointless to have both present.
                Last edited by smitty3268; 03 March 2021, 11:19 AM.


                • coder
                  Senior Member
                  • Nov 2014
                  • 8920

                  #68
                  Originally posted by smitty3268 View Post
                  Ok, let me break down all the problems with this.
                  The elephant in the room is software. How would such a device be presented to the applications? Would it be a single device with a mix of execution resources? We don't even have APIs for that (e.g. describing a mix of resources and specifying which workloads should run where, etc.)!

                  Or, would it appear as 2 devices that happen to share a package? In this case, what's even the point of putting them in the same package, instead of just putting the two existing packages on the same board? And if they share a package, presumably memory as well? How is that going to be partitioned (and, again, what was even the point)?

                  There are just so many issues with this idea, and it's still not clear there's a big market for it or that it solves any problems any better than just installing separate RDNA and CDNA cards in your machine.
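For what it's worth, the "two devices that happen to share a package" model is how existing heterogeneous-compute APIs (OpenCL, HIP) already expose multiple GPUs: the application enumerates the devices and explicitly routes each workload to one of them. Here is a deliberately simplified, API-free Python sketch of that routing model — the device names and the `dispatch` helper are made up for illustration, and no real GPU API is involved:

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    kind: str  # "graphics" or "compute"

# Two independent devices that happen to share a board/package.
devices = [Device("RDNA die", "graphics"), Device("CDNA die", "compute")]

def dispatch(workload_kind: str) -> Device:
    """Route a workload to the first device whose kind matches."""
    for dev in devices:
        if dev.kind == workload_kind:
            return dev
    raise LookupError(f"no device for {workload_kind!r} workloads")

print(dispatch("graphics").name)  # prints "RDNA die"
print(dispatch("compute").name)   # prints "CDNA die"
```

A single mixed-resource device, by contrast, would need new API surface for describing which execution units a given queue or kernel should target — which is exactly the missing piece noted above.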


                  • qarium
                    Senior Member
                    • Nov 2008
                    • 3435

                    #69
                    Originally posted by coder View Post
                    CPU cores have different communication patterns and bandwidth requirements than GPUs. In GPUs, the communication patterns tend to be more global. I believe you know this, but I mention it to put your points about CPU chiplets in perspective.
Yes, right; that's why chiplets are in CPUs and not in GPUs right now. But what you say here is MORE true for RDNA
and less true for CDNA...
That's why it is not possible to build a successful dual-GPU RDNA or chiplet RDNA GPU.

But a dual-GPU or chiplet design with RDNA + CDNA could be a very smart one.


                    Originally posted by coder View Post
                    I just think if there was such a big market for a mixed CDNA + RDNA GPU, then they'd have probably kept some graphics capabilities in CDNA.
You really have the wrong idea here: if you keep the graphics capabilities in the CDNA chip, then a chiplet design is not possible.
And dual-GPU cards were a failure in the past when used for graphics...
This means it is pointless to add a second chip with graphics capabilities...

But putting an AI/compute chip into a chiplet design, so you have both RDNA and CDNA, would make perfect sense.


                    Originally posted by coder View Post
                    I think AMD will simply expect the few people who do want both to install one card of each type in their machine.
                    I'm not saying your hybrid-chiplet idea can't happen, but I don't see it as a given (or even very likely).
I say it will happen because it is the easiest and cheapest way to do a chiplet design for the GPU + graphics market.
Yes, there will also be CDNA-only chiplet designs, because with CDNA you can easily build a chiplet design.

You can also see it in this forum: many people want graphics and compute, but they also want higher performance and lower costs... and a chiplet design is the only way to get all of that.


                    • qarium
                      Senior Member
                      • Nov 2008
                      • 3435

                      #70
                      Originally posted by smitty3268 View Post
                      Ok, let me break down all the problems with this.
                      1. Chiplets for GPUs aren't easy. If it was possible, it already would have been done. Maybe in a year we'll start to see it, but moving the massive amounts of data around that a GPU is dealing with is a different type of problem than what a CPU chiplet architecture has to deal with. (And CDNA is not currently chiplet based. I think you misunderstand what all these technology terms mean - see point 3 below for more examples)
                      2. If CDNA clients wanted any kind of video output, they wouldn't have removed it in the first place. A CDNA chiplet + RDNA chiplet is still going to be more expensive than just a CDNA chiplet + nothing else. You are asking them to add more, so it's going to take more resources, more money, and more chips. More chiplets.
                      3. Finally, even if they wanted to add a video output to CDNA and decided to do that via chiplets, that has nothing to do with RDNA. They'd just add an I/O chiplet that talked to the display. RDNA refers to the actual graphics execution units, which is precisely what is replaced by CDNA - it would be pointless to have both present.
Yes, all present CDNA GPUs are monolithic and not chiplets, but it is a fact that all future CDNA designs will be chiplets.
You are right that it is very hard to impossible to do an RDNA/graphics chiplet design; that's the reason why all dual-GPU cards were a complete failure for 3D graphics.

                      "I think you misunderstand what all these technology terms mean"

I do this all day and all night... but yes, for sure I misunderstand it all...

                      "A CDNA chiplet + RDNA chiplet is still going to be more expensive than just a CDNA chiplet + nothing else."

Yes, you buy a CDNA chiplet + nothing else if you are an HPC server customer,
but for gamers who also do some mining, a CDNA + RDNA design could be a win.

                      "Finally, even if they wanted to add a video output to CDNA and decided to do that via chiplets, that has nothing to do with RDNA"

LOL, really, LOL. Be sure it is not about adding a video output to CDNA.

It is only about adding a co-processor for AI and other stuff to your RDNA graphics card for desktop and workstation users.

                      "it would be pointless to have both present"

The point is: you save transistors on the second GPU because you do not need the graphics part at all on the second chip.

And why? You said it too: "Chiplets for GPUs aren't easy. If it was possible, it already would have been done."

So it is super easy to say that a second chip for graphics makes no sense... and because of this you don't do it.

But having two cards in the PC, one RDNA for graphics and one CDNA for compute, makes sense.

