
AMD Arcturus Might Be The Codename Succeeding Navi


  • #21
    Originally posted by valici View Post

    I really don't like the idea. I mean, AMD works with both Microsoft and Sony, and as with the PS4 and Xbox (GCN1 for the first generation, Polaris for the Pro and the X), AMD will supply both with the same graphics solution. So why would Sony help and pay to develop chips also used by Microsoft....
    They already did the same with customization for the PS4 and PS4 Pro APUs. As I understand it, most if not all of the tech stays with AMD.
    See for example checkerboard rendering, which is freaking great tech.

    Originally posted by valici View Post
    I think it's designed fully by AMD, but I think they are not so willing to invest tens of millions of dollars in a chip not many will buy.
    It does not matter what you think, especially since the GPUs used in the PS5 will be bought by the millions. And again, the tech stays with AMD, so you will find improvements mandated or originated by Sony there. Potentially with some clauses to keep Microsoft's fingers off them for a period of time.

    Apparently both companies are fine with it, so what's the issue?
    Sony has a lot of expertise that can complement AMD's (the picture processing in their TVs is market-leading, and their digital cameras are among the best), but does not have a market for GPUs outside the consoles.

    https://wccftech.com/exclusive-amd-n...dmap-cost-zen/
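The core idea behind checkerboard rendering is simple: each frame shades only half the pixels in an alternating checkerboard pattern, and the other half is reused from the previous frame. Real implementations (as on the PS4 Pro) also use motion vectors and ID buffers for reconstruction; this is just a minimal sketch of the concept, and `render_full` plus the parity scheme are illustrative assumptions, not Sony's actual pipeline:

```python
def render_full(frame_idx, w, h):
    # Stand-in for the expensive per-pixel shading work.
    return [[(x + y + frame_idx) % 256 for x in range(w)] for y in range(h)]

def checkerboard_frame(frame_idx, prev, w, h):
    # In a real renderer only half the pixels would be shaded; here we
    # shade everything and then pick, purely to keep the sketch short.
    full = render_full(frame_idx, w, h)
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (x + y + frame_idx) % 2 == 0:
                out[y][x] = full[y][x]   # freshly shaded this frame
            elif prev is not None:
                out[y][x] = prev[y][x]   # reused from the previous frame
            else:
                out[y][x] = full[y][x]   # first frame: no history yet
    return out

# Two consecutive frames: frame 1 reuses half its pixels from frame 0.
frame0 = checkerboard_frame(0, None, 4, 4)
frame1 = checkerboard_frame(1, frame0, 4, 4)
```

The win is that per-frame shading cost is roughly halved while the displayed image stays full resolution, at the price of some reconstruction artifacts in motion.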



    • #22
      Originally posted by valici View Post

      I think it's designed fully by AMD, but I think they are not so willing to invest tens of millions of dollars in a chip not many will buy.
      If you're speaking of high-end cards, I suspect there's a strong case that halo cards drive sales of low-end cards. Maybe Nvidia genuinely makes massive profits from the GTX 1080 Ti, the new RTX 2080, and so forth. But I suspect instead that the best benefit of having those cards and selling them to a small volume of enthusiasts is that they cause hundreds of thousands of other enthusiasts to buy a GTX 1060 or GTX 1070.

      So an improved Vega 64 successor could lose money for AMD on its own but still be worthwhile by driving sales of the RX 580 and its successor.

      Originally posted by ElectricPrism View Post
      Cool.

      I can't help but think of Arcturus from StarCraft 1.

      https://www.youtube.com/watch?v=aL5FM4OPPlE&t=53s
      Me too, I didn't know it was the name of a star either. Thanks for that, Ungweliante. (By the way, is 'Ungweliante' a play on 'Ungoliant' from Tolkien's Silmarillion?)



      • #23
        Originally posted by discordian View Post
        They already did the same with customization for the PS4 and PS4 Pro APUs. As I understand it, most if not all of the tech stays with AMD.
        See for example checkerboard rendering, which is freaking great tech.

        It does not matter what you think, especially since the GPUs used in the PS5 will be bought by the millions. And again, the tech stays with AMD, so you will find improvements mandated or originated by Sony there. Potentially with some clauses to keep Microsoft's fingers off them for a period of time.

        Apparently both companies are fine with it, so what's the issue?
        Sony has a lot of expertise that can complement AMD's (the picture processing in their TVs is market-leading, and their digital cameras are among the best), but does not have a market for GPUs outside the consoles.

        https://wccftech.com/exclusive-amd-n...dmap-cost-zen/

        1. I was speaking about a high-end >500 mm² chip. Consoles do not have that GPU power.
        The Xbox One X has a 360 mm² die, including the CPU part, and is a little bigger than the RX 580.

        2. I don't think Sony can design a better GPU than AMD, which has 20 years or more of graphics design experience.
        All modern console graphics chips are derived from consumer GPU tech. It's not financially feasible to maintain two separate technologies. The same goes for Nintendo's Switch, which I believe uses Nvidia's Maxwell graphics.



        • #24
          Originally posted by Michael_S View Post

          If you're speaking of high-end cards, I suspect there's a strong case that halo cards drive sales of low-end cards. Maybe Nvidia genuinely makes massive profits from the GTX 1080 Ti, the new RTX 2080, and so forth. But I suspect instead that the best benefit of having those cards and selling them to a small volume of enthusiasts is that they cause hundreds of thousands of other enthusiasts to buy a GTX 1060 or GTX 1070.

          So an improved Vega 64 successor could lose money for AMD on its own but still be worthwhile by driving sales of the RX 580 and its successor.
          I agree with this idea on one condition: you have to believe you can deliver a chip that can compare with Nvidia's greatest.

          Vega 64 was not that chip. AMD's luck was the crypto boom and the fact that the chip was good at mining. If AMD can't make a chip that can beat at least the 2080, then there's no point to it.



          • #25
            Originally posted by valici View Post

            I agree with this idea on one condition: you have to believe you can deliver a chip that can compare with Nvidia's greatest.

            Vega 64 was not that chip. AMD's luck was the crypto boom and the fact that the chip was good at mining. If AMD can't make a chip that can beat at least the 2080, then there's no point to it.
            In the CPU market, AMD Ryzen does well by offering 80-90% of Intel's performance at greatly reduced cost. I think the same strategy could work for GPUs: it would be best if they had something to match the RTX 2080, but it wouldn't be bad to have something that comes close but is 30% cheaper.



            • #26
              Originally posted by ElectricPrism View Post
              Cool.

              I can't help but think of Arcturus from StarCraft 1.

              https://www.youtube.com/watch?v=aL5FM4OPPlE&t=53s
              Same character is in SC2, too.



              • #27
                Originally posted by valici View Post
                1. I was speaking about a high-end >500 mm² chip. Consoles do not have that GPU power.
                The Xbox One X has a 360 mm² die, including the CPU part, and is a little bigger than the RX 580.
                Then I misunderstood.
                Originally posted by valici View Post
                2. I don't think Sony can design a better GPU than AMD, which has 20 years or more of graphics design experience.
                All modern console graphics chips are derived from consumer GPU tech. It's not financially feasible to maintain two separate technologies. The same goes for Nintendo's Switch, which I believe uses Nvidia's Maxwell graphics.
                I never said Sony can design a better GPU itself, but that they have competency in this and many other related fields. They have some of the brightest game devs as well and are not limited by existing APIs (or by making old games and benchmarks run as fast as possible).
                So some features would not be there without their involvement.



                • #28
                  Originally posted by Michael_S View Post

                  In the CPU market, AMD Ryzen does well by offering 80-90% of Intel's performance at greatly reduced cost. I think the same strategy could work for GPUs: it would be best if they had something to match the RTX 2080, but it wouldn't be bad to have something that comes close but is 30% cheaper.
                  AMD was very smart with Ryzen; they currently have only two dies for the entire stack of CPUs, including EPYC:
                  - The so-called "Summit Ridge" die with 2 CCXs (the whole Ryzen desktop lineup except the 2200G and 2400G, plus EPYC)
                  - The so-called "Raven Ridge" die with 1 CCX + 11 Vega CUs (Ryzen mobile plus the 2200G and 2400G)
                  - And a third should come with only 1 CCX (the 2300X and 2500X)

                  But they can't do that with GPUs. AMD themselves said a multi-die solution isn't feasible at the moment (working as a single GPU).
                  So AMD can make powerful CPUs by gluing smaller dies together; for GPUs they must design big dies from the start.



                  • #29
                    Originally posted by Michael_S View Post

                    Me too, I didn't know it was the name of a star either. Thanks for that, Ungweliante. (By the way, is 'Ungweliante' a play on 'Ungoliant' from Tolkien's Silmarillion?)
                    Ungweliante is the Quenya form of Ungoliant, which is Sindarin. Probably at that moment I tried to register Ungoliant and it was taken.



                    • #30
                      I'd rather AMD fixed current hindrances and polished Ryzen APUs on Linux. Even on Fedora 29, I'm still getting "random" lockups on a Lenovo ThinkPad E485.

