AMD Announces The Radeon RX 6700 XT For $479


  • #91
    I guess what I’m saying is that the Apple M1 should tell us that the future is more soldered RAM, less CPU power, more accelerators



    • #92
      Originally posted by TemplarGR View Post
      No, really, the correct price for this is $200, at best. And I am being generous. The GPU market has been overinflated for years now.
      So you're saying that GPUs should sell well below cost once you factor in R&D.

      Are you thinking about a system of subsidies from other markets, or from governments, or is the idea that manufacturers would just take turns losing money until they closed their doors?

      Governments have subsidized things like rice, alcohol and fuel at various times but subsidizing gaming GPUs seems like a stretch.
      Last edited by bridgman; 04 March 2021, 12:01 PM.


      • #93
        Originally posted by pete910 View Post
        Yeah, these aren't gonna be any easier to get hold of!
        And no chance in hell at RRP!!!!!
        Saw a local speculator earlier selling his fresh 6900 XT and asking €1600 for it..



        • #94
          Originally posted by bridgman View Post

          So you're saying that GPUs should sell well below cost once you factor in R&D.

          Are you thinking about a system of subsidies from other markets, or from governments, or is the idea that manufacturers would just take turns losing money until they closed their doors?

          Governments have subsidized things like rice, alcohol and fuel at various times but subsidizing gaming GPUs seems like a stretch.
          How much of the money people are paying reaches AMD, and how much is filling the pockets of speculators? Ramp up production, for Christ's sake.



          • #95
            Originally posted by lyamc View Post
            I guess what I’m saying is that the Apple M1 should tell us that the future is more soldered RAM, less CPU power, more accelerators
            I think Apple is going the wrong way... ARM is already dead because it is not an open-source ISA, and their GPU is proprietary IP too, instead of open source.

            Just imagine a computer with OpenPOWER and an open-source GPU,
            and NVRAM (non-volatile random-access memory) instead of a separate SSD and regular RAM,

            and also with all of this:
            OpenCAPI
            ECC RAM for the CPU (maybe later for the GPU too)
            An easy way to repair it, e.g. cleaning the copper fins of the cooler
            An open-source operating system
            Open-source drivers for the hardware, like the GPU
            RAM and hard drive easily upgradeable

            Then you will see the ARM bullshit and the Apple bullshit simply die.


            • #96
              Originally posted by bridgman View Post
              So you're saying that GPUs should sell well below cost once you factor in R&D.
              Are you thinking about a system of subsidies from other markets, or from governments, or is the idea that manufacturers would just take turns losing money until they closed their doors?
              Governments have subsidized things like rice, alcohol and fuel at various times but subsidizing gaming GPUs seems like a stretch.
              This man only talks bullshit... oh please... this is complete idiocracy, it is no longer democracy, it is idiocracy now...
              These people who want cheap stuff do not understand anything.

              A 3/5/7 nm fab is expensive.
              GDDR6/GDDR6X/HBM2 is expensive.

              The only way to make a cheap GPU for gamers is this: you backport RDNA to 12 nm GlobalFoundries and you use DDR5 instead of the expensive memory types, maybe with Infinity Cache too.

              And I don't get why this would not be great... my Vega 64 is 14 nm GlobalFoundries and it has 13 TFLOPS.

              A 256-bit DDR5 interface gives 0.225 TB/s.

              With a backport of RDNA to 12 nm it should be possible to have a 16 TFLOPS card.
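              For what it's worth, those numbers roughly check out. A minimal back-of-the-envelope sketch in Python, assuming the commonly cited Vega 64 figures (4096 shaders at roughly 1.55 GHz boost) and a hypothetical DDR5-7000 part on a 256-bit bus (both are assumptions, not taken from the post):

# Rough sanity check of the figures quoted above; the clock and transfer rate
# are assumptions, not AMD specifications.

def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: 2 FLOPs (one FMA) per shader per clock."""
    return 2 * shaders * clock_ghz / 1000.0

def bandwidth_tbs(bus_bits: int, transfer_rate_mts: float) -> float:
    """Peak memory bandwidth in TB/s: bus width in bytes times transfer rate."""
    return (bus_bits / 8) * transfer_rate_mts * 1e6 / 1e12

print(f"Vega 64 peak FP32: {fp32_tflops(4096, 1.55):.1f} TFLOPS")   # ~12.7, i.e. the "13 TFLOPS" above
print(f"256-bit DDR5-7000: {bandwidth_tbs(256, 7000):.3f} TB/s")    # ~0.224, i.e. the "0.225 TB/s" above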


              • #97
                Originally posted by aht0 View Post
                How much of the money people are paying reaches AMD, and how much is filling the pockets of speculators? Ramp up production, for Christ's sake.
                "Ramp up production"

                It takes 4-6 years to build a new 3/5 nm fab...

                How can you fix that in 1 year???


                • #98
                  Originally posted by aht0 View Post
                  How much of the money people are paying reaches AMD, and how much is filling the pockets of speculators?
                  I don't know the exact numbers, but AFAIK some fraction of MSRP (selling price of chip plus sometimes matched memory) goes to AMD, a smaller fraction of MSRP goes to the board partners, and the rest (balance of MSRP plus any selling price over MSRP) goes to a combination of distributors, retailers and scalpers.

                  Originally posted by aht0 View Post
                  Ramp up production, for Christ's sake.
                  We are running totally fab-limited in the middle of a global chip shortage, along with global shortages (and the resulting bidding wars) on a lot of the other parts that go into chips and boards, partially but not completely fuelled by a mining craze that consumes more GPUs than the entire industry can produce.

                  A new fab (using TSMC #18 as an example) is $18 billion and 4 years to full production, although limited production can start sooner.
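                  To make the first part concrete, here is a minimal sketch of where the money can go; every fraction below is a hypothetical placeholder for illustration only, not an actual AMD, partner, or retail figure:

# Purely illustrative split of a GPU's street price, following the description
# above. All fractions and the street price are hypothetical placeholders.

MSRP = 479.0          # RX 6700 XT launch MSRP (USD)
street_price = 900.0  # hypothetical scalped street price

amd_share = 0.45      # chip (plus sometimes matched memory) -- placeholder
partner_share = 0.25  # board partner -- placeholder

amd = MSRP * amd_share
partner = MSRP * partner_share
channel = MSRP - amd - partner   # distributors/retailers within MSRP
markup = street_price - MSRP     # anything above MSRP never reaches AMD

print(f"AMD: ${amd:.0f}, board partner: ${partner:.0f}, "
      f"channel at MSRP: ${channel:.0f}, markup to scalpers/retail: ${markup:.0f}")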


                  • #99
                    Originally posted by bridgman View Post
                    We are running totally fab-limited in the middle of a global chip shortage, along with global shortages (and the resulting bidding wars) on a lot of the other parts that go into chips and boards, partially but not completely fuelled by a mining craze that consumes more GPUs than the entire industry can produce.
                    A new fab (using TSMC #18 as an example) is $18 billion and 4 years to full production, although limited production can start sooner.
                    And people believe it can be done in 1 year... even if governments all over the world put in 1000 billion dollars,
                    it would still take 4+ years to build the fabs.


                    • Originally posted by bridgman View Post

                      So you're saying that GPUs should sell well below cost once you factor in R&D.

                      Are you thinking about a system of subsidies from other markets, or from governments, or is the idea that manufacturers would just take turns losing money until they closed their doors?

                      Governments have subsidized things like rice, alcohol and fuel at various times but subsidizing gaming GPUs seems like a stretch.
                      I don't think he's suggesting that, but his point stands: GPU (and CPU) prices have been overinflated for quite a while now. There are valid reasons for the inflated prices (the things you've mentioned), and it is possible that 100% of those reasons are valid, but that doesn't change the fact that the prices are way higher than what they "should be". It is similar to top soccer players, or any top athlete: nobody could convince me that a person whose sole job is to kick a ball is worth more than a person who does brain surgery. It's insane (because it suggests that entertaining X people is worth more than the lives of X people), but it is a very similar issue.

                      But this is more of a philosophical take on the topic, not the "real world" one.
                      TL;DR: it's a systemic fault, and these anomalies are just a product of it.

