AMD Shows First Glimpse Of Radeon RX 6000 Series Graphics Card


  • #21
    Originally posted by damentz View Post

    I'm late to quoting you, but this is hilarious. The fact that the first thing you point out is the dual 8-pin connectors means you're completely out of touch with what matters to people who buy and play games. Power consumption only matters to the small percentage of gamers for whom heat output is important (a LAN cafe comes to mind), or who run off of solar power. During regular desktop usage, GPU power gating keeps power consumption in check while the card isn't under 100% utilization, so heat output may only be an issue during serious or long gaming sessions.

    This slightly reminds me of trolls on Twitter that attack game companies for revealing content offensive to particular individuals. They get their friends to announce they've decided NOT to buy the game due to such offensive content. However, it's quickly discovered that these trolls don't play games and maybe the last game they recall playing is Bejeweled back in 2012 on their phone. In other words, they're virtue signaling and never were in the market, but use "sound and fury" to make it look like everyone's offended. In reality, it's just these "gamers" that don't play games.

    Check the controversy section of this game: https://en.wikipedia.org/wiki/Ion_Fury
    Of course the first thing I point out is the dual 8-pin connectors; what else should I point out from a picture?
    I am a gamer from time to time and a movie watcher from time to time, and when I don't have time, I leave the computer on to do other stuff like downloading or compute tasks.
    Power consumption is very important to me and I don't give a fuck whether gamers care about it or not.
    A lot of gamers are so stupid anyway that they even use Windows 10 with all its spyware and crap just to get DX12 or 1-2 FPS more.
    I couldn't care less about their choices.
    And AMD should stop with this "for gamers only" attitude and marketing, because it's really annoying; I still remember the "game cache" marketing crap.



    • #22
      Kind of pains me that they use Fortnite to showcase it. A game that doesn't, and probably never will, run on Linux.



      • #23
        Originally posted by Danny3 View Post
        Two 8-pin PCI Express power connectors?
        WTF, how much is this GPU consuming?
        I would rather pay more upfront for a GPU with faster and more efficient HBM2 memory than slowly pay more in electricity for a slower memory type and lower performance.
        Sad to see more and more evidence that RDNA 2 will be a disappointing release.
        Team green went with 12-pin connectors.



        • #24
          Looking forward to seeing the performance of the mid-tier cards. Anything with 8+6 pin will do me, especially if it's at least 25% faster than the 5700 XT. If it really is 50% more efficient than RDNA, as they claim, then that's not impossible.

          Originally posted by 89c51 View Post
          Kind of pains me that they use Fortnite to showcase it. A game that doesn't, and probably never will, run on Linux.
          CS:GO would have been a logical choice considering how much Valve has contributed to AMD's Linux drivers (ACO, among other things). Anyway, it's just a tease. We really aren't missing much!
          Last edited by ResponseWriter; 15 September 2020, 05:15 AM.



          • #25
            Originally posted by Danny3 View Post
            Two 8-pin PCI Express power connectors?
            WTF, how much is this GPU consuming?
            I would rather pay more upfront for a GPU with faster and more efficient HBM2 memory than slowly pay more in electricity for a slower memory type and lower performance.
            Sad to see more and more evidence that RDNA 2 will be a disappointing release.
            Just because the power consumption didn't drop doesn't mean it's less efficient. Efficiency has nothing to do with how much power is being used; it's how much productive work (as opposed to heat) you're getting out of the power used. For example, a 300W card that renders twice as many frames per second as another 300W card is twice as efficient.
            Expect all high-end GPUs from now until forever to be 300W+. AMD didn't screw anything up here. If all you care about is getting Radeon VII performance with less power draw, I have complete confidence the RX 6000 series will offer something; it just won't be the top-end card.

            Also, your power bill shouldn't be of any concern. If you're running your GPU at max load 24/7, ok, maybe it'll make a dent in your power bill, but at that point it's probably earning you money. You have to spend money to earn money.



            • #26
              Originally posted by 89c51 View Post
              Kind of pains me that they use Fortnite to showcase it. A game that doesn't, and probably never will, run on Linux.
              It will have to, eventually. Epic is trying to build the Metaverse, and Fortnite is now their main product in that direction. It's supposed to be something like the next Second Life, and they are succeeding, but it will never be complete until they reach everybody everywhere, and that means every platform. Their multiplatform push has been strong so far, so there's no reason not to think Linux might be next.



              • #27
                Originally posted by bridgman View Post

                As long as we keep taking code names from the Wikipedia list of fish names (and the list of X11 colours) there's going to be a bit of fishiness.

                Hey, at least that means there is a chance we can get a Crimson Firefish...



                • #28
                  People look at reviews before they buy their GPUs, and they look at framerates, not power consumption (at least 90%+ of people).
                  That's why, as a manufacturer of such gaming GPUs, you push your chip to the maximum your cooling solution can handle; that way the "FPS" number can be increased by a few percent.
                  If we looked at reasonable operating points, we would probably see 15% less performance for 50% less power draw. However, if you can just increase Vcore and run 20% higher clocks, as a manufacturer of gaming GPUs you do it.
                  Unfortunately, this is the world we live in.

                  In the end someone will use this GPU to play games on a 60Hz monitor, and if the game is badly written, it will render at 300+ FPS and draw a lot of power without ever displaying most of those frames, since the monitor only shows every fifth one.
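
                  A frame limiter is the usual fix. Here is a minimal sketch of the idea; render_frame and the 60 FPS target are placeholders, since a real engine would hook this into its own main loop (or just enable VSync):

                  ```python
                  import time

                  TARGET_FPS = 60                  # match the monitor's refresh rate
                  FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds per displayed frame

                  def render_frame():
                      """Hypothetical stand-in for the game's real rendering work."""
                      pass

                  while True:
                      start = time.monotonic()
                      render_frame()
                      # Sleep off the rest of the frame budget instead of burning
                      # power on frames a 60Hz monitor will never display.
                      remaining = FRAME_BUDGET - (time.monotonic() - start)
                      if remaining > 0:
                          time.sleep(remaining)
                  ```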

                  For most GPUs you can set some sort of TDP limit for the different performance states via the driver, so you can cap it at a sane frequency / power draw without losing too much performance.
                  Modern GPUs change power profiles pretty quickly and use fine-grained clock gating to make maximum use of the TDP limit.
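
                  On Linux with the amdgpu driver, for example, the board power limit is exposed through the kernel's hwmon interface. A rough sketch follows; card0 and the 220W target are illustrative, writing the file needs root, and the exact hwmon path varies per system:

                  ```python
                  import glob

                  # amdgpu reports and accepts the board power limit via hwmon,
                  # in microwatts. card0 is assumed to be the discrete GPU here.
                  hwmon = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

                  with open(f"{hwmon}/power1_cap_max") as f:
                      cap_max = int(f.read())   # hardware ceiling, in microwatts

                  target = min(220 * 1_000_000, cap_max)  # illustrative 220W cap

                  # Needs root; the driver then clamps clocks and voltage so the
                  # board stays under the cap, trading a little performance.
                  with open(f"{hwmon}/power1_cap", "w") as f:
                      f.write(str(target))
                  ```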



                  • #29
                    Originally posted by brunosalezze View Post
                    350W from the Nvidia side doesn't sound power efficient. I sure hope this consumes less.

                    And on Fortnite, they showed the card's video outputs: it's 1x HDMI, 2x full-size DP and 1x USB-C.
                    Oooh, does that mean it'll work with displays that accept USB Type-C as input? (I think they do DisplayPort over USB Type-C.)



                    • #30
                      Originally posted by Ipkh View Post
                      Once again, AMD just doesn't really understand marketing. Two 8-pins isn't bad, as most high-end cards have them, but they shouldn't be highlighting that given their power-hungry reputation vs current Nvidia cards. I'm glad AMD has better open-source Linux drivers, but that only goes so far. If anything they ought to be teasing some performance numbers, not to mention some ray tracing details, considering the consoles are using custom RDNA variations.
                      It's completely ridiculous that Microsoft and Sony can't tout their true graphics prowess because AMD is being tight-lipped on RDNA2 architecture details. They definitely shouldn't let Nvidia's narrative stick, with its unsubstantiated performance claims and misleading slide deck.
                      AMD is riding a pretty big hype train with Ryzen and 7nm. I don't think they're too worried. And I don't think they were ever relying on open-source Linux drivers for marketing; that's something only we Linux users (and someone like Linus Torvalds) care about.

