AMD Radeon RX 7900 GRE Linux Performance


  • #31
    Originally posted by sophisticles View Post

    Hate to break it to you but most experts conclude that the human eye can only process between 30 and 60 frames per second.

    I arrived at my conclusions from the work I have done over the past 20+ years with video.

    When you go to a movie theater, movies are displayed at 24 fps and the lights are off; this is done to eliminate any competing information from light sources not originating from the screen, so your mind has the opportunity to process what it is seeing.

    At rates below 30 images per second we see individual frames in a discrete fashion, i.e. we can notice different details within each picture. As the number of images increases, they start to blend together, the appearance smooths out, and we lose the ability to discern different details among the images.

    Look up the soap opera effect.

    This is the reason why many people, including myself, prefer movies shot on film over films shot on digital.


    You can try it yourself: hold an object, say your hand, in front of your face, stationary, and look at it. Your mind is able to perceive all the details: pores, lines, cuts, etc.

    Now move your hand from right to left or left to right at a speed where it takes one second to cover the distance between the outer edges of your peripheral vision.

    What do you see?

    You will notice that you have lost the ability to see most of the fine details; your eye is not able to transfer enough unique information from each position of your hand, and it looks like a blur.

    There's a reason why most console games look better than the PC variants: the console versions are usually locked to a fixed frame rate.



    https://caseguard.com/articles/how-m...20per%20second.

    https://www.healthline.com/health/hu...-animal-vision
    Again, you never read my post, and you go on to repeat things you don't understand. If you are indeed a professional, you are garbage at it, judging from what you posted.

    Our optic nerves do not work with a frame rate. They capture photons as they come, instantly. They waste very little time transferring nerve signals to our brains. The maximum frame rate we can perceive is all about our brains, not our eyes. The math you did was bogus, garbage, unscientific, out of your behind, do you get it?

    Also, there have been experiments where people could perceive frame rates of up to 300 or even 400 fps. Yes, there are diminishing returns, but people usually can tell the difference. From 60 to 120 fps there is a significant difference for most people. Claiming that the human eye can't process more is absurd.
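
    Just do the frame-time arithmetic and you can see why that jump is noticeable (a throwaway sketch in Python, nothing in it beyond the division):

        # Per-frame time at common refresh rates, and the gap between 60 and 120 fps.
        for fps in (24, 30, 60, 120, 144, 240):
            print(f"{fps:>3} fps -> {1000.0 / fps:5.1f} ms per frame")

        # Going from 60 to 120 fps halves how long each frame sits on screen:
        print(f"60 -> 120 fps saves {1000.0 / 60 - 1000.0 / 120:.1f} ms per frame")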

    Esports players wouldn't spend insane amounts of money chasing the highest frame rates for competitive reasons, and they wouldn't play at the lowest graphical settings, if they didn't benefit from those frame rates.



    • #32
      Originally posted by sophisticles View Post
      Hate to break it to you but most experts conclude that the human eye can only process between 30 and 60 frames per second.
      This has nothing to do with the eyes. Also, the perception limit for most people is at least several hundred FPS.

      You are either trolling or your knowledge in computers/IT is near 0.



      • #33
        Originally posted by Panix View Post
        Wrong, except for the first sentence. They really screwed up on the RDNA 3. Inefficient, power hogs - defective designs - read about the problems running these gpus with multiple monitors/different refresh rates, super high junction/hotspot temps, overheating/high temps in general - that's just a few issues on the hardware design side.
        What the actual f.... are you smoking? I've been running my 7800 XT at high loads in a 41°C / 105.8°F ambient environment and it's been operating like a beast. Last season's extreme heat finally killed my RX 480. The only issue that I saw was the vapor chamber in the reference 7900s, and that was a production problem, not a design one.

        Originally posted by Panix View Post
        Their software stack/productivity side is a disaster - they are behind in practically everything compared to Nvidia's - in Blender, it's not even a contest - ZLUDA is a somewhat surprise but if you look at opendata blender (website) - the 7900 xtx only competes with a last gen Ampere 3080 card - a card that only has half the vram. In AI/ML - they are way behind but lots of ppl still have trouble using their cards in that area.
        Yes and no. Yes, their software stack is a disaster, but that's nothing new. AMD is a hardware company that loves open source. Nvidia, to this day, is more software oriented (relative to AMD). Nvidia's hardware is much better due to their monopoly, but not their modus operandi. AMD's high-end GPUs are a lot better than they were at the end of GCN; the cards are fast and power efficient again. The "AI" software foundations are still bad unless you have very specific hardware. This is where the GRE card shines, but I'll get back to that...

        AMD cards knock Nvidia out of the park if you're willing to get your hands dirty. Outlets like Tom's Hardware are still reporting that AMD gets ~15% of Nvidia's performance in "AI". When you say crap like "the 7900 XTX only competes with a last-gen Ampere 3080 card" you sound like a Tom's Hardware NPC... Don't be like Tom's Hardware. Be inspired by people like Seunghoon Lee, George Hotz and Vladimir Mandic, who are doing the hard work to show what AMD hardware is capable of.

        Originally posted by Panix View Post
        The GRE is another gimped gpu, a 'hacked' 7900 xt but the binning/hardware was so bad they had to cut it down to resemble a 7800 xt more than a 7900 series.
        This is AMD's first MCM GPU; heck, these are the world's first MCM production cards. It makes logical sense for AMD to produce as many 7900s as possible, since higher production volume reduces unit cost. This is exactly why we are seeing the GRE cards. It's not the best card for gaming, as you can see; I am happy that I got a 7800 instead. Compute, e.g. ROCm/PyTorch support, is another story: the GRE cards have much better support compared to the 7800 / 7700 / 7600 etc. https://www.guru3d.com/story/amd-roc...7900-gre-gpus/
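
        For what it's worth, a quick way to confirm that a ROCm build of PyTorch actually sees a given card (just a sketch, nothing GRE-specific about it; ROCm builds expose the GPU through the regular 'cuda' API):

            # Quick check that a ROCm build of PyTorch sees the GPU.
            import torch

            print("HIP runtime:", torch.version.hip)        # version string on ROCm builds, None on CUDA builds
            print("GPU visible:", torch.cuda.is_available())
            if torch.cuda.is_available():
                print("Device:", torch.cuda.get_device_name(0))  # should report the Radeon card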

        It's literally the same as Nvidia. If you produce things on big scale you will have room for x, x-Super, x-Ti, x-Ti-Super. It turns out today's consumers are happy to pay for this.



        • #34
          Originally posted by drakonas777 View Post
          You are either trolling or your knowledge in computers/IT is near 0.
          It's probably both.

          Of course no one with a science background would ever claim that the eye can only see X frames, because the eye just collects photons continuously. As soon as a certain number of photons have hit your eye you will see something; it doesn't matter whether it was a 0.002 s flash of blinding brightness or a dim 0.2 s flash.
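
          To put toy numbers on it (the rates below are made up; the only point is that what the eye integrates is rate times duration):

              # Purely illustrative: the eye integrates photons over time, so a short
              # bright flash and a long dim flash can deliver the same total light.
              bright_rate, bright_duration = 100_000, 0.002   # photons per second (made up), seconds
              dim_rate, dim_duration = 1_000, 0.2

              print("bright flash:", bright_rate * bright_duration, "photons")  # 200.0
              print("dim flash:   ", dim_rate * dim_duration, "photons")        # 200.0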



          • #35
            Originally posted by Jabberwocky View Post

            What the actual f.... are you smoking? I've been running my 7800 XT at high loads in a 41°C / 105.8°F ambient environment and it's been operating like a beast. Last season's extreme heat finally killed my RX 480. The only issue that I saw was the vapor chamber in the reference 7900s, and that was a production problem, not a design one.

            Yes and no. Yes, their software stack is a disaster, but that's nothing new. AMD is a hardware company that loves open source. Nvidia, to this day, is more software oriented (relative to AMD). Nvidia's hardware is much better due to their monopoly, but not their modus operandi. AMD's high-end GPUs are a lot better than they were at the end of GCN; the cards are fast and power efficient again. The "AI" software foundations are still bad unless you have very specific hardware. This is where the GRE card shines, but I'll get back to that...

            AMD cards knock Nvidia out of the park if you're willing to get your hands dirty. Outlets like Tom's Hardware are still reporting that AMD gets ~15% of Nvidia's performance in "AI". When you say crap like "the 7900 XTX only competes with a last-gen Ampere 3080 card" you sound like a Tom's Hardware NPC... Don't be like Tom's Hardware. Be inspired by people like Seunghoon Lee, George Hotz and Vladimir Mandic, who are doing the hard work to show what AMD hardware is capable of.

            This is AMD's first MCM GPU; heck, these are the world's first MCM production cards. It makes logical sense for AMD to produce as many 7900s as possible, since higher production volume reduces unit cost. This is exactly why we are seeing the GRE cards. It's not the best card for gaming, as you can see; I am happy that I got a 7800 instead. Compute, e.g. ROCm/PyTorch support, is another story: the GRE cards have much better support compared to the 7800 / 7700 / 7600 etc. https://www.guru3d.com/story/amd-roc...7900-gre-gpus/

            It's literally the same as Nvidia. If you produce things on big scale you will have room for x, x-Super, x-Ti, x-Ti-Super. It turns out today's consumers are happy to pay for this.
            Well, I'm not smoking the 'AMD fanboy' crack many are smoking here! LOL! Any search on TH, Reddit, LTT, forums etc. - anywhere ppl are talking about the 7900 series - turns up complaints about bad hotspot/junction temps, the multi-monitor/different-refresh-rates issue, and also very high power consumption when playing a video! A friggin' video. I dunno if the 7800 series suffers from any of these issues, but the 7900 XTX cards do. They only recently addressed the vapor chamber issue - and many ppl, including myself, WON'T touch a reference/MBA card for that reason alone. Sure, it's been fixed and addressed - and it doesn't happen anymore? The last bad batch has all been sent in for RMA/repair? Yeah, sure. I only look at AIB cards anyway - since I'm mostly looking at used unless AMD has a sale (no chance).

            AMD is a hardware company - but they can't come up with good hardware - their GPUs are woefully inefficient power-wise - Nvidia's Ada series is way more power efficient - it's just that Ngreedia gimps their cards and charges a small fortune. I get that - they're an evil company. But, guess what?!? - we only have TWO (sorry, Intel - until you catch up.....) CHOICES! Nvidia's software is the de facto first choice - that's not my fault, I am only reacting to what is out there. Give me a good alternative and I won't even look at Nvidia. But AMD doesn't put enough focus into their software - maybe they should, huh? A guy who isn't even originally their employee hacked CUDA and AMD had another option - but it was embarrassingly BETTER THAN HIP - so it looked bad on AMD, I guess, and some ppl even had the perspective, 'why go with that hacked version, when you can have the real thing - CUDA, with Nvidia?!?' I think AMD should have just gone with ZLUDA or some offshoot - since their HIP/ROCm just doesn't seem to have caught on. Whenever I read of ppl using it - most are struggling and performance seems to be inadequate or trailing Nvidia's....again.

            Btw, you included your ambient temps but not the real temps of the card? Also, where are you - that you have those temps? Some greenhouse? I think my place gets bad until the central A/C kicks on but that is awful. Put some oscillating fans in front of that PC, bud! :-)



            • #36
              Originally posted by Panix View Post
              AMD is a hardware company - but they can't come up with good hardware - their GPUs are woefully inefficient power-wise - Nvidia's Ada series is way more power efficient - it's just that Ngreedia gimps their cards and charges a small fortune. I get that - they're an evil company. But, guess what?!? - we only have TWO (sorry, Intel - until you catch up.....) CHOICES! Nvidia's software is the de facto first choice - that's not my fault, I am only reacting to what is out there. Give me a good alternative and I won't even look at Nvidia. But AMD doesn't put enough focus into their software - maybe they should, huh? A guy who isn't even originally their employee hacked CUDA and AMD had another option - but it was embarrassingly BETTER THAN HIP - so it looked bad on AMD, I guess, and some ppl even had the perspective, 'why go with that hacked version, when you can have the real thing - CUDA, with Nvidia?!?' I think AMD should have just gone with ZLUDA or some offshoot - since their HIP/ROCm just doesn't seem to have caught on. Whenever I read of ppl using it - most are struggling and performance seems to be inadequate or trailing Nvidia's....again.
              ZLUDA is built with ROCm/HIP; it says so right there on the ZLUDA git page, in the FAQ:
              • What underlying GPU API does ZLUDA use? Is it OpenCL? ROCm? Vulkan?

                ZLUDA is built purely on ROCm/HIP. On both Windows and Linux.
              ZLUDA isn't a HIP replacement (no pun intended); it just allows CUDA code to be executed using HIP underneath.
              And if you are going to shit-talk AMD on this, you might as well include Intel, since ZLUDA was initially made to work with Intel graphics cards; after they abandoned it, the maintainer contacted AMD. Both companies came to the same conclusion (it isn't worth it) and stopped funding it.
              If anything, ZLUDA proves that the current HIP implementation for Cycles is not using ROCm/HIP to its fullest potential (hell, not even ZLUDA does - it still lacks ray tracing support), and that HIP could be used for other CUDA applications too, but developers first need to start actually implementing it. It is the old chicken-and-egg problem that Linux users know all too well: user X doesn't want to come because app Y won't work, and app Y's developers won't work on integration because there is a lack of interest from users.

              Speaking of the Intel cards, they actually have pretty good Cycles performance for their price point and even have full ray tracing support (including on Linux, thanks to the fact it uses Embree, which was already used by their CPUs and even pre-dates HIP by a year). The major problem is that viewport and material-preview performance is incredibly bad due to the cards' poor OpenGL performance. That could be fixed by Intel on one hand, or on the other hand by Blender finally making the jump from OpenGL to Vulkan, which is something they actually have planned and which has been delayed a lot (it was supposed to happen by 2022).

              I would make a big deal out of it if it weren't for the fact that you could technically install another render engine like Hydra Storm, which already uses Vulkan and is basically an engine designed for viewport use (specifically with USD workflows, which are the standard for professional studios and are getting more and more accessible for indie projects). Arc cards could even use Radeon ProRender, which is also a Vulkan-based render engine with some Cycles node support. On paper these cards would be really good for beginners: 16GB, an amazing video encoder, low prices, just a shame about the OpenGL performance since it ties into the viewport, but that is not impossible to mitigate. Give Intel some credit, they might not have an expensive top-of-the-line flagship but their cards are getting quite functional and sometimes have insane budget value.

              Also "AMD is a hardware company - but, can't come up with good hardware", Ryzen begs to differ and so does the #1 spot for supercomputers. I get that their GPUs have some criticism worthy software optimization but most of the time the hardware is fine while we both seem to agree the opposite is true for Nvidia, it is well optimized on day 1 but man is there a glaring hardware problem staring at you (usually small bus/low bandwith or low memory unless you pay quite a premium). It sucks but you're just gonna have to weight the pros and cons and well sometimes they affect you, sometimes they don't, there are plenty of use cases for a GPU that don't involve CUDA, Optix, HIP or OpenAPI.



              • #37
                Originally posted by Panix View Post
                Well, I'm not smoking the 'AMD fanboy' crack many are smoking here! LOL! Any search on TH, Reddit, LTT, forums etc. - anywhere ppl are talking about the 7900 series - turns up complaints about bad hotspot/junction temps, the multi-monitor/different-refresh-rates issue, and also very high power consumption when playing a video! A friggin' video. I dunno if the 7800 series suffers from any of these issues, but the 7900 XTX cards do. They only recently addressed the vapor chamber issue - and many ppl, including myself, WON'T touch a reference/MBA card for that reason alone. Sure, it's been fixed and addressed - and it doesn't happen anymore? The last bad batch has all been sent in for RMA/repair? Yeah, sure. I only look at AIB cards anyway - since I'm mostly looking at used unless AMD has a sale (no chance).
                I suggest staying away from the echo chambers. I agree with staying away from reference cards; it's high risk, low reward IMO. AMD had driver issues on Windows in the middle of last year, but everything was sorted for me before the end of 2023. I monitor my GPU with GPU-Z and my temps are fine (junction or not). My refresh rates are fine. I run 3 displays, with my main display being a 2K 165Hz 10-bit HDR screen. I run Linux with a VFIO Windows VM, and I dual boot at times when I have time to play competitive multiplayer games that block VFIO. I haven't played many hours this year, but I at least briefly checked out Palworld, Enshrouded, Pacific Drive and Helldivers 2.

                Originally posted by Panix View Post
                AMD is a hardware company - but they can't come up with good hardware - their GPUs are woefully inefficient power-wise - Nvidia's Ada series is way more power efficient - it's just that Ngreedia gimps their cards and charges a small fortune. I get that - they're an evil company. But, guess what?!? - we only have TWO (sorry, Intel - until you catch up.....) CHOICES! Nvidia's software is the de facto first choice - that's not my fault, I am only reacting to what is out there. Give me a good alternative and I won't even look at Nvidia. But AMD doesn't put enough focus into their software - maybe they should, huh? A guy who isn't even originally their employee hacked CUDA and AMD had another option - but it was embarrassingly BETTER THAN HIP - so it looked bad on AMD, I guess, and some ppl even had the perspective, 'why go with that hacked version, when you can have the real thing - CUDA, with Nvidia?!?' I think AMD should have just gone with ZLUDA or some offshoot - since their HIP/ROCm just doesn't seem to have caught on. Whenever I read of ppl using it - most are struggling and performance seems to be inadequate or trailing Nvidia's....again.
                So MI300X isn't a GPU then?
                Spec              MI300X (1x PCIe 5)    H100 SXM (1x PCIe 5)
                Memory            192 GB                80 GB
                Mem Bandwidth     5.3 TB/s              3.35 TB/s
                Interconnect      896 GB/s              900 GB/s
                Peak Power        750 W                 700 W
                FP64 / Tensor     81.7 / 163.4          34 / 67
                FP32 / Tensor     163.4 / 1,300         67 / 989
                FP16 / Tensor     2,610 / NA            NA / 1,979
                BF16 / Tensor     2,610 / NA            NA / 1,979
                FP8 / Tensor      5,220 / NA            NA / 3,958
                INT8 / Tensor     5,220 / NA            NA / 3,958
                All compute numbers are measured in TFLOPs.



                ZLUDA is overhyped at this stage. Its power mostly comes from the CUDA compiler, and it's not very stable, nor does it have many features, for example cuDNN / OptiX. If you use it with games you might get flagged for using cheats. That said, it has huge potential. I agree with you that AMD should be investing in it and other software. Hardcore open source devs are taking ZLUDA over and adding support for it; they should get paid by AMD, but like I said previously, nothing has changed, AMD is still more a hardware company.

                ROCm is at least improving: Windows drivers, latest-gen consumer GPU support, end-to-end open source (waiting for Fedora 40 in like 6 weeks?). PyTorch still needs ROCm Windows support though, but if you want to do things by hand (like you have always needed to if you buy AMD) then you can use it. In this case AMD blows the socks off Nvidia.
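
                To give an idea of what "by hand" can look like (a rough sketch; HSA_OVERRIDE_GFX_VERSION is a real ROCm environment variable, but the 11.0.0 value is just what people commonly report for RDNA3 consumer cards - treat it as an assumption, officially supported cards like the GRE shouldn't need it):

                    # Minimal ROCm/PyTorch smoke test, assuming a ROCm build of PyTorch is installed.
                    import os
                    # Assumption: 11.0.0 is the override commonly used for RDNA3 consumer cards.
                    # It must be set before the ROCm runtime initializes, i.e. before importing torch.
                    os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "11.0.0")

                    import time
                    import torch

                    dev = torch.device("cuda")  # ROCm builds expose the GPU under the 'cuda' device name
                    a = torch.randn(4096, 4096, device=dev)
                    b = torch.randn(4096, 4096, device=dev)
                    torch.cuda.synchronize()

                    t0 = time.time()
                    c = a @ b
                    torch.cuda.synchronize()
                    print(f"4096x4096 matmul on {torch.cuda.get_device_name(0)}: "
                          f"{(time.time() - t0) * 1000:.1f} ms")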

                So again, stop parroting Reddit, LTT, Tom's Hardware and other echo chambers. These people only know "git clone nvidia-optimized-repo" and "webui.bat --no-half --precision full --skip-torch-cuda-test --use-directml" ... it's like they are trying to cripple their AMD GPUs as much as possible. Then they are off to post about how bad things are. It's a joke. The majority of people have no idea how to use the hardware, and it's AMD's fault (I'm not being sarcastic, AMD should spend more on software).

                Originally posted by Panix View Post
                Btw, you included your ambient temps but not the real temps of the card? Also, where are you - that you have those temps? Some greenhouse? I think my place gets bad until the central A/C kicks on but that is awful. Put some oscillating fans in front of that PC, bud! :-)
                My real temps are on windows FS. I live in South Africa, my house has bad insulation so fans don't really help. It's normal for it to be this warm outside even if I'm not that far from the coast, no greenhouse needed. I have seven 140mm fans on my Fractal Design Torrent so oscillating fans are just going to make more noise >.< AC will work but my computers don't take priority over my sleep lol.



                • #38
                  Originally posted by Jabberwocky View Post

                  I suggest staying away from the echo chambers. I agree with staying away from reference cards; it's high risk, low reward IMO. AMD had driver issues on Windows in the middle of last year, but everything was sorted for me before the end of 2023. I monitor my GPU with GPU-Z and my temps are fine (junction or not). My refresh rates are fine. I run 3 displays, with my main display being a 2K 165Hz 10-bit HDR screen. I run Linux with a VFIO Windows VM, and I dual boot at times when I have time to play competitive multiplayer games that block VFIO. I haven't played many hours this year, but I at least briefly checked out Palworld, Enshrouded, Pacific Drive and Helldivers 2.



                  So MI300X isn't a GPU then?
                  Spec              MI300X (1x PCIe 5)    H100 SXM (1x PCIe 5)
                  Memory            192 GB                80 GB
                  Mem Bandwidth     5.3 TB/s              3.35 TB/s
                  Interconnect      896 GB/s              900 GB/s
                  Peak Power        750 W                 700 W
                  FP64 / Tensor     81.7 / 163.4          34 / 67
                  FP32 / Tensor     163.4 / 1,300         67 / 989
                  FP16 / Tensor     2,610 / NA            NA / 1,979
                  BF16 / Tensor     2,610 / NA            NA / 1,979
                  FP8 / Tensor      5,220 / NA            NA / 3,958
                  INT8 / Tensor     5,220 / NA            NA / 3,958
                  All compute numbers are measured in TFLOPs.



                  ZLUDA is overhyped at this stage. Its power mostly comes from the CUDA compiler, and it's not very stable, nor does it have many features, for example cuDNN / OptiX. If you use it with games you might get flagged for using cheats. That said, it has huge potential. I agree with you that AMD should be investing in it and other software. Hardcore open source devs are taking ZLUDA over and adding support for it; they should get paid by AMD, but like I said previously, nothing has changed, AMD is still more a hardware company.

                  ROCm is at least improving: Windows drivers, latest-gen consumer GPU support, end-to-end open source (waiting for Fedora 40 in like 6 weeks?). PyTorch still needs ROCm Windows support though, but if you want to do things by hand (like you have always needed to if you buy AMD) then you can use it. In this case AMD blows the socks off Nvidia.

                  So again, stop parroting Reddit, LTT, Tom's Hardware and other echo chambers. These people only know "git clone nvidia-optimized-repo" and "webui.bat --no-half --precision full --skip-torch-cuda-test --use-directml" ... it's like they are trying to cripple their AMD GPUs as much as possible. Then they are off to post about how bad things are. It's a joke. The majority of people have no idea how to use the hardware, and it's AMD's fault (I'm not being sarcastic, AMD should spend more on software).

                  My real temps are on windows FS. I live in South Africa, my house has bad insulation so fans don't really help. It's normal for it to be this warm outside even if I'm not that far from the coast, no greenhouse needed. I have seven 140mm fans on my Fractal Design Torrent so oscillating fans are just going to make more noise >.< AC will work but my computers don't take priority over my sleep lol.
                  So, you're another AMD shill here, huh? Don't tell the truth about AMD GPU hardware - all the other sites are wrong...lol - even though they seem to be saying the same thing. You don't want to talk about the software, only about how good the hardware is? LOL! Okay, well, you have a 7800 XT - maybe all you've read about is that series? The 7900 XTX sounds like a disaster the more I read about it - constant hotspot/junction temp issues - overheating - etc. - and hey, you will probably need to repaste the heatsink since it can't keep the GPU cool enough? So, all these companies have probably cut corners to put out a cheaply made product and charge consumers through the nose. Do we see a pattern here? I thought only Nvidia and their AIB partners were bad? Perhaps you should read about thermal 'pump out' with the 7900 XTX series? The worst quality problem with Nvidia's recent hardware was the 4090 and its connectors - but AMD GPUs have way more problems - from vapor chambers to these constant temp/heat problems - and requiring thermal paste/pad application - so I'd probably have to take apart a GPU I just purchased - even used, these 7900 XTX cards cost over $1000!

                  At least, Q concedes there's a design problem as he gives up defending these faulty graphics cards and says to wait for the next gen, in so many words.

                  I'm glad ROCm is improving - maybe their next gen will be good at software and maybe AMD will figure out how to design a decent gpu that doesn't overheat in the meantime?

                  At least, their thousand buck heater is open source, though.



                  • #39
                    Originally posted by Panix View Post
                    So, you're another AMD shill here, huh? Don't tell the truth about AMD GPU hardware - all the other sites are wrong...lol - even though they seem to be saying the same thing. You don't want to talk about the software, only about how good the hardware is? LOL! Okay, well, you have a 7800 XT - maybe all you've read about is that series? The 7900 XTX sounds like a disaster the more I read about it - constant hotspot/junction temp issues - overheating - etc. - and hey, you will probably need to repaste the heatsink since it can't keep the GPU cool enough? So, all these companies have probably cut corners to put out a cheaply made product and charge consumers through the nose. Do we see a pattern here? I thought only Nvidia and their AIB partners were bad? Perhaps you should read about thermal 'pump out' with the 7900 XTX series? The worst quality problem with Nvidia's recent hardware was the 4090 and its connectors - but AMD GPUs have way more problems - from vapor chambers to these constant temp/heat problems - and requiring thermal paste/pad application - so I'd probably have to take apart a GPU I just purchased - even used, these 7900 XTX cards cost over $1000!

                    At least, Q concedes there's a design problem as he gives up defending these faulty graphics cards and says to wait for the next gen, in so many words.

                    I'm glad ROCm is improving - maybe their next gen will be good at software and maybe AMD will figure out how to design a decent gpu that doesn't overheat in the meantime?

                    At least, their thousand buck heater is open source, though.
                    Grow up



                    • #40
                      Originally posted by Jabberwocky View Post

                      Grow up
                      No, I want my gimped 7900 GRE!!!!!!!!!!!

