Radeon GPUs Are Increasingly Competing With NVIDIA GPUs On Latest RadeonSI/RADV Drivers


  • #51
    Originally posted by Leopard View Post

    So we cannot say this surpassed your expectations just because this 50-60% figure came a little bit late?

    Sorry , since my native language is not English it is hard for me to understand sometimes.
    Bridgman means one can write a generic driver and reach roughly that performance level; beyond that point things get harder and harder, fine tuning and enhancements take more and more time, and in the end the performance improvements become smaller and smaller, and so on...

    Kind of saying that Windows has this "Project ReSX", while we have "Project Mareko" As always, the best driver is a boring driver, yes, the most boring one, also without so many regressions

    Last edited by dungeon; 03-20-2018, 04:27 AM.



    • #52
      Originally posted by dungeon View Post

      Bridgman means one can write a generic driver and reach roughly that performance level; beyond that point things get harder and harder, fine tuning and enhancements take more and more time, and in the end the performance improvements become smaller and smaller, and so on...

      Kind of saying that Windows has this "Project ReSX", while we have "Project Mareko"

      I get it now, thanks.



      • #53
        Originally posted by treba View Post
        Really impressive. I always wonder what kind of patented stuff nvidia and amd have in their proprietary drivers that make it impossible to open source them. It doesn't seem to be that much of an advantage anymore
        I know that nvidia stabbed 3dfx to death with patents on things that were simple and actually well known. It's just that nvidia gave them a spin so they could be patented.
        If you removed address decoder logic, you were just saving pennies. But if you removed address decoder logic and branded it as easier access for the host CPU, you suddenly had a patent.



        • #54
          Originally posted by rene View Post
          Over a decade ago Matrox and then ATi drivers were already quite fine under Linux, too, … ;-)
          You forgot 3dfx... 3dfx was the first, and the first to go open source. They were also the first to open up their hardware specs/programming manuals.
          As a matter of fact, it's due to an old 3dfx card that I got from my boss that I lost interest in programming and started gaming :-(.



          • #55
            Originally posted by Lord_Phoenix View Post
            So pleased with the results. It's very impressive how far AMD drivers went in a year+ time frame. I'd love to swap my GTX1060 to RX580 or maybe even Vega56, but I need a new PSU for that since they are much more power hungry. Is there any known issue right now with AMD graphics that would be a regression for me coming from nVidia camp? The monitor output is now merged, right?
            Originally posted by Dick Palmer View Post
            Not meaning to dampen proceedings... but... some efficiency data would have been nice, to qualify the "performance" comparisons.
            AMD hardware isn't anywhere near as disproportionately power hungry as people think. When you run synthetic benchmarks with an unlimited frame rate, yes, AMD hardware runs very hot and gulps down watts, but once you turn on v-sync, the power usage suddenly isn't that noteworthy.

            From what I recall, when it comes to compute performance, in some cases AMD hardware actually has better performance-per-watt than Nvidia.
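If you want to check this on your own machine, recent kernels expose the GPU's average power draw through hwmon. A minimal sketch in Python, assuming an amdgpu card that provides the `power1_average` sensor (which reports microwatts; the exact hwmon path varies per system):

```python
import glob

def microwatts_to_watts(raw: str) -> float:
    """amdgpu's hwmon interface reports power1_average in microwatts."""
    return int(raw.strip()) / 1_000_000

def read_gpu_power():
    """Return the current average GPU power draw in watts, or None if no sensor is found."""
    for path in glob.glob("/sys/class/drm/card*/device/hwmon/hwmon*/power1_average"):
        try:
            with open(path) as f:
                return microwatts_to_watts(f.read())
        except OSError:
            continue  # sensor present but unreadable; try the next card
    return None

if __name__ == "__main__":
    watts = read_gpu_power()
    if watts is None:
        print("no amdgpu hwmon power sensor found")
    else:
        print(f"GPU power: {watts:.1f} W")
```

Run it once with a game uncapped and once with v-sync on to see how much the cap changes the draw.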



            • #56
              Originally posted by audi.rs4 View Post
              If a 390x/480/580 becomes available for under $150, I would likely make the switch. I am actually hoping the mining all busts, and the market is flooded with used mining cards, and I can pick one up for under $100.
              I also hope that this silly mining bubble will burst; there is hardly any good in it.
              But please keep in mind: you do not know the history of the individual card that is sold to you. Maybe it was running at the outer rim of its specs (or beyond) 24/7 for a long time? I can't imagine that being good for the card.
              Moreover, some hardware retailers said that most of the cards don't even reach (e.g.) Europe. They are probably bought up right after they leave the factory in Taiwan (or wherever else). So maybe the Asian market will be flooded, but the North American, European, ... markets maybe not so much.


              I'm happy I got my RX 560 (and that's not even a card miners usually go for) when prices were roughly coming back down to the original suggested launch prices, maybe +15 euros. Not long after, the prices started to rise again.
              Stop TCPA, stupid software patents and corrupt politicians!



              • #57
                I think what John & team were foreseeing when they said they might only reach 70% of the blob driver's performance is that they were also trying to go without game-specific profiles that might bloat the drivers. I have no clue what percentage of the Windows driver is just game-specific optimizations, but I guess it is fairly considerable.

                It's awesome to see that they did this (as it seems) with generic improvements. NVIDIA's binary blob, on the other hand, probably contains game- or benchmark-specific optimizations, and we do not know how far the games / the ports of Windows games were tested and optimized, or in which direction.



                • #58
                  Originally posted by Tomin View Post
                  That Tomb Raider result is interesting. Is there something about that game that works better on AMD or has Nvidia regressed? I suppose Mesa rendering was correct so that is not an issue.
                  Isn't it known that those Eidos titles (Tomb Raider, Hitman) tend to run better on AMD in general anyway? Maybe it has something to do with how they are coded to use the hardware.



                  • #59
                    Originally posted by schmidtbag View Post
                    AMD hardware isn't anywhere near as disproportionately power hungry as people think. [...] From what I recall, when it comes to compute performance, in some cases AMD hardware actually has better performance-per-watt than Nvidia.
                    That's true, but I look at it more like Dehir does. On the one hand, Vega seems to hit boundaries in specific scenarios. One is the 8 GiB of VRAM in the standard versions, which is why I find the Frontier Edition much better balanced.
                    On the other hand, when you compute graphics you mainly need computational power, and it doesn't make much of a difference as long as there are no other significant bottlenecks.
                    And Vega has tons of it, in various shapes, and when you make use of those shapes it should eventually perform better than the competition. As an example, when you render hair you don't need very precise numbers. If you compute hair with that in mind, it can either improve performance at the same number of objects, or improve image quality with more objects without a decrease in performance. Of course RPGs are more predestined to make use of this than other genres, and it is a little bit of work.
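Vega's double-rate FP16 ("rapid packed math") is one example of this. A rough NumPy illustration of the precision/size trade-off behind it (the buffer name and sizes here are made up, and this only models storage and rounding error, not GPU throughput):

```python
import numpy as np

# A hypothetical hair-strand position buffer: fp16 halves the storage
# and bandwidth of fp32, at the cost of a coarser mantissa.
strands_fp32 = np.random.rand(100_000, 3).astype(np.float32)
strands_fp16 = strands_fp32.astype(np.float16)

print(strands_fp32.nbytes // strands_fp16.nbytes)  # fp16 buffer is 2x smaller

# Worst-case rounding error introduced by the fp16 conversion: for
# coordinates in [0, 1) this stays well below a thousandth of a unit,
# which is invisible for hair geometry.
max_err = float(np.abs(strands_fp32 - strands_fp16.astype(np.float32)).max())
print(f"worst-case rounding error: {max_err:.6f}")
```

The same budget that holds N strands in fp32 holds 2N in fp16, which is the "more objects at no performance cost" trade mentioned above.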

                    In my opinion the Vega GPU is absolutely fine, but I see the main problem in its distribution. That can probably be solved with Raven Ridge and embedded EPYC, because in all of these chips there is a little Vega, and in all the Linux "consoles" powered by them there will be a little Vega...
                    But there is no doubt that it will still take a long way of continuous work for AMD to convince professional hardware customers. You really can't imagine how irrationally and uninformedly many of these people buy hardware, although it's their job. It will take time for some of them to even notice that there is a significant difference between Ryzen and Bulldozer.
                    But the distribution of these consoles will help Linux the same way it helps Vega and AMD. Both are amazing products that need to gain market share in order to get more support from third parties. I am very optimistic, though, because the price-to-performance of the APUs will make the difference, and I really hope that the 7nm process will underpin these benefits with significantly more shaders and/or CPU cores.

                    I mean, when you look at NVIDIA's actions they look pretty mean, but on the other hand they are very desperate. They have no clue what else they could do to get out of the situation they are in. The financial numbers still look very positive, but they are losing the entry market for discrete GPUs to the cheap, power-saving APUs, and it is foreseeable that they will eventually lose the mid-range soon as well. The mining situation is very positive for their business because it made discrete AMD GPUs unreasonably expensive for gaming, but the APUs are in another market, independent of memory chips. There will be a point in time when they can no longer preserve the convenient situation that games are usually optimized just for their hardware, because it will be too expensive without entry-level GPU sales.
                    And with the massive majority of GPUs being integrated chips instead of discrete cards, I am very sure that this path might indeed lead to a significant market share for Vega GPUs.

                    Originally posted by Adarion View Post
                    I also hope that this silly mining bubble will burst, there is hardly any good in it.
                    I don't see why people call it a bubble! If we exclude the ecological issues, it is technically not easy to argue that the currency's value can rationally decrease, other than people simply stopping buying it for an irrational reason.
                    If there were stable currencies that aren't coupled, it would not be necessary to do this strongly ecologically harmful stuff.
                    If certain local authorities did not subsidize electrical power for industry, so that it pays only 2 cents while a private citizen has to pay 30 cents in some countries, it would not be economically reasonable to this extent.
                    If authorities did not rationalize criminal acts against the privacy of citizens, like the limitation of free speech and free information, with the hypocritical potential prevention of abstract crime, it would not be necessary for people to create new ways of making anonymous transactions.

                    In my opinion these are mainly negative effects of antisocial global misguidance. There is a fluid transition between an imbalance and a massive imbalance, which can even result in civil war. I very much hope that the world's leaders notice the imbalance that has emerged over the last 15 years and regain the prudence to protect every individual and restore the separation of powers, the homogeneity of markets and the preservation of communal infrastructure before we get serious issues here.

                    And when an autonomous car runs over a person in an area where, as a human, you would usually drive very carefully instead of exceeding the speed limit, the police must not tell people that it would have been hard for a human driver to prevent it, when the car apparently hasn't even braked at all.
                    That's just bullshit. A person has been robbed of her life, and there is nothing material that could ever compensate for this incident. It is absolutely reckless to allow autonomous machines on public streets when you haven't ensured that these machines are able to recognize dangerous situations and anticipate the possible mistakes of others just like real people. Because people do make mistakes, and I have never seen any machine that is able to react to completely unexpected behavior like a human.
                    I can't remember a point in my life when people were more greedy, dishonest and disrespectful toward life than today.



                    • #60
                      Originally posted by Adarion View Post
                      But please keep in mind: You do not know about the history of the individual card that is sold to you then. Maybe it was running at the outer rim of specs (or beyond) 24/7 for a long time? Can't imagine that this is good for the card.
                      Cards that were used for mining are actually better to buy second-hand.

                      Silicon chips (CPUs, GPUs) tend to be damaged by temperature changes a lot more than by a consistent high temperature. Having a consistent load on the card actually doesn't damage it as much.

                      With gaming, you launch a game, the gpu heats up quickly, you play for a few hours, and then it cools down, rinse and repeat. These temperature cycles are very bad for the silicon and damage it over time (microscopic expansion and contraction of the materials due to the temperature putting repeated strain). Kind of like bending a wire back and forth over and over again. It physically wears out the material and eventually the strain is too much and it snaps.

                      With mining, the card is running 24/7 and stays at a consistent (albeit hot) temperature. There are no temperature differences to physically strain the materials. This is much more sustainable and better for the gpu, despite the high temperature.

                      The thing that usually gets damaged on mining cards is the fan. If you buy a used card from a miner, you are probably going to have to replace the fan. Other than that, it is probably going to work great.

                      Because of this, configuring your gaming system to automatically switch to mining on the gpu whenever you are not playing games could actually prolong the life of your card, because it avoids the temperature cycles by keeping it consistently under load.
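A back-of-the-envelope sketch of that argument, using a Coffin-Manson style fatigue model in Python. All constants here are made up for illustration (the scaling constant, the exponent, and the cycle counts are assumptions, not real solder-joint data); only the shape of the comparison matters:

```python
def cycles_to_failure(delta_t, c=1e7, exponent=2.0):
    """Coffin-Manson style estimate: N_f = C * (dT)^-q.

    Larger temperature swings cost disproportionately more life.
    Constants are illustrative, not measured solder-joint data.
    """
    return c * delta_t ** -exponent

def annual_damage(cycles_per_year, delta_t):
    """Fraction of fatigue life consumed per year (Miner's rule: damage adds linearly)."""
    return cycles_per_year / cycles_to_failure(delta_t)

# Gaming: one full heat-up/cool-down swing per day, assumed ~50 C delta.
gaming = annual_damage(cycles_per_year=365, delta_t=50)
# Mining: steady 24/7 load, assumed one power cycle a month at ~60 C delta.
mining = annual_damage(cycles_per_year=12, delta_t=60)

print(f"gaming consumes roughly {gaming / mining:.0f}x more fatigue life per year")
```

Even with the mining card assumed hotter per swing, the sheer number of daily gaming cycles dominates the damage under this model, which is the point the post is making.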

