Intel Arc Graphics A750 + A770 Linux Gaming Performance

  • #91
    Originally posted by Dukenukemx View Post
    The amount of software compatible with it is surprisingly good. You know, unlike the OpenGL 2.1 driver for M1/M2 devices.
    Right, but there is a big difference: no gamer would ever buy an Apple M1/M2, while Intel advertises the Arc GPUs specifically to gamers.

    Then gamers discover that Intel Arc gets roughly 1/3 of the performance in DirectX 9 titles...
    And Intel Arc GPUs do not yet work with Valve's Proton... that alone means most games do not run on Linux.

    That is also a false set of priorities: of course the software is compatible, but that is not the question at all. The question is how well or how badly the games run, and if you like micro-stuttering all around, then go for it.

    Originally posted by Dukenukemx View Post
    I'm not in the market yet for an upgrade, but when I am, you'd bet Intel is on the table. I would like to see it running DX11 and DX12 games on Linux, since on Windows it uses a DX11->DX12 wrapper that loses a lot of performance. I wonder if DXVK and VKD3D-Proton do a much better job.
    DXVK/Proton does not work at all right now on Intel Arc...

    and "upgrade" for most people it is not an upgrade at all they claim the 770 is 19% faster than my vega64 ... means its pointless to spend 400€ just to have 19% higher performance.

    "wrapper that greatly loses performance"

    people report that on directX9 games they have 1/3 of the performance...

    and people report even if the FPS in the games are good it has micro-stuttering problem...

    high FPS does not mean playable...

    In three weeks AMD releases the RDNA3 cards... I am sure that as soon as you see the RDNA3 cards, this will no longer be the case: "you'd bet Intel is on the table".



    • #92
      Originally posted by coder View Post
      What I'm really interested to know is how compute performance compares, but I know Michael will get to it when he can.
      Hey, didn't Mike get multiple free Intel Arc GPU cards? Crack of the whip... ;-) However, so far, Mike's Intel Arc 750/770 review has pretty much outclassed all the other reviews.

      Shrugs. Regardless, if I get my hands on one, I'll likely publish my unbiased results. Well, sort of unbiased... I'm just so sick of buggy closed-source drivers.

      Ditto on the problems likely being at the source-code level, i.e. driver and software bugs. Intel seems to have a good consumer reputation with most products. However, expect, at worst, roughly a 20% degradation compared with similar current graphics cards.



      • #93
        Originally posted by WannaBeOCer View Post
        You are hilarious; so you're going to forget that it's competing with the RX 6600 and RTX 3060, just because it outperforms a flagship GPU like the RX Vega 64 by 19% at 1080p? Most people buying an A770 will be gaming at 1440p, where it is 26% faster than the Vega 64.
        The Vega 64 is slower than a GTX 1080 Ti in gaming and will always remain slower. You're not fooling anyone. The RX Vega 64 was a power-hungry compute card; Vega is the foundation of CDNA. The A770 does use 31 W more than the Vega 64 at idle, but the Vega 64 uses 67 W more under load, and runs hotter and louder.
        Intel Arc GPUs have more functionality: ray tracing units, tensor accelerators (XMX engine), AV1, 16 GB of RAM, while costing much less than the RX Vega 64 did when it came out.
        For a company's first gaming GPU to outperform Nvidia's second-generation ray tracing cores is a massive achievement.
        "while costing much less than a RX Vega 64 when it came out."

        What a joke. No one cares about prices from 5 years ago; today you can get a Vega 64 for 250€ on eBay... in the USA even cheaper.

        You just don't get my GTX 1080 Ti argument: the A770 competes against GPUs that are 5 years old or older, which means Intel has a big problem; they are 5 years behind the competition.

        "The RX Vega 64 was a power hungry compute card. Vega is the foundation of CDNA. The A770 does use 31w more than the Vega 64 at idle but the Vega 64 uses 67 watts more under load, runs hotter and louder"

        You prove yourself a liar in only two sentences... you claim "the RX Vega 64 was a power-hungry compute card",
        but then you show the real numbers: "The A770 does use 31 W more than the Vega 64 at idle", and most people run at idle 95% of the time or more. And this: "the Vega 64 uses 67 W more under load" does not matter at all, because load is only 1-5% of the time. My card runs around 23 hours per day at idle and only 1 hour per day gaming... and even those gaming days are becoming rare.
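
        To make this concrete, here is a minimal sketch of the daily energy math, assuming the wattage deltas quoted above (31 W at idle, 67 W under load) and my own 23 h idle / 1 h gaming split:

        ```python
        # Daily energy difference (A770 minus Vega 64) under the duty cycle above.
        # Assumed deltas from this thread: A770 idles ~31 W higher; Vega 64 draws
        # ~67 W more under load. Usage split: 23 h/day idle, 1 h/day gaming.
        IDLE_DELTA_W = +31.0   # A770 minus Vega 64 at idle
        LOAD_DELTA_W = -67.0   # A770 minus Vega 64 under load (A770 draws less)
        IDLE_HOURS, GAME_HOURS = 23.0, 1.0

        daily_wh = IDLE_DELTA_W * IDLE_HOURS + LOAD_DELTA_W * GAME_HOURS
        print(f"A770 vs Vega 64: {daily_wh:+.0f} Wh/day, "
              f"{daily_wh * 365 / 1000:+.0f} kWh/year")
        # -> +646 Wh/day (about +236 kWh/year): under this usage pattern
        #    the idle delta dominates, which is exactly the point here.
        ```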

        "Intel Arc GPUs have more functionality: ray tracing units, tensor accelerators(XMX engine), AV1, 16GB of ram​"

        That's all true, but paying 400€ just to get 19-26% higher performance plus these features is still pricey.

        Again, if you think Intel is doing well because they compete against 5-year-old products like the GTX 1080 Ti, then you are out of your mind.



        • #94
          Originally posted by qarium View Post

          "while costing much less than a RX Vega 64 when it came out."

          what a joke you talk no one cares about the prices 5 years ago you can get a vega64 for 250€ on ebay... in the usa even cheaper.

          you just don't get my GTX 1080 Ti argument the 770 competes agaist gpus 5years or older this means intel has a big problem they are 5 years behind competition.

          "The RX Vega 64 was a power hungry compute card. Vega is the foundation of CDNA. The A770 does use 31w more than the Vega 64 at idle but the Vega 64 uses 67 watts more under load, runs hotter and louder"

          you proof by yourself that you are a liar in only 2 sentence... you claim "a vega64 is a power hungry compute card"
          but then you show real numbers: "The A770 does use 31w more than the Vega 64 at idle" and most people run 95% or more of the time in idle... and this: "the Vega 64 uses 67 watts more under load" does not matter at all because this is only 1-5% of the times. my card runs like 23 hours per day in idle and only 1 hour per day gaming... and even these days become rare.

          "Intel Arc GPUs have more functionality: ray tracing units, tensor accelerators(XMX engine), AV1, 16GB of ram​"

          thats all right. but but to pay 400€ to only get 19-26% higher performance and these features is still pricey.

          again if you think intel is good because they compete agaist 5 years old products like the GTX 1080TI then you are out of your mind.
          They're competing with the RX 6600 and RTX 3060, which are current-generation cards, while costing less than the competition. Intel's Alchemist GPUs support DX12 Ultimate, which Vega 10, Vega 20 and RDNA1 do not, while crushing AMD at ray tracing on their first attempt: https://www.digitaltrends.com/comput...ntel-arc-gpus/

          No one is telling you to upgrade, but recommending a power-hungry GPU like a Vega 64 for gaming is terrible. Vega GPUs were the best for mining since they are compute-focused cards; the majority of the Vega GPUs on eBay were heavily used for mining. For $40 more you can buy an A750, which has a longer warranty and more functionality.

          The Vega 64 is a power-hungry compute card; CDNA is based on Vega. High idle power usage is a common driver bug that every GPU vendor has experienced; even AMD ran into the issue recently.

          https://www.techpowerup.com/283656/l...er-consumption



          • #95
            Originally posted by WannaBeOCer View Post
            They're competing with the RX 6600 and RTX 3060, which are current-generation cards, while costing less than the competition. Intel's Alchemist GPUs support DX12 Ultimate, which Vega 10, Vega 20 and RDNA1 do not, while crushing AMD at ray tracing on their first attempt: https://www.digitaltrends.com/comput...ntel-arc-gpus/

            No one is telling you to upgrade, but recommending a power-hungry GPU like a Vega 64 for gaming is terrible. Vega GPUs were the best for mining since they are compute-focused cards; the majority of the Vega GPUs on eBay were heavily used for mining. For $40 more you can buy an A750, which has a longer warranty and more functionality.

            The Vega 64 is a power-hungry compute card; CDNA is based on Vega. High idle power usage is a common driver bug that every GPU vendor has experienced; even AMD ran into the issue recently.

            https://www.techpowerup.com/283656/l...er-consumption
            From my point of view the RX 6600 is a low-end card... the RX 6600 is slower than a Vega 64...

            Tell me, what good is ray tracing support anyway if the card is too slow for it?
            There are some games that support software ray tracing in shaders on the Vega 64, and Mesa has support for it as well...
            so the Vega 64 has ray tracing but is too slow, and the Intel cards have ray tracing but are too slow as well...

            "but recommending buying a power hungry GPU like a Vega 64 for gaming is terrible"

            In my case the card runs at idle 99% of the time... and the Intel cards currently have higher idle consumption...

            which means that, for me, it is not a power-hungry GPU at all...

            In three weeks the RDNA3 cards launch, and Vega, RDNA1 and RDNA2 cards will drop in price... claiming a ray tracing advantage that holds for three weeks is fraud; in three weeks RDNA3 will have better ray tracing than Intel Arc...



            • #96
              Originally posted by qarium View Post

              From my point of view the RX 6600 is a low-end card... the RX 6600 is slower than a Vega 64...

              Tell me, what good is ray tracing support anyway if the card is too slow for it?
              There are some games that support software ray tracing in shaders on the Vega 64, and Mesa has support for it as well...
              so the Vega 64 has ray tracing but is too slow, and the Intel cards have ray tracing but are too slow as well...

              "but recommending a power-hungry GPU like a Vega 64 for gaming is terrible"

              In my case the card runs at idle 99% of the time... and the Intel cards currently have higher idle consumption...

              which means that, for me, it is not a power-hungry GPU at all...

              In three weeks the RDNA3 cards launch, and Vega, RDNA1 and RDNA2 cards will drop in price... claiming a ray tracing advantage that holds for three weeks is fraud; in three weeks RDNA3 will have better ray tracing than Intel Arc...
              The A770/A750 are low-end cards... We won't see high-end Intel GPUs until the next generation, Battlemage. The RX 6600 is faster than the Vega 64 at 1080p/1440p and performs the same at 4K. https://www.techpowerup.com/review/g...-eagle/31.html

              The card is able to run a ton of games with ray tracing at 1080p/1440p, and I'm sure that with XeSS it will be even faster: https://www.techpowerup.com/review/i...c-a770/34.html

              Your point of view is wrong: you bought a flagship from 2017, so of course it's still competing with low-end cards in 2022. The idle power consumption is a driver bug, which I'm sure Intel will fix just as AMD did recently, as I previously mentioned. If you've been watching both AMD and Nvidia, the only way to increase gaming performance nowadays is to drastically increase power consumption; with the RTX 4090 they're using 450 W at stock.



              • #97
                Originally posted by rogerx View Post
                Hey, didn't Mike get multiple free Intel Arc GPU cards? Crack of the whip... ;-) However, so far, Mike's Intel Arc 750/770 review has pretty much outclassed all the other reviews.
                Be nice to Michael. It's a one-man operation and he doesn't take any days off (he has a wife and kid to support). He also developed and maintains PTS / OpenBenchmarking.org.

                Originally posted by rogerx View Post
                Shrugs. Regardless, if I get my hands on one, I'll likely publish my unbiased results. Well, sort of unbiased... I'm just so sick of buggy closed-source drivers.
                Intel has a "dev cloud" thing. Maybe you could apply for an account and try to run PTS on it.

                Or you could search OpenBenchmarking.org to see if someone has already run some compute benchmarks on it.
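
                If you do get one, here's a rough sketch of scripting a few compute runs through the PTS CLI. The test profile names below (pts/clpeak, pts/fahbench) are my guesses at relevant compute tests, so verify them with `phoronix-test-suite list-available-tests` first:

                ```python
                # Rough sketch: batch-run a couple of compute tests via the Phoronix
                # Test Suite CLI. Run `phoronix-test-suite batch-setup` once beforehand
                # so batch mode can answer the interactive prompts automatically.
                import subprocess

                # Assumed profile names; check list-available-tests for what exists
                COMPUTE_TESTS = ["pts/clpeak", "pts/fahbench"]

                for test in COMPUTE_TESTS:
                    subprocess.run(["phoronix-test-suite", "batch-benchmark", test],
                                   check=True)
                ```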

                I'm in no rush, since I'm a little ways off from buying one. I would need a new system to put it in, and that won't happen for at least a couple more months.

                Originally posted by rogerx View Post
                Ditto on the problems likely being at the source-code level, i.e. driver and software bugs. Intel seems to have a good consumer reputation with most products. However, expect, at worst, roughly a 20% degradation compared with similar current graphics cards.
                According to WannaBeOCer's link, the ray tracing performance is better than their conventional rendering performance, so that could give us some hope.

                I think it's plausible that their cards' compute performance is better than their gaming performance, relatively speaking. On a related note, I expect their XMX units to give them an advantage over AMD in deep learning.

                I'm pretty bummed about the rumor of no fp64 support, but I don't have an immediate need for it. It would be nice to be able to do things like small matrix inversions on the GPU without hurting performance too badly. A fair compromise would've been to give them scalar-only fp64.
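
                To illustrate why fp64 matters for that kind of work, here is a small numpy sketch (CPU-side, purely to show the precision effect) solving an ill-conditioned 8x8 Hilbert system in fp32 versus fp64; without fp64 on the GPU, you would be stuck with results like the fp32 one:

                ```python
                # Why fp64 matters for small matrix solves: fp32 vs fp64 on an
                # ill-conditioned 8x8 Hilbert matrix (condition number ~1e10).
                import numpy as np

                n = 8
                A64 = np.array([[1.0 / (i + j + 1) for j in range(n)]
                                for i in range(n)])
                A32 = A64.astype(np.float32)

                x_true = np.ones(n)                       # known solution
                x64 = np.linalg.solve(A64, A64 @ x_true)  # fp64 solve
                x32 = np.linalg.solve(A32, A32 @ x_true.astype(np.float32))

                print("fp64 max error:", np.abs(x64 - x_true).max())  # tiny
                print("fp32 max error:", np.abs(x32 - x_true).max())  # orders
                #   of magnitude larger: fp32 precision is exhausted here
                ```
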
                Last edited by coder; 08 October 2022, 09:07 PM.



                • #98
                  Originally posted by qarium View Post

                  Right, but there is a big difference: no gamer would ever buy an Apple M1/M2, while Intel advertises the Arc GPUs specifically to gamers.
                  That VTuber mostly tested things like web browsers and other UI-related stuff. OpenGL isn't just for games.
                  Then gamers discover that Intel Arc gets roughly 1/3 of the performance in DirectX 9 titles...
                  All DX9 games run on potatoes. What people care about is DX11 performance.
                  And Intel Arc GPUs do not yet work with Valve's Proton... that alone means most games do not run on Linux.
                  Who told you this?
                  That is also a false set of priorities: of course the software is compatible, but that is not the question at all. The question is how well or how badly the games run, and if you like micro-stuttering all around, then go for it.
                  I do not like micro-stuttering. If that is the case, then good luck, Intel.
                  DXVK/Proton does not work at all right now on Intel Arc...
                  Where is this info?
                  In three weeks AMD releases the RDNA3 cards... I am sure that as soon as you see the RDNA3 cards, this will no longer be the case: "you'd bet Intel is on the table".
                  AMD screwed up when they charged more for the motherboard than for the CPU with the Ryzen 7000 series. It seems that AMD and Nvidia don't know what a recession is. Also, it doesn't matter that AMD is coming out with RDNA3, because we do want Intel to compete; this isn't a sports team where we root for one side. Wait and see how many GPUs AMD releases at $300. The answer is zero, at least for the next 6 months to a year. Intel at least priced their GPUs at mainstream levels.



                  • #99
                    Originally posted by Dukenukemx View Post
                    AMD screwed up when they charged more for the motherboard than for the CPU with the Ryzen 7000 series. It seems that AMD and Nvidia don't know what a recession is.
                    So far, all we have are the high-end boards. Hopefully, the lower-end boards will be more affordable.

                    FWIW, I've long criticized PCIe 5.0 as a step too far, too soon: it inevitably adds cost to motherboards for something consumers simply don't need, not to mention that it burns more power.

                    I think that, if Intel hadn't jumped to PCIe 5.0 in Alder Lake, AM5 would still be on PCIe 4.0.



                    • Originally posted by coder View Post
                      I think that, if Intel hadn't jumped to PCIe 5.0 in Alder Lake, AM5 would still be on PCIe 4.0.
                      I doubt that; it would have locked them into PCIe 4.0 for the next 3 generations.

