
Intel Arc Graphics A750 + A770 Linux Gaming Performance


  • WannaBeOCer
    replied
    Originally posted by qarium View Post

    From my point of view the RX 6600 is a low-end card... the RX 6600 is slower than a Vega 64...

    What good is ray tracing support anyway if the card is too slow for it?
    Some games implement software ray tracing in shaders, which runs on the Vega 64, and Mesa has support for it as well...
    So that is ray tracing, just too slow... and the Intel cards have hardware ray tracing, but it is also too slow...

    "but recommending buying a power-hungry GPU like a Vega 64 for gaming is terrible"

    In my case the card sits idle 99% of the time... and the Intel cards currently have higher idle consumption...

    That means it is not a power-hungry GPU at all...

    In 3 weeks the RDNA3 cards launch, and Vega, RDNA1 and RDNA2 cards will drop in price... claiming better ray tracing for a 3-week window is fraud, because in 3 weeks RDNA3 will have better ray tracing than Intel Arc...
    The A770/A750 are low-end cards... We won't see high-end Intel GPUs until the next generation, Battlemage. The RX 6600 is faster than the Vega 64 at 1080p/1440p and performs the same at 4K. https://www.techpowerup.com/review/g...-eagle/31.html

    The card is able to run a ton of games with ray tracing at 1080p/1440p, and I'm sure that with XeSS it will be even faster: https://www.techpowerup.com/review/i...c-a770/34.html

    Your point of view is wrong: you bought a flagship from 2017, so of course it's still competing with low-end cards in 2022. The idle power consumption is a driver bug which I'm sure Intel will fix, just like AMD did recently, as I previously mentioned. As both AMD and Nvidia have shown, the only way to increase gaming performance nowadays is to drastically increase power consumption. We see the RTX 4090 using 450W at stock.



  • qarium
    replied
    Originally posted by WannaBeOCer View Post
    They're competing with the RX 6600 and RTX 3060, which are current-generation cards, while costing less than the competition. Intel's Alchemist GPUs support DX12 Ultimate, which Vega 10, Vega 20 and RDNA1 do not, while crushing AMD at ray tracing on Intel's first attempt: https://www.digitaltrends.com/comput...ntel-arc-gpus/

    No one is telling you to upgrade, but recommending buying a power-hungry GPU like a Vega 64 for gaming is terrible. Vega GPUs were the best for mining since they are compute-focused cards, and the majority of the Vega GPUs on eBay were heavily used for mining. For $40 more you can buy an A750, which has a longer warranty and more functionality.

    The Vega 64 is a power-hungry compute card; CDNA is based on Vega. High idle power usage is a common driver bug that every GPU vendor has run into; even AMD hit the issue recently.

    https://www.techpowerup.com/283656/l...er-consumption
    From my point of view the RX 6600 is a low-end card... the RX 6600 is slower than a Vega 64...

    What good is ray tracing support anyway if the card is too slow for it?
    Some games implement software ray tracing in shaders, which runs on the Vega 64, and Mesa has support for it as well...
    So that is ray tracing, just too slow... and the Intel cards have hardware ray tracing, but it is also too slow...

    "but recommending buying a power-hungry GPU like a Vega 64 for gaming is terrible"

    In my case the card sits idle 99% of the time... and the Intel cards currently have higher idle consumption...

    That means it is not a power-hungry GPU at all...
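
    For what it's worth, idle draw is easy to check yourself on Linux. A minimal sketch, assuming an amdgpu card whose driver exposes a power1_average hwmon sensor in microwatts (the exact sensor file varies by kernel version and vendor):

        # Minimal sketch: read the GPU's average power draw via the kernel hwmon
        # interface. Assumes an amdgpu card exposing power1_average (microwatts);
        # sensor file names vary by kernel version and vendor.
        from pathlib import Path

        def gpu_power_watts():
            for hwmon in Path("/sys/class/hwmon").iterdir():
                name = hwmon / "name"
                if name.exists() and name.read_text().strip() == "amdgpu":
                    sensor = hwmon / "power1_average"
                    if sensor.exists():
                        return int(sensor.read_text()) / 1_000_000  # uW -> W
            return None

        print(gpu_power_watts())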

    In 3 weeks the RDNA3 cards launch, and Vega, RDNA1 and RDNA2 cards will drop in price... claiming better ray tracing for a 3-week window is fraud, because in 3 weeks RDNA3 will have better ray tracing than Intel Arc...



  • WannaBeOCer
    replied
    Originally posted by qarium View Post

    "while costing much less than a RX Vega 64 when it came out."

    what a joke you talk no one cares about the prices 5 years ago you can get a vega64 for 250€ on ebay... in the usa even cheaper.

    you just don't get my GTX 1080 Ti argument the 770 competes agaist gpus 5years or older this means intel has a big problem they are 5 years behind competition.

    "The RX Vega 64 was a power hungry compute card. Vega is the foundation of CDNA. The A770 does use 31w more than the Vega 64 at idle but the Vega 64 uses 67 watts more under load, runs hotter and louder"

    you proof by yourself that you are a liar in only 2 sentence... you claim "a vega64 is a power hungry compute card"
    but then you show real numbers: "The A770 does use 31w more than the Vega 64 at idle" and most people run 95% or more of the time in idle... and this: "the Vega 64 uses 67 watts more under load" does not matter at all because this is only 1-5% of the times. my card runs like 23 hours per day in idle and only 1 hour per day gaming... and even these days become rare.

    "Intel Arc GPUs have more functionality: ray tracing units, tensor accelerators(XMX engine), AV1, 16GB of ram​"

    thats all right. but but to pay 400€ to only get 19-26% higher performance and these features is still pricey.

    again if you think intel is good because they compete agaist 5 years old products like the GTX 1080TI then you are out of your mind.
    They're competing with the RX 6600 and RTX 3060, which are current-generation cards, while costing less than the competition. Intel's Alchemist GPUs support DX12 Ultimate, which Vega 10, Vega 20 and RDNA1 do not, while crushing AMD at ray tracing on Intel's first attempt: https://www.digitaltrends.com/comput...ntel-arc-gpus/

    No one is telling you to upgrade, but recommending buying a power-hungry GPU like a Vega 64 for gaming is terrible. Vega GPUs were the best for mining since they are compute-focused cards, and the majority of the Vega GPUs on eBay were heavily used for mining. For $40 more you can buy an A750, which has a longer warranty and more functionality.

    The Vega 64 is a power-hungry compute card; CDNA is based on Vega. High idle power usage is a common driver bug that every GPU vendor has run into; even AMD hit the issue recently.

    https://www.techpowerup.com/283656/l...er-consumption



  • qarium
    replied
    Originally posted by WannaBeOCer View Post
    You are hilarious. So you're going to forget that it's competing with the RX 6600 and RTX 3060 just because it outperforms a former flagship GPU like the RX Vega 64 by 19% at 1080p? Most people buying an A770 will be gaming at 1440p, where it is 26% faster than the Vega 64.
    The Vega 64 is slower than a GTX 1080 Ti in gaming and will always remain slower. You're not fooling anyone. The RX Vega 64 was a power-hungry compute card. Vega is the foundation of CDNA. The A770 does use 31W more than the Vega 64 at idle, but the Vega 64 uses 67W more under load, and runs hotter and louder.
    Intel Arc GPUs have more functionality: ray tracing units, tensor accelerators (XMX engines), AV1, 16GB of RAM, while costing much less than an RX Vega 64 when it came out.
    For a company's first gaming GPU, outperforming Nvidia's second-generation ray tracing cores is a massive achievement.
    "while costing much less than a RX Vega 64 when it came out."

    what a joke you talk no one cares about the prices 5 years ago you can get a vega64 for 250€ on ebay... in the usa even cheaper.

    you just don't get my GTX 1080 Ti argument the 770 competes agaist gpus 5years or older this means intel has a big problem they are 5 years behind competition.

    "The RX Vega 64 was a power hungry compute card. Vega is the foundation of CDNA. The A770 does use 31w more than the Vega 64 at idle but the Vega 64 uses 67 watts more under load, runs hotter and louder"

    you proof by yourself that you are a liar in only 2 sentence... you claim "a vega64 is a power hungry compute card"
    but then you show real numbers: "The A770 does use 31w more than the Vega 64 at idle" and most people run 95% or more of the time in idle... and this: "the Vega 64 uses 67 watts more under load" does not matter at all because this is only 1-5% of the times. my card runs like 23 hours per day in idle and only 1 hour per day gaming... and even these days become rare.
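
    To put my duty-cycle argument into numbers, here is a back-of-the-envelope sketch using the two deltas quoted above (31W extra at idle for the A770, 67W extra under load for the Vega 64) and my 23h-idle/1h-gaming day, which is a claimed usage pattern, not measured data:

        # Back-of-the-envelope daily energy comparison using the deltas quoted above.
        # The 23 h idle / 1 h gaming split is the usage pattern claimed in this post,
        # not measured data.
        IDLE_DELTA_W = 31   # A770 draws ~31 W more than the Vega 64 at idle
        LOAD_DELTA_W = 67   # Vega 64 draws ~67 W more than the A770 under load
        IDLE_HOURS, LOAD_HOURS = 23, 1

        a770_extra_wh = IDLE_DELTA_W * IDLE_HOURS   # 713 Wh/day extra at idle
        vega_extra_wh = LOAD_DELTA_W * LOAD_HOURS   # 67 Wh/day extra under load
        print(f"Net: the A770 uses {a770_extra_wh - vega_extra_wh} Wh/day more")  # 646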

    "Intel Arc GPUs have more functionality: ray tracing units, tensor accelerators(XMX engine), AV1, 16GB of ram​"

    thats all right. but but to pay 400€ to only get 19-26% higher performance and these features is still pricey.

    again if you think intel is good because they compete agaist 5 years old products like the GTX 1080TI then you are out of your mind.



  • rogerx
    replied
    Originally posted by coder View Post
    What I'm really interested to know is how compute performance compares, but I know Michael will get to it when he can.
    Hey, didn't Mike get multiple free Intel Arc GPU cards? Crack of the whip... ;-) So far, though, Mike's Intel Arc 750/770 review has pretty much excelled past all the other reviews.

    Shrugs. Regardless, if I get my hands on one, I'll likely publish my unbiased results. Well, sort of unbiased... I'm just so sick of buggy closed-source drivers.

    Ditto on the problems likely being source-code-level, or driver- and software-level, bugs. Intel seems to have a good consumer reputation with most products. However, expect at worst a ~20% degradation compared with today's similar graphics cards.



  • qarium
    replied
    Originally posted by Dukenukemx View Post
    The amount of software compatible with it is surprisingly good. You know, unlike the OpenGL 2.1 driver for M1/M2 devices.
    Right, but there is a big difference: no gamer would ever buy an Apple M1/M2, while Intel advertises the Arc GPUs to gamers.

    Then the gamers discover that Intel Arc has 1/3 of the performance in DirectX 9 titles,
    and that Intel Arc GPUs do not yet work with Valve's Proton... this alone means most games do not run on Linux.

    This is also a false set of priorities: of course the software is compatible, but that is not the question at all. The question is how well or how badly the games run, and if you like micro-stuttering all around, then go for it.

    Originally posted by Dukenukemx View Post
    I'm not in the market for an upgrade yet, but when I am, you can bet Intel is on the table. I would like to see it running DX11 and DX12 games on Linux, since on Windows it uses a DX11->DX12 wrapper that costs a lot of performance. I wonder if DXVK and VKD3D-Proton do a much better job.
    DXVK/Proton does not work at all on Intel Arc right now...

    And for most people an "upgrade" would not be an upgrade at all: they claim the A770 is 19% faster than my Vega 64... which means it is pointless to spend 400€ just to get 19% higher performance.

    "wrapper that costs a lot of performance"

    People report that in DirectX 9 games they get 1/3 of the performance...

    And people report that even when the FPS numbers look good, there are micro-stuttering problems...

    High FPS does not mean playable...

    In 3 weeks AMD releases the RDNA3 cards... I am sure that as soon as you see them, "you can bet Intel is on the table" will no longer be true.



  • coder
    replied
    Originally posted by rogerx View Post
    So, does Gimp stutter too while editing photos?
    Not only that, but I also have to wonder whether there's anything about the hardware itself that makes it so heavily impacted by non-resizable BAR. It seems to me that it should be mostly a software issue, which means it could improve over time.
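
    Whether resizable BAR is actually active is at least easy to check from userspace. A minimal sketch, assuming Linux, that parses each display-class device's sysfs resource file (one "start end flags" hex triple per line); a GPU memory BAR much larger than the classic 256 MiB suggests ReBAR is in effect:

        # Minimal sketch: list BAR sizes for every display-class PCI device by
        # parsing /sys/bus/pci/devices/*/resource ("start end flags" per line, hex).
        from pathlib import Path

        for dev in Path("/sys/bus/pci/devices").iterdir():
            if not (dev / "class").read_text().startswith("0x03"):  # display controllers
                continue
            for i, line in enumerate((dev / "resource").read_text().splitlines()[:6]):
                start, end, _flags = (int(field, 16) for field in line.split())
                if end > start:  # unused BARs read as all zeros
                    size_mib = (end - start + 1) / 2**20
                    print(f"{dev.name} BAR{i}: {size_mib:.0f} MiB")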

    What I'm really interested to know is how compute performance compares, but I know Michael will get to it when he can.



  • rogerx
    replied
    I can't keep holding my breath on this one. I've noticed that almost every negative review of Intel Arc so far is Microsoft Windows and game focused, very likely authored by Windows-only users.

    So, does Gimp stutter too while editing photos? How about any application that isn't a 3D game? Hey, how about Adobe Photoshop? I would imagine the performance difference is negligible.

    Most Linux users likely rarely play any games, and are more focused on productivity, for obvious reasons. (Because Linux users are just better.)

    My preference is Bohemia Interactive games, and I have only purchased one game in more than a decade. I'm too busy to waste time and money on games here. I just desire freedom from proprietary nVidia driver h*ll, and a working discrete graphics adapter instead of slow integrated graphics. I would rather spend $300 testing an Intel discrete graphics adapter with an open-source driver than another $500 on an nVidia graphics adapter.
    Last edited by rogerx; 08 October 2022, 03:19 PM.



  • smitty3268
    replied
    Originally posted by rogerx View Post
    So instead of upgrading an NVIDIA GTX 670 on an older platform with an NVIDIA RTX 3060, which would provide a ~120% performance increase, upgrading with an Intel Arc 750/770 may only give a ~100% performance increase. For Linux users, since the drivers are open source with none of the proprietary driver artifacts, are we really losing anything by not buying an NVIDIA RTX 3060 or faster GPU? My guess is we are not. Or is this all wishful thinking?
    The 20%-30% loss in average framerate is a fairly minor downside in comparison to the 1% lows that reviewers have seen on Windows. Games stutter so badly that they're completely unplayable, regardless of the fact that the averages still appear to be over 60 FPS.
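
    That gap between averages and lows is easy to see from a frametime log, which is why an average can look fine while the game feels broken. A minimal sketch (reviewers' exact definitions of "1% low" vary; this one averages the slowest 1% of frames):

        # Minimal sketch: average FPS vs. "1% low" FPS from a list of frame times (ms).
        # Here "1% low" is the mean of the slowest 1% of frames, converted to FPS.
        def fps_stats(frametimes_ms):
            avg_fps = 1000 / (sum(frametimes_ms) / len(frametimes_ms))
            slowest = sorted(frametimes_ms, reverse=True)
            worst_1pct = slowest[:max(1, len(slowest) // 100)]
            low_fps = 1000 / (sum(worst_1pct) / len(worst_1pct))
            return avg_fps, low_fps

        # A steady 10 ms cadence interrupted by occasional 120 ms stutter spikes:
        frames = [10.0] * 990 + [120.0] * 10
        avg, low = fps_stats(frames)
        print(f"average: {avg:.0f} FPS, 1% low: {low:.0f} FPS")  # ~90 FPS avg, ~8 FPS low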

    Maybe the Linux drivers are different; I don't know. It would be an interesting test for Michael to run. But I sure wouldn't gamble $300 on that blindly without seeing someone try it out first.
    Last edited by smitty3268; 08 October 2022, 01:11 AM.



  • Dukenukemx
    replied
    Originally posted by qarium View Post

    I really want you to buy an Intel Arc 770... really, really, really... I can't wait for the day you admit you burned your hand on an Intel Arc product.
    The amount of software compatible with it is surprisingly good. You know, unlike the OpenGL 2.1 driver for M1/M2 devices. I'm not in the market for an upgrade yet, but when I am, you can bet Intel is on the table. I would like to see it running DX11 and DX12 games on Linux, since on Windows it uses a DX11->DX12 wrapper that costs a lot of performance. I wonder if DXVK and VKD3D-Proton do a much better job.
    Last edited by Dukenukemx; 07 October 2022, 09:09 PM.

