Windows 11 vs. Linux Benchmarks For Intel Arc B-Series "Battlemage" Show Strengths & Weaknesses


  • tenchrio
    replied
    Originally posted by avis View Post

    Welcome to r/AMD and r/NVIDIA where this comment and the two below it will be downvoted to hell for being false.
    Popular opinion on Reddit isn't aiding your case; after all, Kamala lost despite Reddit believing she would win.

    Originally posted by avis View Post
    People who don't have games and don't play games claim some tech related to gaming that they've never used or tried ... sucks.

    BTW do you "game" on Intel UHD or AMD Vega that's in the cheapest Zen 2 APUs? My condolences.

    Is it Doom 2 or maybe your "GPU" can even drive Quake 1?

    God forbid I mention RTX. What RTX? You can only play 15 year old games.
    Ray Tracing being supported in only 159 games is a selling point? (Yes, 159: just go to Nvidia's RTX game engine page, set Type to "Games" and Ray Tracing to "Yes" and count them, or run $('table').find('tr.content:visible').length in the JS console; you will get 151, plus 8 more from the "Full RT" results.)

    Hell, let me use my massive jQuery skills further: grab the top 100 games from any Steam chart website, upper-case them, put them in an array, upper-case each td's text and compare, and we get... 8 games...

    10 if we include Black Myth Wukong and Cyberpunk with the full rt results.
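
    (For the curious, a rough Python sketch of that count; the titles below are made-up placeholders, not the actual lists scraped from Nvidia's RTX page or a Steam chart site. It boils down to a case-insensitive set intersection:)

        # Rough sketch of the count above; placeholder titles only, not the real
        # scraped data from Nvidia's RTX page or a Steam chart site.
        rtx_games = ["Cyberpunk 2077", "Alan Wake 2", "Black Myth: Wukong"]
        top_100 = ["Counter-Strike 2", "Cyberpunk 2077", "Elden Ring", "Dota 2"]

        # Upper-case both lists and intersect them, same idea as the jQuery version.
        overlap = {g.upper() for g in top_100} & {g.upper() for g in rtx_games}
        print(len(overlap), sorted(overlap))  # -> 1 ['CYBERPUNK 2077'] with these placeholders
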
    It's funny how Linux can run about 90% of the games in the Top 100 according to ProtonDB, and you will rant on about how people can't possibly want to (or shouldn't) play on Linux. But a feature with the inverse of that adoption rate becomes a selling point and an important feature in your eyes? Really?

    And not all of those games have DLSS, with (Full) RT + DLSS 3 or 3.5 reducing the total to 7. Even on its own, DLSS 3(.5) only covers 15 of the Top 100. DLSS 2 does better at 33 (still only a third, and with RT it drops to 9). Now you will probably whataboutism your way to FSR, but FSR isn't a selling feature for AMD GPUs since it is supported on Nvidia hardware as well; in fact, the non-frame-generation part of FSR 3.1 is supported as far back as the GTX 10 series.

    Now what was it you said again.... oh right:
    Originally posted by avis View Post
    In terms of performance per dollar, yeah, AMD is a tad better but only in raster:
    In games with real RT...

    Which makes up 90% of the top 100 games (the number goes to a full 100% for the top 10 by peak daily players; the closest game is Delta Force in 14th place).
    And in some of those 10 games the RT isn't even that great. With Elden Ring, I sure as hell am not running it with Ray Tracing, and neither are any of my 22 friends who own it: the RT barely looks any better in most areas but tanks the FPS considerably (and there's no DLSS to save you here). So how many "real RT" games are there even to care about? 2?

    And that isn't even considering the cases where AMD does kinda beat Nvidia in RT thanks to a more toned-down RT implementation, since not every game benefits from realistic lighting yet still implements RT to some degree. Evidence can be found in the RT section of the B580 review you posted, where AMD wins RE4 with RT enabled, and the RX 7700 XT arguably wins in price-to-performance against the RTX 4060 Ti 16 GB (more expensive than the 7700 XT in the current market), differing by only a couple of FPS in Alan Wake 2, Doom Eternal and F1 24.

    So god forbid you only play games with "Real RT", because the list of games with that feature and gameplay decent enough to keep an audience seems to be rather small.



  • avis
    replied
    Originally posted by Daktyl198 View Post

    Forgot I was talking to birdie, what a waste of my limited lifespan this has been lol. No use talking with people that have zero reading comprehension but 200% confidence.
    Welcome to r/AMD and r/NVIDIA where this comment and the two below it will be downvoted to hell for being false.

    People who don't have games and don't play games claim some tech related to gaming that they've never used or tried ... sucks.

    BTW do you "game" on Intel UHD or AMD Vega that's in the cheapest Zen 2 APUs? My condolences.

    Is it Doom 2 or maybe your "GPU" can even drive Quake 1?

    God forbid I mention RTX. What RTX? You can only play 15 year old games.
    Last edited by avis; 19 December 2024, 12:32 PM.



  • carewolf
    replied
    Originally posted by avis View Post

    I give credit where it's due regardless of the company.

    You, on the other hand, continue to claim that DLSS is subpar despite millions of NVIDIA users seeing with their own eyes that it's indeed great (at the quality preset).

    If DLSS had been "bad" as you claim, people would not have used it. The vast majority of NVIDIA users enable DLSS even when they get sufficient FPS from their GPU.

    That should say something to you, but, alas: "DLSS is bad, I've seen someone use it and I wasn't impressed". What a bad, stupid joke.

    Continue to pray to your 7900XT(X).
    I don't know anyone using DLSS, because it looks like shit, and if you add the frame-gen it feels like shit too. It only really outperforms old-fashioned upscaling, so it's for people who mismatched their GPU and screen resolution.



  • smitty3268
    replied
    Originally posted by avis View Post
    If DLSS had been "bad" as you claim, people would not have used it.
    Millions of people eat McDonald's every day, so I guess that makes them the king of great food.



  • Daktyl198
    replied
    Originally posted by avis View Post
    ...
    Forgot I was talking to birdie, what a waste of my limited lifespan this has been lol. No use talking with people that have zero reading comprehension but 200% confidence.



  • avis
    replied
    Originally posted by Daktyl198 View Post

    Hardly theoretical. My personal PC runs an AMD GPU, but my best friend runs an Nvidia 4080 (which I convinced him to buy) and I use his PC for gaming all the time. I can very easily tell when DLSS is on vs when it's off. It's very impressive image upscaling, but upscaling can never match native rendering. Maybe you don't play enough games, or maybe your eyes aren't what they used to be.

    Not sure why you're calling me a fanboy. Fanboy of who? AMD? The company I regularly criticize for dumb-ass decisions like spiking the prices of their underperforming GPUs to be nearly on-par with Nvidia's vastly superior GPUs? The one with the shitty drivers on the one platform that matters? If anything, you sound like the fanboy of Nvidia in this thread.
    I give credit where it's due regardless of the company.

    You, on the other hand, continue to claim that DLSS is subpar despite millions of NVIDIA users seeing with their own eyes that it's indeed great (at the quality preset).

    If DLSS had been "bad" as you claim, people would not have used it. The vast majority of NVIDIA users enable DLSS even when they get sufficient FPS from their GPU.

    That should say something to you, but, alas: "DLSS is bad, I've seen someone use it and I wasn't impressed". What a bad, stupid joke.

    Continue to pray to your 7900XT(X).



  • Daktyl198
    replied
    Originally posted by avis View Post
    Your theoretical speculations are worthless, childish and fanboyish: "I remember many years ago people criticised DLSS 1.0 for its blurriness, yeah, I guess it's still blurry but I've not tried it personally. I will continue however to claim it's just bad." Try that with your mom, not me.
    Hardly theoretical. My personal PC runs an AMD GPU, but my best friend runs an Nvidia 4080 (which I convinced him to buy) and I use his PC for gaming all the time. I can very easily tell when DLSS is on vs when it's off. It's very impressive image upscaling, but upscaling can never match native rendering. Maybe you don't play enough games, or maybe your eyes aren't what they used to be.

    Not sure why you're calling me a fanboy. Fanboy of who? AMD? The company I regularly criticize for dumb-ass decisions like spiking the prices of their underperforming GPUs to be nearly on-par with Nvidia's vastly superior GPUs? The one with the shitty drivers on the one platform that matters? If anything, you sound like the fanboy of Nvidia in this thread.



  • vein
    replied
    Intel's cards don't look that bad. That's great, since competition is always good for us consumers.

    But as for me, I will probably stay with AMD. The performance is great, and I've now got ROCm working very well and am running local LLMs on both my GPU and my APU. Very, very happy.



  • blackshard
    replied
    Originally posted by stormcrow View Post

    There were real games in the benchmark tests. Just because you don't play them doesn't mean they aren't real games that people play. Yall really should stop with this entitled crap that only your experiences and interests matter with everyone else irrelevant. The benchmarks may have no direct relevance to you, and you know what? That's perfectly ok! (GASP!)
    Which ones? I only see Counter-Strike 2 (2023) and Hitman III (2021); the other games are yquake2 and Xonotic, which are totally irrelevant for a 2024 GPU!

    Just take a look at the frames-per-second graphs: when your card is doing >= 300 fps (topping out, in some graphs, at something above 2000 fps!), it's just benchmark pornography; it doesn't get you anywhere. What is the purpose of that? Should I buy a 2024 GPU to play yquake2? Or, conversely, does the benchmark help me choose a card to play yquake2?

    I play yquake2 on 5-watt, ten-year-old Mali-class GPUs (seriously, I do! https://www.youtube.com/watch?v=IjfbuY48AgM ), not on 200-watt brand-new GPUs capable of thousands of gigaflops. Also, I have never played Counter-Strike or Hitman, so your prejudice is a total fail and your final thesis is obtuse as well, because those benchmarks not only have no relevance to me, they have no relevance to anyone.





  • pong
    replied
    Originally posted by Anderse4 View Post

    This has not been true for a long time. Also, IPEX is no longer required for PyTorch. I'm training a model on my A770 right now with a 14 GB allocation measured in xpu-smi. The deep learning stack is much better than it was a year ago.
    Thanks for the update; I'm glad it has improved in some ways, which I'll have to try to understand in more depth in time.

    Upon glancing over the bug report I linked, I noticed it is still open in IPEX, though it is possible they just did not do a good job of closing or updating the issue (or linking it) if they did do something to address it.

    I am / was aware that PyTorch 2.5 only recently gained initial official native "xpu" support outside of IPEX, though it is not clear to me how the two options compare in capability at this time, or to what extent the native xpu support is in most / all respects equal to or better than what is now achievable with IPEX + PyTorch. Obviously there is some older software out there that is still configured / written / documented only around IPEX use with PyTorch (including the, AFAIK, still-present PyTorch 2.5 IPEX support), so it would be nice for the bug to be fixed even if IPEX's relevance is waning over time / PyTorch versions, unless it's quite obsolete already. But in some cases it may be easy enough to port software that was using IPEX to just reference the native "xpu" device.
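
    As a minimal sketch of what that native path looks like (assuming a PyTorch 2.5+ build with the xpu backend and Intel's GPU runtime installed, and that torch.xpu mirrors the usual torch.cuda helpers; no IPEX import):

        # Minimal sketch: PyTorch 2.5+ native "xpu" support instead of IPEX.
        # Assumes a PyTorch build with the xpu backend and the Intel GPU runtime installed.
        import torch

        if hasattr(torch, "xpu") and torch.xpu.is_available():
            dev = torch.device("xpu")
            x = torch.randn(4096, 4096, device=dev)   # tensor allocated on the Arc GPU
            y = x @ x                                  # matmul dispatched to the xpu device
            torch.xpu.synchronize()                    # wait for the GPU work to finish
            print("ran on", torch.xpu.get_device_name(0))
        else:
            print("no xpu device available, falling back to CPU")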

    I also recently noticed this:


    ...which speaks of some possible workarounds at the compute-runtime / OpenCL / Level Zero levels, where the overall 4 GB limit is also otherwise imposed. Someone had mentioned coming up with DIY allocation workarounds in the other thread I mentioned, so maybe these changes improve upon or echo those unofficial community ones for OpenCL / Level Zero.

    I have been meaning to revisit the ARC7 compute status and development, but haven't had enough time to track what has happened in the last several months (besides the PyTorch 2.5 news), so it's good if it has improved in any way since I last investigated.

    I am curious whether the rumors about a Battlemage-series card with 24 GB of VRAM might come true for a general consumer release, and if so what that card will be, since that's the sort of possibility I was hoping to eventually see from a newer Arc card generation.

