Originally posted by sophisticles View Post
Intel Arc B580 Delivers Promising Linux GPU Compute Potential For Battlemage
"Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."
Originally posted by Svyatko View Post
Big internal HW changes to better fit games, needs rework for GPGPU.
What concerns me is that this is not a very good card for Blender artists. As an artist, most of the time your GPU is going to be near (but not quite) idle as you create your scenes. It only kicks into high gear when you ask Blender to render the scene. So for the majority of your time you're going to be burning 20-30 watts more than a card that may cost $50 more, depending on sales. I'm sure there are people who don't care about that, but the power use adds up in aggregate, plus cooling the ambient temperature in the room, especially if you live in an area with hot summers (like I do) and/or where energy bills are expensive (like much of Europe and some regions of the US). I was hoping Battlemage would solve my problem of which card to get if I want to dive into Blender in a big way without Windows, but it doesn't. Not with that power envelope, regardless of future Blender render performance optimizations. I'll be looking at AMD or Nvidia instead, and that's unfortunate.
Last edited by stormcrow; 13 December 2024, 10:28 AM.
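The "adds up in aggregate" point can be put in rough numbers. This is only a back-of-the-envelope sketch: the 25 W extra draw, 8 hours a day of scene work, and 0.30 EUR/kWh rate are all illustrative assumptions, not measured figures for any card.

```python
# Rough annual cost of extra near-idle GPU draw.
# All three inputs below are assumptions for illustration only.
extra_watts = 25            # assumed extra near-idle draw vs. a competing card
hours_per_day = 8           # assumed daily scene-editing time in Blender
rate_eur_per_kwh = 0.30     # assumed electricity price

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
annual_cost_eur = extra_kwh_per_year * rate_eur_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/year -> {annual_cost_eur:.2f} EUR/year")
```

Under these assumptions that works out to roughly 73 kWh and about 22 EUR per year per machine; small on its own, but it compounds across machines and over a card's lifetime, plus the extra heat to cool away.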
Originally posted by stormcrow View Post
They don't need to make the same mistake AMD made with their client cards and ROCm/HIP. Gaming isn't driving GPU purchases anymore, even for client systems, and especially not on Linux. Much of gaming has moved to consoles, or to casual gaming on mobile form factors. Compute is the driving force behind GPU performance these days, with the desktop market taking a back seat to data-center cash cows. Annoying, but understandable from a business perspective.
What concerns me is that this is not a very good card for Blender artists. As an artist, most of the time your GPU is going to be near (but not quite) idle as you create your scenes. It only kicks into high gear when you ask Blender to render the scene. So for the majority of your time you're going to be burning 20-30 watts more than a card that may cost $50 more, depending on sales. I'm sure there are people who don't care about that, but the power use adds up in aggregate, plus cooling the ambient temperature in the room, especially if you live in an area with hot summers (like I do) and/or where energy bills are expensive (like much of Europe and some regions of the US). I was hoping Battlemage would solve my problem of which card to get if I want to dive into Blender in a big way without Windows, but it doesn't. Not with that power envelope, regardless of future Blender render performance optimizations. I'll be looking at AMD or Nvidia instead, and that's unfortunate.
Originally posted by sophisticles View Post
You mean like Mark Cuban?
"Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."
Originally posted by sophisticles View Post
Actually, what Elon said was that he was taking Tesla private at $420 a share, as a joke for his pot-loving girlfriend.
The SEC was not amused and fined him 10 million bucks for his little joke.
My point is that it's really easy to say stuff on Twitter, whether or not it's true. And if it makes you and your personally owned business look good, why not say it? Not to say he's just completely making stuff up; maybe there was some brief conversation where the number came up, or some third-party evaluation where the number was bandied about as a possibility. But I'm skeptical of just taking his word that he was seriously offered that much and turned it down. Show me the paperwork in that case, because otherwise I'm assuming it was just a "joke".
Last edited by smitty3268; 14 December 2024, 08:10 PM.
Originally posted by ms178 View Post
I am surprised; apart from some odd results, this looks very impressive for a 250 USD card.
Previous-generation Arc also came with 16GB VRAM, and in many cases today a mid-range 8-core CPU renders graphics and processes ML tasks faster in software than those GPUs do.
That makes the 16GB pointless, because a CPU with 32+GB of RAM does the same job faster.
The 3060 12GB exists, but many people agree that "you do not need 12GB on that slow a GPU" and prefer the much faster 4060.
A correct benchmark would be popular modern multiplayer games that don't need much VRAM (they run fine even under 4GB); just look at their performance, stutters, and 1% lows.
Also test on an actual low-to-mid-level CPU to see whether there are problems related to CPU speed (not on a 9800X3D, which alone costs as much as two of these GPUs).