It Looks Like Intel Could Begin Pushing Graphics Tech More Seriously
Originally posted by starshipeleven:
You should adjust prices, because whoever thinks a 9800 GT is still worth anywhere near $200, or that a GTX 285 is still worth anywhere near that, is a moron.

Yeah, the Intel iGPU is so cool that it's better than cards I can find on eBay for like $50 tops, while a modern low-end NVIDIA card that costs less than $80 pwns it.
I bunched together two things:
- Desktop users, where the iGPU is in fact shut down but still wastes die space that would have been better used for something else, even just a bigger L3/L4 cache.
- Laptop users, where the iGPU has the screens attached so it is always on, and besides wasting die space it also wastes thermal budget on a part that is already very thermally constrained.
On a part that most likely has around 15 W total TDP (and a cooling system sized accordingly), yeah, that matters. I couldn't care less about overclocking, and a 4-core part is still a 4-core part even without an iGPU.
For laptops, like I said, if the Skylake GPU is running nothing but the desktop, it will produce milliwatts of heat. Besides, if you care about that 0.5-1% of heat output and the display can be driven by the discrete GPU, then just turn off the power to the iGPU: it's a PCIe device. (Ironically, my desktop GTX 960 consumes 15-30 watts at idle.)
I thought you cared about overclocking, because you were talking about stuff like "less power = more clocks". Also, that already exists anyway: Turbo/Boost is a TDP/thermal-dependent overclock.
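For what it's worth, the "just turn off the power to it" part can be sketched on Linux through PCI runtime power management in sysfs. This is only a sketch: the PCI address below is a placeholder, and whether the device actually powers down depends on its driver supporting runtime suspend.

```shell
#!/bin/sh
# Sketch: enable PCI runtime power management for a GPU on Linux.
# The PCI address below is a placeholder; find the real one with `lspci`.
GPU="0000:01:00.0"
CTRL="/sys/bus/pci/devices/$GPU/power/control"

STATUS="skipped (device not present or not writable)"
if [ -w "$CTRL" ]; then
  # "auto" lets the kernel runtime-suspend the device when idle;
  # "on" (the common default) keeps it fully powered at all times.
  echo auto > "$CTRL" && STATUS="auto"
fi
echo "runtime PM for $GPU: $STATUS"
```

On hybrid-graphics laptops the switching between iGPU and dGPU is usually handled by the driver stack (e.g. vga_switcheroo) rather than by hand-editing sysfs, so treat this as an illustration of the mechanism, not a tuning recipe.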
Originally posted by starshipeleven:
Ummm..... how did Intel HD graphics beat the iGPU of APUs, again?

Yeah, gamers want CPUs that perform, not CPUs that waste die area and performance on a bullshit iGPU they never use.
Originally posted by InsideJob:
Atom processors under Linux are just plain unstable. I finally got most of the CPU issues on my Bay Trail "Pentium" sorted out with recent kernels, but the state of video is still horrible. I have to disable desktop effects completely to get a usable system. The funny thing is that the situation under Winblows is just as bad... the "N-series" chips can't use the latest video drivers at all. You're basically stuck with whatever video driver the computer came with, FOREVER.

I remain skeptical Intel will ever get their graphics poop in a group.
Originally posted by starshipeleven:
Well, that's not exactly "competition", it's more like "obsolete crap off eBay".

Even the lowest-end new dedicated NVIDIA GPU performs twice as well as the iGPU; an iGPU can't compete with dedicated cards that have their own VRAM and their own thermal envelope.
Originally posted by Michael_S:
To back up what others, like starshipeleven, have said, the big problem there is that the overwhelming majority of consumers are not educated about what you need for a gaming PC. We visit forums like these and sites like Tom's Hardware, PCper, Anandtech, etc., and are surrounded by others who usually have a good understanding. But we represent a tiny corner of the technology-consumer world.

In my own family, it's extremely common to see someone spend big bucks on the CPU and then expect to play any game they want, when they can't even name the discrete graphics card in their machine, or don't have one at all.
Originally posted by starshipeleven:
Yes it is, if people can't fucking read or ask questions to experts it's their own problem. Really, this applies to every product.
Shopping for toasters, televisions, smart phones, books, shirts, plane fares, apartment rentals, and even pets is substantially less complicated than researching an appropriate purchase for a gaming PC.
But to your point, Intel can't put a label on a particular machine - especially since they're not selling the finished desktop or laptop, Dell/HP/Toshiba/whatever is - indicating its appropriate use and giving a definitive list of games and display resolutions that are supported. And some of the machines can be easily upgraded, and some can't.
Originally posted by caligula:
Many also exclusively buy gaming laptops. All my Everquest/WOW friends use laptops for gaming. Although those laptops might come with mid-range discrete GPUs, they will be outdated quite fast, and the upgrade path is truly expensive compared to a desktop, where you sell your dGPU for half its price and buy a new high-end card at roughly the same fixed price. Desktop gaming only costs like $150 a year plus random upgrade costs for other gear.

I mean, if you're a 27-year-old software developer living in an affordable apartment and making $70k or more per year ($100k or more in some big cities), then sure, get your $3,500 gaming laptop. But for someone with a family to support, or less of an income, resign yourself to a desktop instead of spending hundreds extra for a gaming experience that's still inferior.
Originally posted by Michael_S:
While I agree with you on the principles that apply here, I don't think it's fair to condemn non-technical people for screwing it up.
Originally posted by duby229:
That's not true, AMD smashed Intel's product lineup first with the Thunderbird, then with Thoroughbred/Barton, and then again with Sledgehammer/Winchester.