It Looks Like Intel Could Begin Pushing Graphics Tech More Seriously


  • #51
    Originally posted by wdb974

    Let's not forget the subpar support we've gotten for GPUs. How many times have I read "Intel GPUs not officially supported" on a game's requirements... Even for BG:EE. Ugh.
    The Steam recommendations on Linux seem bogus. For instance, they claim the games require more memory than on Windows. I wonder how that works. My Windows setup uses 1 GB when fully started, my Linux game machine 270 MB. That's roughly 750 MB of headroom. The game might recommend 2 GB on Windows and 3-4 GB on Linux. So the game supposedly uses 1 GB on Windows but 2.7 to 3.7 GB on Linux. Yeah, right.
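
    For what it's worth, the Linux side of that comparison is easy to script. A minimal sketch, assuming a reasonably recent kernel (the MemAvailable field appeared in 3.14); the 1 GB Windows figure still has to be read off Task Manager by hand:

    #!/usr/bin/env python3
    # Estimate idle memory use on Linux from /proc/meminfo.

    def meminfo_kib(field: str) -> int:
        """Return a /proc/meminfo field (e.g. 'MemTotal') in KiB."""
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith(field + ":"):
                    return int(line.split()[1])
        raise KeyError(field)

    total = meminfo_kib("MemTotal")
    available = meminfo_kib("MemAvailable")  # kernel's estimate of reclaimable RAM
    used_mib = (total - available) / 1024

    print(f"Idle usage: {used_mib:.0f} MiB of {total / 1024:.0f} MiB")
    print(f"Headroom vs. a 1 GiB Windows idle: {1024 - used_mib:.0f} MiB")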



  • #52
    Originally posted by starshipeleven
    You should adjust prices, because whoever thinks a 9800 GT is still worth anywhere near $200, or that a GTX 285 is still worth anywhere near that, is a moron.

    Yeah, the Intel iGPU is so cool that it's better than cards I can find on eBay for like $50 tops, while a modern low-end NVIDIA card that costs less than $80 pwns it.
    I was talking about MSRP, the prices at the time. I thought that was obvious enough to leave implied. Guess I was wrong. I was trying to dispel duby's incorrect notion that integrated graphics can't do 3D. Skylake can do 3D just as fast as a high-end card from 8 years ago. That's good enough to run games of 2009-era quality.

    I bunched together two things:
    - Desktop users, where the iGPU is in fact shut down but still wastes die space that would have been better used for something else, even just a bigger L3/L4 cache.
    - Laptop users, where the iGPU has the screens attached so it is always on, and besides wasting die space it also wastes thermal budget on an already thermally-constrained part.

    On a part that most likely has around a 15 W total TDP (and a cooling system sized accordingly), yeah, that matters.
    I couldn't care less about overclocking, and a 4-core part is still a 4-core part even without the iGPU.
    As for what an i7 usually is: getting that room spent on something you care about is impossible without switching binning trees. The i7 is binned in the consumer tree, and you're going to get what the consumer tree is designed for. If you want that room spent on more useful things, then stop buying consumer-grade hardware!

    For laptops, like I said, if the Skylake GPU is running nothing but the desktop, it will produce milliwatts of heat. Besides, if you care about that 0.5-1% of heat output and the display can be driven by the discrete GPU, then just cut power to it. It's a PCIe device. (Ironically, my desktop GTX 960 consumes 15-30 watts at idle.)
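
    On Linux the kernel already exposes that knob through sysfs runtime power management. A rough sketch; the PCI address 0000:00:02.0 is just the usual slot for an Intel iGPU, so check yours with lspci first, and run as root:

    #!/usr/bin/env python3
    # Allow the kernel to runtime-suspend an idle GPU via sysfs runtime PM.

    GPU = "/sys/bus/pci/devices/0000:00:02.0"  # example address; verify with lspci

    # "auto" permits runtime suspend; "on" would pin the device awake.
    with open(f"{GPU}/power/control", "w") as f:
        f.write("auto")

    # Report the current runtime-PM state ("active" or "suspended").
    with open(f"{GPU}/power/runtime_status") as f:
        print("GPU runtime status:", f.read().strip())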

    I thought you cared about overclocking because you were talking about stuff like "less power = more clocks". Also, that already exists anyway: Turbo/Boost is a TDP- and thermal-dependent overclock.
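
    That knob is even visible from userspace. A minimal sketch, assuming the intel_pstate driver is in use (the default on Skylake-era Intel CPUs):

    #!/usr/bin/env python3
    # intel_pstate exposes a global turbo switch: 0 = turbo allowed, 1 = disabled.

    with open("/sys/devices/system/cpu/intel_pstate/no_turbo") as f:
        print("Turbo enabled:", f.read().strip() == "0")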



  • #53
    Originally posted by starshipeleven
    Ummm... how did Intel HD graphics beat the iGPU of APUs, again?
    It doesn't need to. It already beats legions of once-high-end legacy cards, so people who aren't that much into gaming but want to play their old games can just upgrade the system and not lose anything. Their power consumption goes down, but everything continues to work. The iGPU is practically free, too, since it's included on the chip.

    I used Ivy Bridge and Haswell graphics for years. Many (old) games were playable. A 1440p screen worked just fine even on Ivy Bridge, and 1080p video playback was smooth. Now with the new iGPUs, 4K is smooth and tear-free. The same can't be said of nouveau, or sometimes even NVIDIA's driver. I can easily think of people who would be totally satisfied with such a setup, and of those who wouldn't. But it's easy to upgrade if you need more: invest $200-300 in a discrete GPU and you're good to go.

    Yeah, gamers want performant CPUs, not CPUs that waste die area and performance on a bullshit iGPU they never use.
    Currently many gamers think the 7700K is a better gaming CPU than a Ryzen 1800X or a Threadripper, even when comparing overclocked AMD against the 7700K. They wouldn't switch even if they got Ryzens for free.



  • #54
    Originally posted by InsideJob
    Atom processors under Linux are just plain unstable. I finally got most of the CPU issues on my Bay Trail "Pentium" sorted out with recent kernels, but the state of video is still horrible. I have to disable desktop effects completely to get a usable system. The funny thing is, the situation under Winblows is just as bad... the "N-series" chips can't use the latest video drivers at all. You're basically stuck with whatever video driver the computer came with, FOREVER.

    I remain skeptical Intel will ever get their graphics poop in a group.
    We still have some Windows (Cherry Trail) tablets at work. We've run the latest updates, but the iGPU still hangs and restarts every 30 minutes on Windows 10. The machines are pretty unstable even for web browsing. I'd imagine the stability could actually be better on Linux.



  • #55
    Originally posted by starshipeleven
    Well, that's not exactly "competition", it's more like "obsolete crap off eBay".
    Even the lowest-end new dedicated NVIDIA GPU runs twice as well as the iGPU; an iGPU can't compete with dedicated cards that have dedicated VRAM and their own thermal envelope.
    Of course the dedicated GPUs HAVE TO run faster; otherwise nobody would buy them. But the lowest-end GPUs actually use the same kind of memory with similar performance, for instance 64-bit DDR3. Intel iGPUs have really killed the potential sales of several generations of low-power GPUs from Nvidia. There's the GeForce 210, the 610-620, the 710-730, and after that only, recently, the 1030. It's clear why so many models are missing. All sorts of budget Nvidia GPUs had been available since 2001. It's an especially nasty situation for HTPC builders, since the missing models were often passively cooled.
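
    The memory point is easy to sanity-check with back-of-the-envelope numbers; the card specs below are typical examples, not exact figures:

    # Peak memory bandwidth: (bus width in bytes) x (transfers per second).
    def bandwidth_gb_s(bus_bits: int, mega_transfers: int) -> float:
        return bus_bits / 8 * mega_transfers * 1e6 / 1e9

    # A GeForce GT 710-class card: 64-bit DDR3 at ~1800 MT/s.
    print(f"64-bit DDR3-1800:  {bandwidth_gb_s(64, 1800):.1f} GB/s")   # ~14.4
    # An iGPU sharing dual-channel DDR3-1600 (128 bits combined).
    print(f"128-bit DDR3-1600: {bandwidth_gb_s(128, 1600):.1f} GB/s")  # ~25.6

    The iGPU has to share that bandwidth with the CPU, of course, but the numbers are in the same ballpark.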



  • #56
    Originally posted by Michael_S

    To back up what others, like starshipeleven, have said, the big problem there is that the overwhelming majority of consumers are not educated about what you need in a gaming PC. We visit forums like these and sites like Tom's Hardware, PCPer, AnandTech, and so on, and are surrounded by others who usually have a good understanding. But we represent a tiny corner of the technology-consumer base.

    In my own family, it's extremely common to see someone spend big bucks on the CPU and then expect to play any game they want - when they can't even name the discrete graphics card in their machine, or don't have one.
    Many also exclusively buy gaming laptops. All my EverQuest/WoW friends use laptops for gaming. They might come with mid-range discrete GPUs, but those get outdated quite fast, and the upgrade path is truly expensive compared to a desktop, where you sell your dGPU for half its price and buy a new high-end card at the same fixed price (the math is sketched below). Desktop gaming only costs something like $150 a year, plus occasional upgrade costs for other gear.
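
    A sketch of that upgrade math, with illustrative numbers rather than anyone's actual prices:

    # Buy at a fixed price each cycle, sell the old card at half that.
    card_price = 300               # hypothetical fixed price per upgrade
    resale = card_price / 2        # "selling your dGPU for half the price"
    net_per_cycle = card_price - resale
    print(f"Net GPU cost per yearly upgrade: ${net_per_cycle:.0f}")  # $150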



  • #57
    Originally posted by starshipeleven
    Yes it is. If people can't fucking read or ask questions of experts, it's their own problem. Really, this applies to every product.
    While I agree with you on the principles that apply here, I don't think it's fair to condemn non-technical people for screwing it up.

    Shopping for toasters, televisions, smartphones, books, shirts, plane fares, apartment rentals, and even pets is substantially less complicated than researching an appropriate purchase for a gaming PC.

    But to your point, Intel can't put a label on a particular machine - especially since they're not selling the finished desktop or laptop, Dell/HP/Toshiba/whatever is - indicating its appropriate use and giving a definitive list of supported games and display resolutions. And some of these machines can be easily upgraded, and some can't.



  • #58
    Originally posted by caligula
    Many also exclusively buy gaming laptops. All my EverQuest/WoW friends use laptops for gaming. They might come with mid-range discrete GPUs, but those get outdated quite fast, and the upgrade path is truly expensive compared to a desktop, where you sell your dGPU for half its price and buy a new high-end card at the same fixed price. Desktop gaming only costs something like $150 a year, plus occasional upgrade costs for other gear.
    I think "gaming laptops" are a huge waste of consumer money, and that an awful lot of people who buy them are flat-out uninformed. They don't understand the performance gap between mobile CPUs and desktop CPUs, and if they understand GPUs at all, they don't understand the gap between mobile and desktop GPUs.

    I mean, if you're a 27-year-old software developer living in an affordable apartment and making $70k or more per year ($100k or more in some big cities), then sure, get your $3,500 gaming laptop. But if you have a family to support, or less of an income, resign yourself to a desktop instead of spending hundreds extra for a gaming experience that's still inferior.



  • #59
    Originally posted by Michael_S

    While I agree with you on the principles that apply here, I don't think it's fair to condemn non-technical people for screwing it up.

    Shopping for toasters, televisions, smartphones, books, shirts, plane fares, apartment rentals, and even pets is substantially less complicated than researching an appropriate purchase for a gaming PC.

    But to your point, Intel can't put a label on a particular machine - especially since they're not selling the finished desktop or laptop, Dell/HP/Toshiba/whatever is - indicating its appropriate use and giving a definitive list of supported games and display resolutions. And some of these machines can be easily upgraded, and some can't.
    Maybe not a label, but they could design adequate GPUs; unfortunately, they haven't. Also, we're gamers, so we think in terms of gaming computers, but most people won't ever think that way. After all, you don't buy gaming smartphones either: you expect that what you pay for is going to work. That's true for Androids, not true at all for PCs, and that is entirely Intel's fault.



  • #60
    Originally posted by duby229
    That's not true. AMD smashed Intel's product lineup first with the Thunderbird, then with Thoroughbred/Barton, and then again with Sledgehammer/Winchester.
    And let's not forget Opteron/Athlon64. Intel invested billions into Itanium development, which was their answer to 64-bit computing. Itanium was obliterated in one fell swoop when AMD invented x86-64.

