It Looks Like Intel Could Begin Pushing Graphics Tech More Seriously


  • #31
    Originally posted by starshipeleven View Post
    Intel iGPUs (any iGPU) were never marketed for gaming; if people assume bullshit it's their own fault.

    Yeah, that said, I know that the iGPU in the E1 pwns anything less than a modern i5 iGPU (it should be around on par with, if not better than, an Intel HD 4000), and yet it still isn't marketed for gaming.

    Laptops with AMD APUs, like say the A10-9600, pwn any iGPU from Intel by a long shot. Except maybe Iris Pro, which is far more expensive (and not cost-effective vs laptop dedicated graphics) anyway.

    On such laptops you can actually play some games. For example I can play XCOM 1 on mine.

    Again still not marketed as "gaming".
    Yep, so you freely admit that in every generation it is Intel's products that set the bottom line? Hmm? Even though that includes almost everything they sell? That's what you just said. You've seen the market share numbers. You know exactly what game developers need to target, and you also know it's unworkably low.

    So knowing that Intel's market share is what it is, and that almost none of their GPUs are capable of adequate gaming, you are actually saying 75% of -ALL- computer users shouldn't game at all because they bought a computer that wasn't meant for gaming.....

    Douchebag for real.


    It's exactly that mentality that has caused Intel to harm PC gaming for decades. Nothing more and nothing less.
    Last edited by duby229; 15 July 2017, 05:47 PM.



    • #32
      Originally posted by duby229 View Post
      Multiple people on this thread have called Intel graphics fine.... Really? Apparently you guys never worked in a computer repair shop. Almost every time someone complains that their games don't work well, or that their computer is too slow, the solution is a cleanup plus a graphics card replacement or a new laptop sale. The reason is usually Intel's incapable graphics. Literally the only product they have that can be called OK-ish is Iris, and only the ones with eDRAM at that. And they are -WAY- overpriced.
      To back up what others, like starshipeleven, have said, the big problem there is that the overwhelming majority of consumers are not educated about what you need for a gaming PC. We visit forums like these and sites like Tom's Hardware, PCper, Anandtech, etc., and are surrounded by others who usually have a good understanding. But we represent a tiny corner of technology consumers.

      In my own family, it's extremely common for me to see someone spend big bucks on the CPU and then expect to play any game they want - when they can't even name the discrete graphics card in their machine or don't have one.

      (Edit): Sorry, I didn't connect what I was saying to what you wrote. My point is that most of the people who are upset by how games run on their integrated GPU should never have bought that hardware configuration in the first place. It would be fine for some use cases, but not the one they got it to handle.



      • #33
        Originally posted by Michael_S View Post

        To back up what others, like starshipeleven, have said, the big problem there is that the overwhelming majority of consumers are not educated about what you need for a gaming PC. We visit forums like these and sites like Tom's Hardware, PCper, Anandtech, etc., and are surrounded by others who usually have a good understanding. But we represent a tiny corner of technology consumers.

        In my own family, it's extremely common for me to see someone spend big bucks on the CPU and then expect to play any game they want - when they can't even name the discrete graphics card in their machine or don't have one.

        (Edit): Sorry, I didn't connect what I was saying to what you wrote. My point is that most of the people who are upset by how games run on their integrated GPU should never have bought that hardware configuration in the first place. It would be fine for some use cases, but not the one they got it to handle.
        That is -NOT- a consumer's responsibility. A consumer goes to Best Buy or Walmart and buys whatever is on sale. It is Intel's job to make sure that what they sell is capable of fulfilling expectations. And it's people like starshipeleven who set expectations so stupidly low that they are unworkable for 75% of people.



        • #34
          Can't wait for Intel to start claiming that APUs are glued-together garbage with no ecosystem



          • #35
            Originally posted by Opossum View Post

            I was looking into installing another (but relatively weaker) graphics card to do a few things like video encoding and display offloading (it turns out having a display connected doing nothing but rendering the desktop chops 1-5% off 3D performance). I was thinking of installing my old $200 GeForce 9800 GT (it was suuper fast when I first got it as a birthday present), but it turns out (according to online benchmarks) that the integrated GPU of my Skylake (HD 530) is over 2x faster than the NVIDIA card.

            I did some more research on this; it turns out a $400 NVIDIA card from 2009 (GTX 285) is slower than Skylake. I wonder what the GTX 1070 and the 1080 will look like in 2025...



            Because it sounds like you are unaware: in contemporary integrated circuit design (like CPUs), we employ clock gating (it takes power to switch transistors, so don't switch them) and power gating (turning off parts altogether) in an effort to save power. So unless you turn on your integrated GPU and attach a monitor to it, more than likely it will consume 0 power. Dark silicon.

            (And besides, I noticed that at least my Skylake GPU idling at the desktop consumes far, far less than 1 watt of power. Encoding 1080p@60fps video? 3-5 watts. Stress testing with Unigine Heaven? 15 watts. At least, that's what HWiNFO64 reports.)

            Secondly, if you wish to have more cores, buy enthusiast or server hardware (or AMD's new Zen stuff) that doesn't have a GPU designed into it. The real reason your i7 has 4 cores and an integrated GPU is that it's a higher-binned consumer CPU. The only difference between a $400 i7 and a $50 Celeron from the same generation is that the $50 part has fewer working units (but is still workable under certain QC requirements) than the $400 part (if you looked at both chips under a microscope, you'd see an identical die). Certain parts, like 2 of the cores, don't work because of manufacturing defects. There will always be manufacturing defects (and no matter how hard you try, you can't remove them).

            The only thing that would make sense, short of creating a new chip design altogether, is to add a new binning tier/tier tree that does not include the GPU in the QC requirements. There are CPUs that would be in the top 0.1% of overclockers but get literally thrown away by Intel because the GPU doesn't work (or perhaps kept by Intel to hand out to their employees, or stolen by the Chinese factory workers).
            You know, when Skylake first launched it couldn't even run Aero on Windows, but most OEMs shipped it with Aero enabled, and it resulted in a horrible end-user experience. And there were millions of them sold like that. I still, to this day, get Skylake systems in for repair where the only real option is a GPU upgrade. It's the only thing that can be done. And if it's a laptop, then the only option is to buy a new one.

            It was a real scummy situation where it couldn't even run the desktop comfortably. And it wasn't even the first time. The same -exact- thing happened when XP launched and GMA couldn't run Luna comfortably.



            • #36
              Originally posted by Opossum View Post
              I was looking into installing another (but relatively weaker) graphics card to do a few things like video encoding and display offloading (it turns out having a display connected doing nothing but rendering the desktop chops 1-5% off 3D performance). I was thinking of installing my old $200 GeForce 9800 GT (it was suuper fast when I first got it as a birthday present), but it turns out (according to online benchmarks) that the integrated GPU of my Skylake (HD 530) is over 2x faster than the NVIDIA card.

              I did some more research on this; it turns out a $400 NVIDIA card from 2009 (GTX 285) is slower than Skylake. I wonder what the GTX 1070 and the 1080 will look like in 2025...
              You should adjust those prices, because whoever thinks a 9800 GT is still worth anywhere near $200, or that a GTX 285 is still worth anywhere near that either, is a moron.

              Yeah, the Intel iGPU is so cool that it is better than cards I can find on eBay for like $50 tops, while a modern low-end NVIDIA card that costs less than $80 pwns it.

              Because it sounds like you are unaware: in contemporary integrated circuit design (like CPUs), we employ clock gating (it takes power to switch transistors, so don't switch them) and power gating (turning off parts altogether) in an effort to save power. So unless you turn on your integrated GPU and attach a monitor to it, more than likely it will consume 0 power. Dark silicon.
              I bunched together 2 things:
              -Desktop users, where the iGPU is in fact shut down but still wastes die space that would have been better used for something else, even just a bigger L3/L4 cache
              -Laptop users, where the iGPU has the screens attached so it is always on, and while still wasting die space it also wastes thermal budget on an already very thermally constrained part.

              (And besides, I noticed that at least my Skylake GPU idling at the desktop consumes far, far less than 1 watt of power. Encoding 1080p@60fps video? 3-5 watts. Stress testing with Unigine Heaven? 15 watts. At least, that's what HWiNFO64 reports.)
              On a part that most likely has around 15W total TDP (and a cooling system sized accordingly), yeah that matters.
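
              If you want to sanity-check that sub-1-watt idle figure on Linux instead of trusting HWiNFO64, here is a minimal sketch (mine, not from anyone in this thread) that samples the RAPL energy counters the intel_rapl driver exposes under /sys/class/powercap. Which domains show up (for example an "uncore" domain that covers the graphics slice on many client chips) and whether energy_uj is readable without root are assumptions that vary by platform and kernel.

              ```python
              # Minimal sketch: estimate average power per RAPL domain on Linux.
              # Assumptions: the intel_rapl driver is loaded, /sys/class/powercap is
              # populated, and energy_uj is readable (recent kernels may need root).
              import glob
              import os
              import time

              SAMPLE_SECONDS = 5.0

              def read_energy_uj(domain_path):
                  """Return the cumulative energy counter (microjoules) for a domain."""
                  with open(os.path.join(domain_path, "energy_uj")) as f:
                      return int(f.read().strip())

              def main():
                  domains = sorted(glob.glob("/sys/class/powercap/intel-rapl*"))
                  if not domains:
                      print("No RAPL domains found (is the intel_rapl driver loaded?)")
                      return

                  before = {}
                  for d in domains:
                      try:
                          before[d] = read_energy_uj(d)
                      except OSError:
                          pass  # control-type dir or unreadable domain; skip it

                  time.sleep(SAMPLE_SECONDS)

                  for d, start in before.items():
                      end = read_energy_uj(d)
                      with open(os.path.join(d, "name")) as f:
                          name = f.read().strip()
                      # The counter wraps eventually; this sketch ignores that edge case.
                      watts = (end - start) / 1e6 / SAMPLE_SECONDS
                      print(f"{name:10s} ({os.path.basename(d)}): {watts:.2f} W")

              if __name__ == "__main__":
                  main()
              ```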

              Secondly, if you wish to have more cores, buy enthusiast or server hardware (or AMD's new Zen stuff) that doesn't have a GPU designed into it.
              I want applications to be able to use more than 4 cores, which is a different thing. If most CPUs in the world are still dual-cores (like low-end Intel stuff), and even most high-end stuff is a quad-core, then there is little incentive for developers to actually use more than 4 cores outside of specialized software.
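
              To make that concrete: more cores only help if the software actually sizes its parallelism to the machine instead of hardcoding a small worker count. A minimal sketch, using only the Python standard library and a made-up CPU-bound workload standing in for real work:

              ```python
              # Minimal sketch: size a CPU-bound worker pool to the machine instead of
              # assuming 4 cores. The workload (summing squares) is just a placeholder.
              import os
              from concurrent.futures import ProcessPoolExecutor

              def busy_work(n):
                  """A CPU-bound placeholder task."""
                  return sum(i * i for i in range(n))

              def main():
                  cores = os.cpu_count() or 1          # scale to whatever the machine has
                  chunks = [2_000_000] * (cores * 4)   # more chunks than cores keeps them all busy
                  with ProcessPoolExecutor(max_workers=cores) as pool:
                      total = sum(pool.map(busy_work, chunks))
                  print(f"Used {cores} workers, result: {total}")

              if __name__ == "__main__":
                  main()
              ```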

              The real reason your i7 has 4 cores and an integrated GPU is that it's a higher-binned consumer CPU. The only difference between a $400 i7 and a $50 Celeron from the same generation is that the $50 part has fewer working units (but is still workable under certain QC requirements) than the $400 part (if you looked at both chips under a microscope, you'd see an identical die). Certain parts, like 2 of the cores, don't work because of manufacturing defects. There will always be manufacturing defects (and no matter how hard you try, you can't remove them).
              This isn't 100% true (lower-end parts are lower-end designs: they are binned too, but they could never have become more than a Pentium, and Atoms are their own line; it's the i3s, i5s and i7s that share the same silicon/design with Xeons/low-power Xeons).

              And manufacturing defects can be dealt with by making moar cores to begin with (so the lowest end still has 4 working ones).
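
              A toy yield calculation shows why binning (and adding spare cores) makes economic sense; the per-core defect probability below is a made-up illustrative number, not real process data:

              ```python
              # Toy yield model for the binning argument. Assumes each core independently
              # has a fixed chance of containing a killer defect (illustrative only).
              from math import comb

              P_CORE_DEFECT = 0.10   # assumed probability that a given core is unusable
              CORES_ON_DIE = 4

              def prob_exactly_good(good, total, p_defect):
                  """Probability that exactly `good` of `total` cores are defect-free."""
                  p_good = 1.0 - p_defect
                  return comb(total, good) * (p_good ** good) * (p_defect ** (total - good))

              if __name__ == "__main__":
                  for good in range(CORES_ON_DIE, -1, -1):
                      p = prob_exactly_good(good, CORES_ON_DIE, P_CORE_DEFECT)
                      print(f"{good} working cores: {p:6.1%} of dies")
                  sellable = sum(prob_exactly_good(g, CORES_ON_DIE, P_CORE_DEFECT)
                                 for g in range(2, CORES_ON_DIE + 1))
                  print(f"Sellable as dual-core or better: {sellable:6.1%}")
              ```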

              The only thing that would make sense, short of creating a new chip design altogether, is to add a new binning tier/tier tree that does not include the GPU in the QC requirements. There are CPUs that would be in the top 0.1% of overclockers but get literally thrown away by Intel because the GPU doesn't work (or perhaps kept by Intel to hand out to their employees, or stolen by the Chinese factory workers).
              I couldn't care less about overclocking, and a 4-core part is still a 4-core part even without an iGPU.



              • #37
                Originally posted by duby229 View Post
                You know, when Skylake first launched it couldn't even run Aero on Windows, but most OEMs shipped it with Aero enabled, and it resulted in a horrible end-user experience. And there were millions of them sold like that. I still, to this day, get Skylake systems in for repair where the only real option is a GPU upgrade. It's the only thing that can be done. And if it's a laptop, then the only option is to buy a new one.

                It was a real scummy situation where it couldn't even run the desktop comfortably. And it wasn't even the first time. The same -exact- thing happened when XP launched and GMA couldn't run Luna comfortably.
                Yeah, thanks for reporting, I'll file this near your claims of "I see tearing on Windows with Intel".



                • #38
                  Originally posted by starshipeleven View Post
                  Yeah, thanks for reporting, I'll file this near your claims of "I see tearing on Windows with Intel".
                  The proof is in the looking, but you won't do that.... So.....



                  • #39
                    Originally posted by duby229 View Post
                    That is -NOT- a consumer's responsibility.
                    Yes it is; if people can't fucking read or ask experts questions, it's their own problem. Really, this applies to every product.

                    A consumer goes to Best Buy or Walmart and buys whatever is on sale.
                    And they get stuff worth what they paid for it. People who don't do research before a substantial purchase are in for bad surprises, in any field.

                    It is Intel's job to make sure that what they sell is capable of fulfilling expectations.
                    And they do. An office PC sold as such will fulfill the expectations of driving screens and handling productivity work.

                    If someone has unrealistic expectations of it (like, say, playing AAA games) then it's their own problem.



                    • #40
                      Originally posted by duby229 View Post
                      The proof is in the looking, but you won't do that.... So.....
                      The proof is that no benchmark of Skylake that you can find on Google detected any such issue.

