It Looks Like Intel Could Begin Pushing Graphics Tech More Seriously


  • #21
    Originally posted by duby229 View Post
    Multiple people on this thread have called Intel graphics fine.... Really? Apparently you guys never worked in a computer repair shop. Almost every time someone complains that their games don't work well or that their computer is too slow, the solution is a clean-up plus a graphics card replacement, or a new laptop sale. The reason why is usually Intel's incapable graphics. Literally the only product they have that can be called OK-ish is Iris, and only the ones with eDRAM at that. And they are -WAY- overpriced.
    Let's not forget the subpar support we've gotten for GPUs. How many times have I read "Intel GPUs not officially supported" on a game's requirements... Even for BG:EE. Ugh.



    • #22
      Originally posted by duby229 View Post
      Multiple people on this thread have called Intel graphics fine.... Really? Apparently you guys never worked in a computer repair shop. Almost every time someone complains that their games don't work well or that their computer is too slow, the solution is a clean-up plus a graphics card replacement, or a new laptop sale.
      Apparently you don't work in a repair shop either.

      And iGPUs were never meant for gaming anyway; it's not like I can game on an AMD E1 APU either.



      • #23
        Originally posted by caligula View Post
        Their previous models already beat low-end models from competitors. They basically killed the competition. Many office users are quite happy with the iGPU. Now they will make them faster, that is, capable of doing more basic office work and even casual games.
        Ummm..... how did Intel HD graphics beat the iGPU of APUs, again?

        Originally posted by caligula View Post
        How is that bad when most apps still can't use more than 2-4 cores?
        Less crap on die = less heat = better clocks = better performance.
        Also, having consistently more cores in CPUs would send a message to developers.
        Apps don't use more than 4 cores because most CPUs around don't have more than 4 cores anyway.

        Originally posted by caligula View Post
        Even gamers want Intel CPUs as their single threaded single core perf is so great.
        Yeah, gamers want performant CPUs, not CPUs wasting die area and thermal headroom on a bullshit iGPU they never use.



        • #24
          Intel is just ramping up to compete in an area where AMD is king - the APU.

          Instead of doing what they already do well, they're pulling a Canonical here, with grandiose ideas of becoming AAA in a new sector instead of sticking to what they already excel at.

          It'd be like Bill Gates deciding he hasn't yet become #1 in construction, so he's going to become a contractor and leave behind more important uses of his time. Truly absurd.



          • #25
            Originally posted by starshipeleven View Post
            Apparently you don't work in a repair shop either.

            And iGPUs were never meant to game anyway, it's not like I can game on an AMD E1 APU either.
            Like it or not, that's what people do. That's why I don't sell those products. I've been around long enough; I understand what people need.

            EDIT: And btw, you just compared one of AMD's cheapest products to almost all of Intel's products. That says a whole lot, whether you recognize it or not.
            Last edited by duby229; 15 July 2017, 03:11 PM.



            • #26
              Originally posted by starshipeleven View Post
              Ummm..... how did Intel HD graphics beat the iGPU of APUs, again?
              Lots of low-end models are slower than the iGPU: GeForce 256, GeForce 2, GeForce 3, GeForce 4, GeForce FX, GeForce x100, GeForce x10, and so on...



              • #27
                I think they might finally be realizing that they need to be in the GPU market in order to build out their PC platforms. If they let you have a lot of PCIe slots today, you buy fewer Intel chips. This is where they are going to lose to AMD in the server market this year.

                If Intel can scale their GPU business up into something credible for high performance compute workloads, and do this before they run out of money, they can still possibly come out on top.



                • #28
                  Originally posted by duby229 View Post

                  That's fine if you want to pay waay too much money for it and essentially have 0 requirement for anything 3 dimensional. (EDIT: At least at those resolutions.)
                  I was looking into installing another (but relatively weaker) graphics card to do a few things like video encoding and display offloading (it turns out that having a display connected, doing nothing but rendering the desktop, chops 1-5% off 3D performance). I was thinking of installing my old $200 GeForce 9800 GT (super fast when I first got it as a birthday present), but it turns out (according to online benchmarks) that the integrated GPU of my Skylake (HD 530) is over 2x faster than the NVIDIA card.

                  I did some more research on this; it turns out a $400 NVIDIA card from 2009 (GTX 285) is also slower than Skylake's iGPU. I wonder what the GTX 1070 and the 1080 will look like in 2025...
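
                  As an aside for the Linux folks: offloading an encode to the iGPU is mostly a matter of pointing ffmpeg at the VA-API device. Below is a minimal sketch driving that from Python; the render node path and an ffmpeg build with VA-API support are assumptions about the local setup, not something from this thread.

                  import subprocess

                  def encode_on_igpu(src, dst, device="/dev/dri/renderD128"):
                      """Offload an H.264 encode to the iGPU via VA-API (sketch)."""
                      cmd = [
                          "ffmpeg",
                          "-hwaccel", "vaapi",                # decode on the GPU where possible
                          "-hwaccel_device", device,          # DRM render node of the iGPU
                          "-hwaccel_output_format", "vaapi",  # keep frames in GPU memory
                          "-i", src,
                          "-c:v", "h264_vaapi",               # VA-API H.264 encoder
                          dst,
                      ]
                      subprocess.run(cmd, check=True)

                  encode_on_igpu("input.mp4", "output.mp4")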

                  Originally posted by starshipeleven View Post
                  Ummm..... how did Intel HD graphics beat the iGPU of APUs, again?

                  Less crap on die = less heat = better clocks = better performance.
                  Also, having consistently more cores in CPUs would send a message to developers.
                  Apps don't use more than 4 cores because most CPUs around don't have more than 4 cores anyway.

                  Yeah, gamers want performant CPUs, not CPUs wasting die area and thermal headroom on a bullshit iGPU they never use.
                  Because it sounds like you are unaware: in contemporary integrated circuit design (CPUs included), we employ clock gating (it takes power to switch transistors, so don't switch them when idle) and power gating (turn off whole blocks altogether) to save power. So unless you turn on your integrated GPU and attach a monitor to it, it will more than likely consume zero power. Dark silicon.

                  (And besides, I've noticed that at least my Skylake GPU idling at the desktop consumes far, far less than 1 watt. Encoding 1080p@60fps video? 3-5 watts. Stress testing with Unigine Heaven? 15 watts. At least, that's what HWiNFO64 reports.)
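
                  (If you're on Linux and want to sanity-check numbers like that without HWiNFO64, the kernel exposes Intel's RAPL energy counters through sysfs. Here's a minimal sketch, assuming the graphics domain shows up as the "uncore" subdomain - true on many client chips, though the layout varies by generation, and newer kernels restrict energy_uj to root.)

                  import time
                  from pathlib import Path

                  RAPL = Path("/sys/class/powercap")

                  def find_gpu_domain():
                      """Find the RAPL subdomain named 'uncore' (the iGPU on client chips)."""
                      for name_file in RAPL.glob("intel-rapl:*/intel-rapl:*:*/name"):
                          if name_file.read_text().strip() == "uncore":
                              return name_file.parent
                      raise RuntimeError("no 'uncore' RAPL domain found")

                  def gpu_watts(interval=1.0):
                      """Sample the energy counter twice; microjoules over seconds gives watts."""
                      dom = find_gpu_domain()
                      e0 = int((dom / "energy_uj").read_text())
                      time.sleep(interval)
                      e1 = int((dom / "energy_uj").read_text())
                      # Ignores counter wraparound; good enough for a quick check.
                      return (e1 - e0) / 1e6 / interval

                  print(f"iGPU power: {gpu_watts():.2f} W")  # an idle desktop should read well under 1 W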

                  Secondly, if you want more cores, buy enthusiast or server hardware (or AMD's new Zen stuff) that doesn't have a GPU designed into it. The real reason your i7 has 4 cores and an integrated GPU is that it's a higher-binned consumer CPU: the only difference between a $400 i7 and a $50 Celeron from the same generation is that the $50 chip has fewer working parts (while still being usable under certain QC requirements) than the $400 one. If you looked at both under a microscope, you'd see an identical die; certain parts, like 2 of the cores, just don't work because of manufacturing defects. There will always be manufacturing defects, and no matter how hard you try, you can't eliminate them.

                  The only thing that would make sense, short of creating a new chip design altogether, is to add a new binning tier (or tier tree) that doesn't include the GPU in the QC requirements. There are CPUs that would be in the top 0.1% of overclockers but get literally thrown away by Intel because the GPU doesn't work (or perhaps kept by Intel to hand out to employees, or stolen by the Chinese factory workers).
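
                  To put a toy number on the binning argument above: the classic Poisson yield model says the share of defect-free dies drops exponentially with die area, which is exactly why salvaging partly broken dies as cheaper SKUs makes economic sense. A sketch with made-up defect density and die area (not Intel's real numbers):

                  import math

                  defect_density = 0.25   # defects per cm^2 (assumed, illustrative)
                  die_area = 1.5          # cm^2 (assumed, illustrative)

                  lam = defect_density * die_area  # expected defects per die (Poisson mean)

                  def poisson(k):
                      """Probability of exactly k defects on one die."""
                      return math.exp(-lam) * lam**k / math.factorial(k)

                  print(f"defect-free dies (top bin, e.g. i7):  {poisson(0):.1%}")
                  print(f"one defect (salvage as a lower SKU):  {poisson(1):.1%}")
                  print(f"two+ defects (bin down or scrap):     {1 - poisson(0) - poisson(1):.1%}")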



                  • #29
                    Originally posted by duby229 View Post
                    Like it or not that's what people do.
                    Intel iGPUs (any iGPUs, really) were never marketed for gaming; if people assume bullshit, it's their own fault.

                    EDIT: And btw, you just compared one of AMD's cheapest products to almost all of Intel's products. That says a whole lot, whether you recognize it or not.
                    Yeah, it says that I know the iGPU in the E1 pwns anything less than a modern i5 iGPU (it should be around on par with, if not better than, Intel HD 4000), and yet it still isn't marketed for gaming.

                    Laptops with AMD APUs, like say the A10-9600, pwn any iGPU from Intel by a long shot. Except maybe Iris Pro, which is far more expensive (and not cost-effective versus dedicated laptop graphics) anyway.

                    On such laptops you can actually play some games. For example I can play XCOM 1 on mine.

                    Again still not marketed as "gaming".
                    Last edited by starshipeleven; 15 July 2017, 05:28 PM.



                    • #30
                      Originally posted by caligula View Post
                      Lots of low-end models are slower than the iGPU: GeForce 256, GeForce 2, GeForce 3, GeForce 4, GeForce FX, GeForce x100, GeForce x10, and so on...
                      Well, that's not exactly "competition"; it's more like "obsolete crap off eBay".
                      Even the lowest-end new dedicated NVIDIA GPU runs twice as well as the iGPU; an iGPU can't compete with dedicated cards that have their own VRAM and their own thermal envelope. But at least AMD desktop APUs are better than new low-end cards.

                      The competition for iGPUs is APUs from AMD, and Intel won because it has higher CPU performance, more marketing, and more mafia.
                      Last edited by starshipeleven; 15 July 2017, 05:27 PM.

