It Looks Like Intel Could Begin Pushing Graphics Tech More Seriously


  • #71
    Originally posted by duby229 View Post

    You know -just- as well as I do that isn't how it is. Most people aren't gamers, but every single last one of them expects that when they buy a game it's going to work for them. And for 75% of people right now, that just isn't the case. Their expectations are not met.
    A lot of people don't call themselves gamers, no, but they also don't go out and buy new AAA games and try to run them on their "media PCs". Most of the large games that would really be hurt by iGPU performance are available through Steam, whose largely trustworthy hardware survey shows the percentage of players using Intel iGPUs: it's not that big. Riot, makers of LoL, don't publish stats, but I can't imagine they'd differ much from Steam's; and LoL isn't that demanding a game by today's standards anyway, running fine enough on desktop iGPUs.



    • #72
      Obviously, PlayStation is for gaming and Epyc is not for gaming.

      Of course someone could game on an RPi if he wants to.



      • #73
        Originally posted by wizard69 View Post
        The problem is you blame Intel for software issues. Microsoft outmodes old hardware with new Windows releases on purpose to drive sales of Windows licenses. It really doesn't matter what Intel does, because MS needs the license sales. The performance of Windows will always suck on baseline hardware.

        As for gaming, I think you need to open your eyes a bit: the overwhelming majority of PCs out there never see 3D games installed. You seem to ignore the corporate market and other use cases where 2D is the only thing of importance. You can say gaming is important all you want, but reality is a different story.

        By the way, I totally believe that people looking for a gaming machine screw up and buy a machine with an integrated GPU expecting it to perform well. That isn't Intel's fault though.

        In any event I still think you miss the tech side of the equation. Intel has no choice but to drive GPU performance, and they will do that by using the available die space a process shrink offers. They are aware of what is needed in the future (it isn't all 3D gaming) and will pursue the required performance to compete with AMD and Apple. Yes, Apple, because the iPad has morphed into a very capable gaming machine, and the leaks about Apple's machine learning chip probably have Intel very concerned.

        In any event, trying to apply the needs of the rarefied world of 3D gaming to the general PC-using population is just foolish.
        The advantages of the latest shrink have held for every generation at the time it was released; it's no different today than it ever was. And through all of it, Intel has never once set the minimum performance bar for a given generation high enough. You can claim all you want that people don't game, but in fact you are wrong about that. It may not matter as much as it would to a gamer, but it is reasonable for folks to expect a game they bought to work.

        It is not foolish to expect Intel's consumer products to perform well enough to play the games available during that product's generation.



        • #74
          Originally posted by duby229 View Post
          But Windows can't tear, huh? Yes it can, and in fact it does, badly.
          If the issue only shows up in specific programs, say Firefox, then the issue is with those programs, not "Windows".

          Chrome does not tear, Edge does not tear, MS Office does not tear, and Firefox doesn't usually tear on Windows either if you enable smooth scrolling or change its hardware acceleration options.
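
          (For reference, those Firefox options live in about:config, or a user.js file. A minimal sketch of the two knobs being referred to, using Firefox's standard pref names; note that forcing acceleration on blocklisted hardware can itself cause glitches:)

          ```js
          // user.js / about:config — the two tweaks mentioned above
          user_pref("general.smoothScroll", true);              // animate scrolling instead of line-jumping
          user_pref("layers.acceleration.force-enabled", true); // force GPU compositing even if the driver is blocklisted
          ```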



          • #75
            Originally posted by duby229 View Post
            Those aren't the only things that people do. And who do you think you are to tell people that, because they chose to buy an Intel system (almost certainly at a sales rep's recommendation), they shouldn't be gaming? Really?
            I think that if people make uninformed choices it's their own problem.

            Really, if someone buys a normal boring car and expects it to be good for racing, that's his own problem. You're effectively saying car manufacturers should only sell racing cars because people can't make informed choices.



            • #76
              Originally posted by caligula View Post
              It doesn't need to. It already beats legions of highest-end legacy cards, so people who aren't that much into gaming but want to play their old games can just upgrade the system and not lose anything.
              "Legions of highest-end cards from 1990", maybe. Its 3D performance sucks even compared to a 9800 GT, which is as old as I can realistically go (driver support). Case in point: I could play Warframe with that card, while on the HD 4000 of my Xeon it could not run decently.

              The iGPU is practically free, too, included on the chip.
              It's not "free"; it's a tradeoff where you give up features like cores or cache to fit that iGPU.

              I used Ivy Bridge and Haswell graphics for years.
              I still use, and have no intention of dumping, my Ivy Bridge Xeon with an iGPU, so I know what it can and cannot do.

              Currently many gamers think 7700k is a better gaming CPU than Ryzen 1800X or Threadrippers, even when comparing overclocked AMD vs 7700k. They wouldn't switch even if they got Ryzens for free.
              Marketing is irrelevant. Dumb people still buy new Intel CPUs (and motherboards, and sometimes RAM) even when their "old" Intel rig is maybe 5% less powerful than the new one, just because they're still in the "must upgrade everything every 3 years" mindset.

              Smarter ones ask around on forums or look at benchmarks.



              • #77
                Originally posted by caligula View Post
                Cherrytrail is already an obsolete legacy platform.
                Same story with every Intel generation, right? In the beginning something doesn't work well because it's brand new, so some issues are expected; then a new generation is released and those issues never get fixed, because fixing them won't increase any sales except for low-end Chinese tablets.

                Even a Dell 9250 with Skylake freezes from time to time when switching Gnome Terminal to fullscreen. Which generation is expected to work well? Sandy Bridge?



                • #78
                  Originally posted by caligula View Post
                  Of course the dedicated GPUs HAVE TO run faster, otherwise nobody would buy them. The lowest-end GPUs actually have the same kind of memory with similar performance, for instance 64-bit DDR3. Intel iGPUs have really killed the potential sales of several generations of low-power GPUs from Nvidia. There's the GeForce 210, 610-620, 710-730, but after that only recently the 1030. It's clear why there are so many missing models; all sorts of budget Nvidia GPUs have been available since 2001.
                  The lowest-end cards were for "just showing stuff on screen" plus hardware acceleration of media, which is more or less what the iGPU does; they were not for gaming.

                  It's an especially nasty situation for HTPC builders, since the missing models were often passively cooled.
                  I didn't see HTPC people cry much about that. HTPC users don't need newer 3D APIs, and once iGPUs could drive 4K properly (years ago; APUs got there first, by the way) the last reason to put a dedicated low-end card in an HTPC dropped away.

                  The ones crying were people who wanted a kinda-game-console thing, or just clueless ones.



                  • #79
                    Originally posted by Michael_S View Post
                    While I agree with you on the principles that apply here, I don't think it's fair to condemn non-technical people for screwing it up.

                    Shopping for toasters, televisions, smart phones, books, shirts, plane fares, apartment rentals, and even pets is substantially less complicated than researching an appropriate purchase for a gaming PC.
                    I condemn anyone who does not do his own research and buys whatever is on sale.

                    It is the best way to get fucked sideways and end up with whatever random crap the seller was trying to get rid of (which should already say something about how desirable it is). It is a sign of stupidity, and I despise stupidity.
                    Humans are called "homo sapiens sapiens", which is "man who knows that he knows"; it's the meta-thinking and abstraction ability that make a human, not the ability to talk and understand spoken language.

                    The complexity of the field is not relevant; if you need something, you should be able to make a decent choice. If you know you don't know enough to tell good from bad, you look for experts and reviews, in real life or on the Internet, not assume whatever is on sale is OK.

                    And I'll also respectfully disagree on the level of complexity of smartphones, TVs, apartment rentals (and housing in general), and pets.

                    But to your point, Intel can't put a label on a particular machine - especially since they're not selling the finished desktop or laptop, Dell/HP/Toshiba/whatever is - indicating its appropriate use and giving a definitive list of games and display resolutions that are supported. And some of the machines can be easily upgraded, and some can't.
                    And they usually do. I've yet to see devices with only iGPUs being marketed as "gaming", outside of dodgy eBay auctions anyway.



                    • #80
                      Originally posted by duby229 View Post
                      Maybe not label them, but they can design adequate GPUs; unfortunately they haven't. Also, we are gamers, so we think in terms of gaming computers, but most people won't ever think that way.
                      Waah waah all cars aren't race cars, I want all car manufacturers to make only race cars because all people need race cars and can't tell the difference, waah waah.

                      After all, you don't buy gaming smartphones either. You expect that what you pay for is going to work. It's true for Androids, -not- true at all for PCs, and that is entirely Intel's fault.
                      HAHAHAHAHAHA, HAIL CTHULHU!

                      Really, you're so funny. People look for gaming smartphones too, and if you go on any kind of Android forum you'll see plenty of people complaining that their crap phone can't run heavy Android games.

                      Or on XDA, where people tweak the hell out of their teapots (adding swap, changing governors, hacking the kernel) to be able to game.
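
                      (To make that concrete: on a rooted device those XDA-style tweaks boil down to a handful of shell commands. A rough sketch, assuming root and a typical Android/Linux kernel; the swap file location, sysfs path, and available governors all vary per device:)

                      ```shell
                      # Add a 512 MB swap file so heavy games are less likely to be OOM-killed
                      dd if=/dev/zero of=/data/swapfile bs=1048576 count=512
                      mkswap /data/swapfile
                      swapon /data/swapfile

                      # Pin the CPU frequency governor to "performance" while gaming
                      echo performance > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
                      ```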

