Trying Intel Kabylake Graphics With Dawn of War III Vulkan


  • #11
    Originally posted by uid313 View Post
    The GPU on Kaby Lake is pretty much the same as Skylake's, which was pretty much the same as Broadwell's, which was pretty much the same as Haswell's.

    Hopefully AMD will soon have out their next generation of Ryzen CPUs with built-in GPU and some serious graphics power.
    Skylake had a distinctly better IGP than Haswell; I'm not too sure about Broadwell, since so few of those chips actually came with an IGP. Aside from DDR4 support, the IGP was pretty much the most compelling reason to get a Skylake system (a laptop in particular). At the time of its release, the DDR4 support was as underwhelming as Ryzen's was.

    I've heard rumors that AMD intends to have a Vega-based GPU for their first AM4 APUs. This is a bit surprising, since AMD has had a tendency to use relatively outdated GPU hardware for their APUs. I think I remember hearing somewhere it'll have roughly RX 560 performance, which is seriously impressive. The 560 makes for a decent budget GPU; if you can Crossfire it, you'd get a pretty solid 1080p gaming rig for a low price.



    • #12
      Originally posted by Wielkie G View Post

      Min FPS is useless if you don't know whether it happened once or every n seconds.

      A 95th or 99th percentile would be better.
      I am very sorry, but I lecture in statistics and this use of the 95th percentile is wrong. The 95th percentile tells you the lowest value that is higher than 95% of the values in the list, which is quite the opposite of what you want: you want to see the lowest value that is faster than 5% of the values in the list, hence the 5th percentile.
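
To make the direction of the tail concrete, here is a quick sketch with made-up per-second FPS samples (the numbers and the nearest-rank method are just for illustration):

```python
import math

# Hypothetical per-second FPS samples; the 22 is a single dip.
fps = sorted([60, 58, 61, 59, 57, 22, 60, 62, 59, 60])
n = len(fps)

# Nearest-rank 95th percentile: higher than ~95% of the samples,
# i.e. it describes the *fastest* moments of the run.
print(fps[math.ceil(0.95 * n) - 1])  # 62

# Nearest-rank 5th percentile: the slow tail you actually care
# about for smoothness; it captures the dip.
print(fps[math.ceil(0.05 * n) - 1])  # 22
```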



      • #13
        Originally posted by nocri View Post

        I am very sorry, but I lecture in statistics and this use of the 95th percentile is wrong. The 95th percentile tells you the lowest value that is higher than 95% of the values in the list, which is quite the opposite of what you want: you want to see the lowest value that is faster than 5% of the values in the list, hence the 5th percentile.
        For FPS, yes. But if you're talking frame time (e.g. 16 ms to render a frame for ~60 fps), then 95th/99th percentile frame time is a commonly used way of measuring stuttering in gaming performance. FPS is good for min/max/average, but when you want to quantify the parts of the gameplay experience that lead to stuttering and choppiness, frame time is the accepted measure, and a good part of the GPU/gaming review industry has settled on 99th percentile frame times as a way to find which cards have issues with dips in framerate caused by things like streaming loads, draw-time compilation, etc.

        Wielkie G failed to mention that he was addressing 99th percentile frame time rather than FPS, but it's common enough around here that most of us probably knew what he was talking about.
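
A minimal sketch of that 99th-percentile frame-time metric, assuming a simple nearest-rank percentile; the helper function and the frame times below are made up, not anyone's actual benchmark pipeline:

```python
import math

def percentile(values, p):
    """Nearest-rank percentile: the value below which ~p% of samples fall."""
    ordered = sorted(values)
    k = min(len(ordered) - 1, math.ceil(p / 100.0 * len(ordered)) - 1)
    return ordered[k]

# Hypothetical 1000-frame capture: mostly ~60 fps, with a dozen long
# frames standing in for streaming hitches and shader-compile stutter.
frame_times_ms = [16.7] * 500 + [16.9] * 488 + [33.4] * 10 + [80.0, 120.0]

p99 = percentile(frame_times_ms, 99)  # high frame time = bad
avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"99th percentile frame time: {p99} ms")  # 33.4
print(f"average FPS: {avg_fps:.1f}")            # ~58.4, which hides the stutter
```

The point of the example: the average FPS looks healthy, while the 99th-percentile frame time exposes the stutter frames that the average smooths over.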



        • #14
          Originally posted by edoantonioco View Post
          but on mac it is supported
          Apple hardware also has better GPUs, not just the default Intel iGPUs; they even have Iris Plus, which is on par with AMD APUs.



          • #15
            Originally posted by schmidtbag View Post
            The 560 makes for a decent budget GPU; if you can Crossfire it, you'd get a pretty solid 1080p gaming rig for a low price.
            If they pull that off, it's gonna sell like hot porn.



            • #16
              Well, the performance looks about right for Intel iGPUs; good to see that it is actually working at all.



              • #17
                Blah, people can play this at 540p on iGPUs. With a bit of AA it would look about as crisp as 720p without AA. OK, it is not as crisp as 8K, and nowhere near 32K, and so on... I'm not sure how fast it would run, but I guess 20-30 fps should be possible.

                People played these RTSes at 15 fps and at much lower resolutions 20-ish years ago, and no one complained.
                Last edited by dungeon; 08 June 2017, 07:18 PM.



                • #18
                  Originally posted by starshipeleven View Post
                  Apple hardware also has better GPUs, not just the default Intel iGPUs; they even have Iris Plus, which is on par with AMD APUs.
                  Yeah, but at a 300%+ price tag. I've just bought a discounted Acer Aspire S13 for 500.-
                  Last edited by mike44; 09 June 2017, 12:53 AM.



                  • #19
                    Originally posted by mike44 View Post
                    Yeah, but at a 300%+ price tag.
                    Kinda well known. I was pointing out that this game is supported on macOS because Apple also sells devices with something (much) better than Intel's default iGPUs, not that Apple is the brand to buy.

                    I've just bought a discounted Acer Aspire S13 for 500.-
                    It has standard Intel bullshit graphics (and the usual fake i5 processor: the models ending in U are technically mobile i3s with higher clocks, dual-core with Hyper-Threading; and it's also an Acer).
                    Sure, the screen is IPS, so it's going to be awesome, and the SSD will also be very welcome on a laptop. USB 3.0 ports are plentiful.
                    I personally think a 13.3-inch screen is too small, but I assume you chose what was best for your needs.



                    • #20
                      Yeah, it would be really nice if we had the useful high-percentile frame latency numbers. The 99th percentile is not high enough, because even if your 99th-percentile latency is good, that could still mean a dropped frame once every second and a half or so (1 frame in 100 at 60 fps), which is not particularly good.

                      I think ultimately the most important stat is the peak frame time during a run, though there are some exceptions: long frames on loading screens, or the first frame involving a bunch of shader compiles. So maybe it would be good to trim the highest two or three frame latencies in a test run, then show the max of the rest.
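
That trimming idea could be sketched like this; the function name and the sample numbers are hypothetical, chosen only to illustrate the suggestion:

```python
def trimmed_peak_frame_time(frame_times_ms, trim=3):
    """Drop the `trim` largest frame times (loading screens, first-use
    shader compiles), then report the worst remaining frame."""
    if len(frame_times_ms) <= trim:
        raise ValueError("not enough samples to trim")
    return sorted(frame_times_ms)[-(trim + 1)]

# Hypothetical run: steady ~16.7 ms, one genuine mid-run stutter (50 ms),
# plus three huge outliers from a loading screen and shader compilation.
times = [16.7] * 200 + [50.0] + [400.0, 350.0, 300.0]
print(trimmed_peak_frame_time(times))  # 50.0 -- outliers trimmed, stutter kept
```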

