Ubuntu 20.10 / GNOME 3.38 Could See Better Intel Gen9 Graphics Performance

  • #21
    Originally posted by joepadmiraal:

    Not necessarily.
    I'm running a Dell XPS laptop with its display at 4K/60Hz and an external monitor at 4K/60Hz on Fedora/GNOME/Xorg.
    I do as well (a 4K/60Hz TV and a 4K/60Hz monitor), except that I have to configure it manually on X (see the sketch below).
    On Wayland, my old laptop also needs its GPU's official maximum resolution (2560x1600) set manually with a kernel boot parameter (video=HDMI-A-1:2560x1600@30); otherwise it never reaches that maximum out of the box and stops at 2K.

    Either I'm really unlucky or display management is still lacklustre.
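
    For reference, the manual route on X usually means generating a modeline with cvt and registering it with xrandr. Below is a minimal Python sketch of those steps; the output name HDMI-A-1 and the 2560x1600@30 mode come from the post above, but the output name on your machine will likely differ (check xrandr --query first):

        import subprocess

        # Hypothetical sketch: generate a CVT modeline, register it with
        # xrandr, and switch the output to it. Adjust OUTPUT to match
        # what `xrandr --query` reports on your machine.
        OUTPUT = "HDMI-A-1"
        W, H, HZ = 2560, 1600, 30

        # cvt prints a line like: Modeline "2560x1600_30.00" 165.75 ...
        cvt = subprocess.run(["cvt", str(W), str(H), str(HZ)],
                             capture_output=True, text=True, check=True)
        modeline = cvt.stdout.splitlines()[-1].split(maxsplit=1)[1]
        name, timings = modeline.split(maxsplit=1)
        name = name.strip('"')

        subprocess.run(["xrandr", "--newmode", name, *timings.split()], check=True)
        subprocess.run(["xrandr", "--addmode", OUTPUT, name], check=True)
        subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name], check=True)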

    • #22
      Let's do the math:
      A single 4K monitor has 4 times the pixels of FHD, so: 1920 * 1080 * 4 = 8,294,400 pixels
      To get 60fps: 8,294,400 * 60 = 497,664,000
      That is approximately 0.5 Gpixels/second
      The UHD 630 should theoretically be capable of 3.3 Gpixels/second according to this: https://www.techpowerup.com/gpu-spec...hics-630.c3107
      Of course the specs are theoretical, and effects like transparency can mean pixels get copied more than once. Add to that the fact that you don't want to run your CPU at max frequency and drain your battery just for some fancy UI animations. Still, we are talking about more than 6 times the theoretically required fill rate, which is plenty of headroom. So the UHD 630 should definitely be capable of driving a single monitor at 4K/60fps without turning your laptop into a vacuum cleaner.
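
      As a quick sanity check, here is the same arithmetic as a short Python sketch; the 3.3 Gpixels/s figure is the theoretical number from the TechPowerUp page linked above, not a measured one:

          # Sanity check of the fill-rate arithmetic above.
          FHD_PIXELS = 1920 * 1080              # 2,073,600 pixels
          UHD_PIXELS = FHD_PIXELS * 4           # 4K UHD is exactly 4x FHD
          REQUIRED = UHD_PIXELS * 60            # one 4K monitor at 60 fps

          UHD630_FILL_RATE = 3.3e9              # theoretical (TechPowerUp)

          print(f"Required:  {REQUIRED / 1e9:.2f} Gpixels/s")         # 0.50
          print(f"Available: {UHD630_FILL_RATE / 1e9:.2f} Gpixels/s") # 3.30
          print(f"Headroom:  {UHD630_FILL_RATE / REQUIRED:.1f}x")     # 6.6x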

      • #23
        Originally posted by sarmad:
        Let's do the math:
        A single 4K monitor has 4 times the pixels of FHD, so: 1920 * 1080 * 4 = 8,294,400 pixels
        To get 60fps: 8,294,400 * 60 = 497,664,000
        That is approximately 0.5 Gpixels/second
        The UHD 630 should theoretically be capable of 3.3 Gpixels/second according to this
        There are several things to consider here:

        First, the specified fill rate is typically a theoretical maximum under ideal conditions. Unless all you do is draw single-color, opaque, full-screen rects, you won't achieve it in practice. In particular, fill-rate specifications don't account for blending, which desktop compositing uses all the time and which has significant overhead. You also need to consider texture sampling, which competes with rendering for memory bandwidth and, in some cases, shared cache.

        Secondly, there's a LOT more going on nowadays than just desktop compositing. Think of your web browser, for instance: it typically makes heavy use of the GPU for rendering. The same is true of many contemporary UI toolkits, like Qt. So in practice, applications and desktop compositing compete for GPU processing power, and you can't devote it all to compositing. On top of that, on mobile SoCs the CPU part of the package also competes with the GPU for memory bandwidth and power budget.

        Third, parallelism/concurrency and scheduling aren't as good as they theoretically could be. Frames need to be scheduled so they finish on time, and if you're close to the edge (e.g. your frame takes ~14 ms to render for a 60 Hz screen with its 16.6 ms frame budget), you can quite easily miss a frame even though everything should ideally still be fine, resulting in stutter.
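
        To put a number on that third point, here is a toy Python simulation; the 14 ms mean comes from the example above, while the 1.5 ms jitter is an assumption purely for illustration:

            import random

            # Toy model: at 60 Hz every frame must finish within ~16.67 ms.
            # A 14 ms average render time with a bit of variance still
            # misses the deadline on a noticeable fraction of frames.
            FRAME_BUDGET_MS = 1000 / 60    # ~16.67 ms
            MEAN_RENDER_MS = 14.0          # from the example above
            JITTER_MS = 1.5                # std. deviation, assumed

            random.seed(42)
            frames = [random.gauss(MEAN_RENDER_MS, JITTER_MS) for _ in range(3600)]
            missed = sum(t > FRAME_BUDGET_MS for t in frames)

            print(f"Missed {missed} of {len(frames)} frames "
                  f"({100 * missed / len(frames):.1f}%) -> visible stutter")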

        • #24
          I hope to hell those optimizations land upstream in the end.

          • #25
            If your Intel graphics are having a hard time with 4K@60Hz, check your RAM setup. If it is single-channel, that's probably the problem.

            At least that's what cripples my T580: it shipped with a single 16 GB RAM module. My older Dells with 4XXX and 7XXX CPUs have no problem driving dual 4K displays, and they both have 16 GB as 2x8 GB.
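
            A rough back-of-the-envelope calculation shows why the channel count matters. The DDR4-2400 speed below is an assumption for illustration (a common T580 configuration), and scanout is only the floor; compositing reads and writes the framebuffer on top of that:

                # Rough, hypothetical estimate: dual-channel roughly doubles
                # the memory bandwidth the iGPU shares with the CPU.
                BYTES_PER_PIXEL = 4                      # 32-bit framebuffer
                WIDTH, HEIGHT, HZ = 3840, 2160, 60

                scanout = WIDTH * HEIGHT * BYTES_PER_PIXEL * HZ  # per 4K display
                channel = 2400e6 * 8  # DDR4-2400: 2400 MT/s * 8 bytes/transfer

                for displays in (1, 2):
                    need = displays * scanout
                    print(f"{displays} display(s): scanout alone needs "
                          f"{need / 1e9:.1f} GB/s; single channel offers "
                          f"{channel / 1e9:.1f} GB/s, dual channel "
                          f"{2 * channel / 1e9:.1f} GB/s")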
