Ubuntu 20.10 / GNOME 3.38 Could See Better Intel Gen9 Graphics Performance

  • Zan Lynx
    replied
    If your Intel graphics are having a hard time with 4K@60Hz, check your RAM setup. If it is single channel, that's probably the problem.

    At least, that's what cripples my T580; it shipped with a single 16 GB RAM module. My older Dells with 4XXX and 7XXX CPUs have no problem driving dual 4K displays, and they both have 16 GB as 2x8GB.
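    If you'd rather not open the laptop to check, "dmidecode --type 17" lists the memory slots. Here's a rough Python sketch around it that counts populated slots (assumes dmidecode is installed and the script runs as root; two sticks usually, though not always, means dual channel):

        import subprocess

        # Parse "Memory Device" entries from dmidecode; empty slots report
        # "Size: No Module Installed". Needs root and the dmidecode tool.
        out = subprocess.run(
            ["dmidecode", "--type", "17"],
            capture_output=True, text=True, check=True
        ).stdout

        populated = [
            line.split(":", 1)[1].strip()
            for line in out.splitlines()
            if line.strip().startswith("Size:") and "No Module" not in line
        ]

        print(f"{len(populated)} populated slot(s): {populated}")
        if len(populated) < 2:
            print("Probably single channel; a second module should help.")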



  • intelfx
    replied
    I hope to hell those optimizations land upstream in the end.



  • brent
    replied
    Originally posted by sarmad View Post
    Let's do the math:
    A single 4K monitor is 4 times a FHD, so: 1920 * 1080 * 4 = 8,294,400 pixels
    To get 60fps: 8,294,400 * 60 = 497,664,000
    That is approximately 0.5 gpixels/second
    UHD 630 should be theoretically capable of 3.3 gpixels/second according to this
    There are several things to consider here:

    First, the fill rate specified is typically a theoretical maximum under ideal conditions. Unless all you do is draw single-color, opaque, full-screen rects, you won't reach it in practice. In particular, fill rate specifications don't account for blending, which desktop compositing uses all the time and which has significant overhead. You also need to consider texture sampling, which competes with rendering for memory bandwidth and, in some cases, shared cache.

    Secondly, there's a LOT more going on than just desktop compositing nowadays. Take your web browser, for instance: it typically makes heavy use of the GPU for rendering. The same is true for many contemporary UI toolkits, like Qt. So in practice, applications and the desktop compositor compete for GPU processing power, and you can't spend it all on compositing. On top of that, on mobile SoCs the CPU part of the package also competes with the GPU for memory bandwidth and power budget.

    Third, parallelism/concurrency and scheduling aren't as good as they theoretically could be. Frames need to be scheduled so they finish on time, and if you're close to the edge (e.g. a frame takes roughly 14 ms to render for a 60 Hz screen with a 16.6 ms frame budget), you can quite easily miss a frame even though ideally everything should still be fine, and that shows up as stutter.
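    To put numbers on that last point (a rough sketch in Python; the 14 ms render time is just the example figure from above):

        # Frame-budget arithmetic for a 60 Hz display.
        refresh_hz = 60
        frame_budget_ms = 1000 / refresh_hz           # ~16.7 ms per frame
        render_time_ms = 14.0                         # example frame cost
        margin_ms = frame_budget_ms - render_time_ms  # ~2.7 ms of slack

        print(f"budget {frame_budget_ms:.1f} ms, margin {margin_ms:.1f} ms")
        # Any hiccup longer than the margin misses vsync, and the previous
        # frame stays on screen for another full refresh (~33 ms instead of
        # ~17 ms) -- that's the stutter you perceive.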



  • sarmad
    replied
    Let's do the math:
    A single 4K monitor is 4 times a FHD, so: 1920 * 1080 * 4 = 8,294,400 pixels
    To get 60fps: 8,294,400 * 60 = 497,664,000
    That is approximately 0.5 gpixels/second
    UHD 630 should be theoretically capable of 3.3 gpixels/second according to this: https://www.techpowerup.com/gpu-spec...hics-630.c3107
    Of course the specs are theoretical, and there is some duplicated pixel work due to things like effects and transparency. Add to that the fact that you don't want to run your CPU at max frequency and drain your battery just for some fancy UI animations. Still, we are talking about more than 6 times the theoretically needed fill rate, which is plenty of headroom. So UHD 630 should definitely be capable of running a single monitor at 4K/60fps without turning your laptop into a vacuum cleaner.
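    That arithmetic as a quick Python check, using the 3.3 Gpixels/s figure from the spec page linked above:

        # Required fill rate for a single 4K monitor at 60 fps vs. the quoted spec.
        pixels_4k = 3840 * 2160      # 8,294,400 pixels (4x 1920x1080)
        required = pixels_4k * 60    # 497,664,000 pixels/s
        theoretical = 3.3e9          # UHD 630 spec-sheet fill rate, pixels/s

        print(f"required: {required / 1e9:.2f} Gpixels/s")
        print(f"headroom: {theoretical / required:.1f}x")  # roughly 6.6x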



  • Mez'
    replied
    Originally posted by joepadmiraal View Post

    Not necessarily.
    I'm running a Dell XPS laptop with its display at 4K/60Hz and an external monitor at 4K/60Hz on Fedora/Gnome/Xorg.
    I do as well (a 4K60 TV and a 4K60 monitor), it's just that I have to configure it manually on X.
    On Wayland, for my old laptop I also have to manually set the GPU's official maximum resolution (2560x1600) with a kernel boot parameter (video=HDMI-A-1:2560x1600@60), otherwise it doesn't reach that maximum out of the box (only 2K).

    Either I'm really unlucky or the display management is still lacklustre.
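    In case it helps anyone else: on a GRUB-based install (Ubuntu's defaults assumed here) that parameter can be made persistent by appending it to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub, for example:

        GRUB_CMDLINE_LINUX_DEFAULT="quiet splash video=HDMI-A-1:2560x1600@60"

    then running "sudo update-grub" and rebooting.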



  • Space Heater
    replied
    Originally posted by CochainComplex View Post
    Youtube 4K@60fps feels more like 30fps, but not really lame. "Normal" 4K clips (at 29fps?) are fine.
    Are you using hardware acceleration when decoding video? If not, then the system's performance doesn't have a lot to do with the gpu.
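    If you're not sure, vainfo (from libva-utils) lists which codecs VA-API exposes on the iGPU, and running intel_gpu_top while the clip plays shows whether the Video engine is actually doing the decode. Keep in mind that Firefox and Chromium on Linux often fall back to software decode unless VA-API is explicitly enabled, so a busy CPU with an idle Video engine is a strong hint.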



  • V1tol
    replied
    Actually I think it is quite possible to run 4K@60Hz for desktop tasks on UHD 630. Judging by my Mac mini 2012 with its HD 4000 Intel GPU, which can run two 1080p displays perfectly smoothly at 60 FPS (in macOS, obviously) - and in pixels that's only half of 4K. And UHD 630 is on average 3-4 times faster than the old Ivy Bridge graphics.
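    As a rough sanity check on the pixel counts (Python, numbers only):

        # Two 1080p panels vs. one 4K panel, in raw pixels per frame.
        two_1080p = 2 * 1920 * 1080   # 4,147,200
        one_4k = 3840 * 2160          # 8,294,400
        print(f"4K is {one_4k / two_1080p:.1f}x the pixels of two 1080p displays")
        # So a GPU that's ~3-4x faster than HD 4000 has headroom for ~2x the
        # pixels, at least on paper.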



  • joepadmiraal
    replied
    Originally posted by Mez' View Post
    Another flaw of Gnome then.
    Not necessarily.
    I'm running a Dell XPS laptop with its display at 4K/60Hz and an external monitor at 4K/60Hz on Fedora/Gnome/Xorg.



  • peterdk
    replied
    Originally posted by Mez' View Post
    Another flaw of Gnome then.
    I am running 4K 60Hz on NVIDIA on Ubuntu 20.04 Gnome, out of the box, without a single issue.



  • Mez'
    replied
    Originally posted by kcrudup View Post

    This isn't (necessarily?) true - I have three monitors: a 4K@60Hz on the left, my laptop's 4K@60Hz, and a 4K@60Hz on the right, and KDE gives me that right out of the box.
    Another flaw of Gnome then.

