Ubuntu 20.10 / GNOME 3.38 Could See Better Intel Gen9 Graphics Performance


  • #11
    Originally posted by CochainComplex

    Gaming at 4K@60 FPS is also very atypical, because it might not even be possible with an Nvidia RTX 2080.
    Oh, it is perfectly possible. The only barrier is some morons trying to configure the graphics options to max settings on hardware configurations where that is not recommended. I actually saw a "youtuber" activating 8X AA on a 4K monitor and then declaring 4K is not viable on today's cards...

    To give you an example, I have (what is today considered an entry-level setup) an old non-overclocked i7 3770K + an RX 570 4GB card, and together they can render Euro Truck Simulator 2, on Linux, at 4K 60FPS on near-max settings. On Windows the performance is even better.

    So the biggest barrier to 4K 60FPS is not hardware, but some people's pride in claiming their rig can play everything at max settings, even when that makes little to no difference compared to medium/high settings, except tanking your FPS.

    Comment


    • #12
      Originally posted by M@GOid

      Oh, it is perfectly possible. The only barrier is some morons trying to configure the graphics options to max settings on hardware configurations where that is not recommended. I actually saw a "youtuber" activating 8X AA on a 4K monitor and then declaring 4K is not viable on today's cards...

      To give you an example, I have (what is today considered an entry-level setup) an old non-overclocked i7 3770K + an RX 570 4GB card, and together they can render Euro Truck Simulator 2, on Linux, at 4K 60FPS on near-max settings. On Windows the performance is even better.

      So the biggest barrier to 4K 60FPS is not hardware, but some people's pride in claiming their rig can play everything at max settings, even when that makes little to no difference compared to medium/high settings, except tanking your FPS.
      Ok, you are totally right about AA... it stops making sense at higher resolutions. And some settings are really BS.
      But why high settings? Well, before I start making pictures crisper (more detail), I crank up geometry, etc. If I have to choose between nice, smooth graphics versus 4K with lower settings, I will go for the lower resolution with max settings... I don't care if I can play e.g. CS:GO at 4K with 100 FPS if it then looks like Lego.

      Besides, at 4K, lower settings look even worse, because the differences are more obvious. The natural blurring of a lower resolution is not there anymore.

      That is the reason why people go max settings first and max resolution second.
      Isn't that the point of 4K? The changes are more subtle, and you need the higher resolution to see the advantage of them.

      Comment


      • #13
        Originally posted by CochainComplex

        Ok, you are totally right about AA... it stops making sense at higher resolutions. And some settings are really BS.
        But why high settings? Well, before I start making pictures crisper (more detail), I crank up geometry, etc. If I have to choose between nice, smooth graphics versus 4K with lower settings, I will go for the lower resolution with max settings... I don't care if I can play e.g. CS:GO at 4K with 100 FPS if it then looks like Lego.

        Besides, at 4K, lower settings look even worse, because the differences are more obvious. The natural blurring of a lower resolution is not there anymore.

        That is the reason why people go max settings first and max resolution second.
        Isn't that the point of 4K? The changes are more subtle, and you need the higher resolution to see the advantage of them.
        Personally, I don't even have a 4K monitor. I only tested it when I took my rig to a friend's house to try it on his high-end TV.

        I am fond of cranking up texture and geometry settings, because those are the most visible and appealing to me. Lighting effects take a back seat, because I am less sensitive to them and because of how heavy they are, hardware-wise.

        In the end, I choose a middle ground in a game's graphics settings. I don't like aliasing and blurriness, so playing at native resolution is a must. Usually it is not the textures and geometry that kill performance the most, it is the lighting effects. So instead of lowering the resolution (I have a 1080p monitor), I usually crank down lighting settings while preserving texture/geometry.

        The point I was trying to make is that 4K is not the big deal some people think it is. It became very clear to me that, if your rig is high-end (RTX 2080-class), you can play at 4K/60FPS all day long if you avoid some ridiculous settings, like turning AA up to 8X.

        Comment


        • #14
          Originally posted by brent
          UHD 620/630 graphics are simply a bit too slow for a smooth 4K experience, despite the name. Remember, Gen9 graphics go back to 2015 (Skylake) and are more or less unchanged since then, except for some clock speed improvements. Windows has a well-optimized compositor, but the 4K experience is rather sluggish with UHD 620/630 on this OS, too.

          Apple only uses high DPI displays. Do you think it is a surprise that Apple only uses Iris graphics? Nope. Iris iGPUs have roughly twice the number of shader units and they benefit from fast last level cache memory. You simply need that kind of power for smooth 4K desktop rendering.

          So, if GNOME gets some further performance improvements for systems with slow GPUs that's great, but it's unlikely these slow iGPUs will ever work well for 4K.
          I agree. When I got my 4K monitor years ago, I first tested it with the integrated graphics (an HD 530). It was too heavy, and the framerate often dropped to 30 FPS.

          Comment


          • #15
            Originally posted by Mez'
            With Xorg, you need to use cvt and a conf file with xrandr --addmode and --newmode in your xorg.conf.d folder to get anything other than 4K@30Hz.
            This isn't (necessarily?) true. I have three monitors: a 1080p@75Hz on the left, my laptop's 1920x1200@60Hz, and a 4K@60Hz on the right, and KDE gives me all of that right out of the box.
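
            For anyone who does hit a 4K@30 cap, this is roughly what the workaround Mez' describes looks like. A minimal sketch, assuming a hypothetical output name DP-1 (check yours with xrandr --query); the modeline numbers are what cvt prints for 3840x2160 at 60 Hz:

                # Generate a CVT modeline for 3840x2160 at 60 Hz
                cvt 3840 2160 60
                # Register the modeline that cvt printed
                xrandr --newmode "3840x2160_60.00" 712.75 3840 4160 4576 5312 2160 2163 2168 2237 -hsync +vsync
                # Attach the new mode to the output and switch to it
                xrandr --addmode DP-1 "3840x2160_60.00"
                xrandr --output DP-1 --mode "3840x2160_60.00"

            To make it persist across reboots, the same Modeline can go in a Monitor section in a file under /etc/X11/xorg.conf.d/, as Mez' mentions.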

            Comment


            • #16
              Originally posted by kcrudup

              This isn't (necessarily?) true. I have three monitors: a 1080p@75Hz on the left, my laptop's 1920x1200@60Hz, and a 4K@60Hz on the right, and KDE gives me all of that right out of the box.
              Another flaw of Gnome then.

              Comment


              • #17
                Originally posted by Mez'
                Another flaw of Gnome then.
                I am running Ubuntu 20.04 GNOME at 4K 60Hz on NVIDIA, out of the box, without a single issue.

                Comment


                • #18
                  Originally posted by Mez'
                  Another flaw of Gnome then.
                  Not necessarily.
                  I'm running a Dell XPS laptop with its display at 4K/60Hz and an external monitor at 4K/60Hz on Fedora/GNOME/Xorg.

                  Comment


                  • #19
                    Actually, I think it is quite possible to run 4K@60 for desktop tasks on a UHD 630, judging by my 2012 Mac mini with an Intel HD 4000 GPU, which can run two 1080p displays perfectly smoothly at 60 FPS (in macOS, obviously). Two 1080p displays (2 x 1920x1080, about 4.1 megapixels) are exactly half the pixels of 4K (3840x2160, about 8.3 megapixels), and a UHD 630 is on average 3-4 times faster than the old Ivy Bridge graphics.

                    Comment


                    • #20
                      Originally posted by CochainComplex
                      YouTube 4K@60fps feels more like 4K@30fps, but it's not really bad. "Normal" 4K clips (at 29fps?) are fine.
                      Are you using hardware acceleration when decoding video? If not, then the system's performance doesn't have much to do with the GPU.
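
                      A quick way to check on an Intel iGPU, as a sketch assuming the VA-API tools are installed and using mpv as a hypothetical test player:

                          # List the decode profiles the VA-API driver exposes (vainfo ships in libva-utils)
                          vainfo
                          # Ask mpv for VA-API hardware decoding; its log reports whether hwdec is actually in use
                          mpv --hwdec=vaapi some-4k-clip.mp4

                      Note that 4K YouTube in a browser is a separate question: at the time, Linux browsers typically decoded VP9 in software regardless of what the GPU could do.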

                      Comment
