Intel Core i7 1165G7 "Tiger Lake" Linux Performance With The Dell XPS 13 9310

  • #11
    damn, why is frequency scaling so broken...


  • #12
    Originally posted by treba View Post
    We just ordered a bunch of Tuxedo Pulse 14 laptops with the Ryzen 4800H - I assume they will run circles around what's shown here :/
    Can you give Michael ssh access to one of them pretty please?


  • #13
    The Core i7 1165G7's CPU package power consumption averaged 19 Watts across the benchmarks, with a reported peak of 46 Watts.
    So it's basically running at a 15W sustained load with only the short 28-second boost window. This affects longer, heavier benchmarks in particular and explains the mediocre performance. The question is whether this is Dell or Linux related.
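    The average-power figure above can be checked directly from Linux's RAPL counters; a minimal sketch, assuming the intel-rapl powercap sysfs interface is present (the path and helper names are ours, not from the article):

```python
# Compute average CPU package power from two Intel RAPL energy samples.
# /sys/class/powercap/intel-rapl:0/energy_uj accumulates microjoules;
# sampling it twice and dividing by the elapsed time gives average watts.
RAPL_ENERGY_PATH = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj(path=RAPL_ENERGY_PATH):
    """Read the cumulative package energy counter in microjoules."""
    with open(path) as f:
        return int(f.read())

def avg_package_watts(energy_start_uj, energy_end_uj, seconds):
    """Average package power in watts between two energy_uj samples."""
    return (energy_end_uj - energy_start_uj) / 1e6 / seconds

# 19,000,000 uJ consumed over one second is 19 W -- the sustained
# average reported for the 1165G7 in the article.
print(avg_package_watts(0, 19_000_000, 1.0))  # -> 19.0
```

    Sampling the counter before and after a benchmark run separates the short boost spike from the sustained average.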


  • #14
    Originally posted by birdie View Post

    Single threaded performance regressions look outrageous. The 1065G7 boosts up to 3.9 GHz, the 1165G7 boosts up to 4.7 GHz and loses?
    Check the test, it isn't boosting to 4.7 GHz. It boosts up to 3100 MHz, which is laughable.
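    The observed-clock claim is easy to verify from userspace by polling cpufreq; a minimal sketch, assuming the standard sysfs layout (helper names are ours):

```python
# Report the highest current core clock via the cpufreq sysfs interface.
# scaling_cur_freq is reported in kHz.
import glob

def khz_to_mhz(khz):
    """cpufreq reports kHz; convert to MHz for readability."""
    return khz // 1000

def max_core_mhz():
    """Highest current frequency in MHz across all online CPUs."""
    paths = glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq")
    freqs = [int(open(p).read()) for p in paths]
    return khz_to_mhz(max(freqs)) if freqs else None

# A core pinned at 3,100,000 kHz is only 3100 MHz -- far below the
# 1165G7's advertised 4.7 GHz single-core boost.
print(khz_to_mhz(3_100_000))  # -> 3100
```

    Polling this in a loop while a single-threaded benchmark runs shows whether the chip ever reaches its rated boost clock.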


  • #15
    Typo:

    Originally posted by phoronix View Post
    Likewise with Tenent's NCNN inference framework there


  • #16
    Is the laptop vPro? Because the new hardware-based mitigations are available on vPro-branded Intel systems.
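    Whether mitigations are handled in silicon or by software workarounds on a given machine can be read from the kernel's vulnerability reports; a minimal sketch (the parse helpers are ours):

```python
# List the kernel's per-vulnerability status lines and flag the ones
# still needing a software workaround. A CPU with in-silicon fixes
# reports "Not affected" instead of "Mitigation: ...".
import glob
import os

def mitigation_status():
    """Map each vulnerability name to the kernel's status string."""
    result = {}
    for path in glob.glob("/sys/devices/system/cpu/vulnerabilities/*"):
        with open(path) as f:
            result[os.path.basename(path)] = f.read().strip()
    return result

def needs_software_mitigation(status_line):
    """True when the kernel applies a software workaround."""
    return status_line.startswith("Mitigation:")

print(needs_software_mitigation("Mitigation: Enhanced IBRS"))  # -> True
print(needs_software_mitigation("Not affected"))               # -> False
```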


  • #17
    Originally posted by HEL88 View Post

    Dell XPS is a way better laptop than these Lenovos. 250/300 vs 500 nits?? Really? etc.
    The HP Tiger Lakes are all 250 nits as well.


  • #18
    For me, this benchmark is all about the Xe graphics. I'm looking forward to 8K graphics with HDMI 2.1 out of an Intel NUC. Here is proof that Xe graphics with a Mesa driver is real.


  • #19
    Originally posted by Clive McCarthy View Post
    For me, this benchmark is all about the Xe graphics. I'm looking forward to 8K graphics with HDMI 2.1 out of an Intel NUC. Here is proof that Xe graphics with a Mesa driver is real.
    Yeah, it's interesting to have a better GPU from Intel, but if they keep this horrible pricing scheme it's not worth it at all. Plus, I seriously doubt that puny GPU can handle much at 4K, let alone 8K, and the same goes for the Renoir GPU. Simply put, DDR4 does not have enough bandwidth, but it could be interesting for the next models on DDR5.


  • #20
    For my purposes, I don't really care about raw GPU performance. I'm not a game player or a game developer; I am interested in 8K for my artwork. As far as I know there are no 8K graphics cards out there with HDMI 2.1 at any price or any level. Nvidia & AMD provide hardware for the game market, which isn't about 4K, let alone 8K. I'm told that most game players only have HD displays but want faster frame rates. For me, 30fps is fine.

    The current Intel GPUs handle 4K well and have done so for many years. I have many artworks that use Intel NUCs at 4K.
