
Sandy Bridge Becomes Quicker With Linux 3.3, 3.4


  • Sandy Bridge Becomes Quicker With Linux 3.3, 3.4

    Phoronix: Sandy Bridge Becomes Quicker With Linux 3.3, 3.4

    With the release of the Linux 3.3 kernel imminent and the Linux 3.4 drm-next tree already offering lots of changes, here are some Intel Sandy Bridge benchmarks comparing the Linux 3.2 kernel to a near-final Linux 3.3 kernel, and then to the drm-next kernel, which is largely a 3.3 kernel but carries the DRM driver code that will work its way into Linux 3.4.

    http://www.phoronix.com/vr.php?view=17157

  • #2
    Thank you, Intel!

  • #3
    Speed-up in Ubuntu 12.04

    I thought the major performance speed-up in kernel 3.3 was thanks to RC6 support, wasn't it? And Ubuntu 12.04 has backported RC6 support for Intel.
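    For reference, on 3.2/3.3-era kernels RC6 could be forced on through an i915 module parameter. A minimal config sketch, assuming a GRUB2 setup like Ubuntu's; the parameter name and its default changed between kernel versions, so check your own kernel's documentation:

    ```shell
    # /etc/default/grub (assumption: GRUB2, Ubuntu-style layout)
    # i915.i915_enable_rc6=1 force-enables RC6 on ~3.2/3.3-era kernels;
    # run update-grub and reboot for it to take effect
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash i915.i915_enable_rc6=1"
    ```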

  • #4
    How many more speed-ups can Intel squeeze out?

    Now I'm really wondering how the Linux performance compares to the Windows performance, since they seem to keep improving the Linux driver all the time.

  • #5
    Originally posted by bongmaster2
    Yeah, Ivy Bridge and Haswell will make middle-class laptop graphics cards obsolete. The only thing missing is default VA-API support for GStreamer, Flash, and HTML5/VP8.
    That is very unlikely. It can be said of AMD's Trinity and future APUs, but Intel will probably continue to lag around the bottom end of whatever the current generation is at the time. Particularly given that I doubt they'll be competitive by Ivy Bridge, and by the time Haswell comes out AMD will have had time to fully optimize GCN, and we have yet to see where Kepler stands.

  • #6
    Originally posted by bongmaster2
    Well, Ivy Bridge is close to Llano; there is only 10-20% missing. Trinity of course will dramatically increase the performance from roughly a 6570 to a 7750 this summer, while Intel will reach that level with Haswell in spring 2013.
    Ivy Bridge might be close to Llano; however, it will be competing with Trinity and the rest of the 7xxx-series lineup. And I seriously doubt that Haswell, which will be competing with Trinity's successor and the 8xxx-series lineup, will go much past Llano unless they've rearchitected their GPU.

  • #7
    Originally posted by bongmaster2
    Yes, AMD will always be one generation ahead, but Haswell will be a mega step forward; they are focusing almost entirely on the GPU. While Ivy Bridge gives roughly a 20-50% performance boost, Haswell is going to add 50% and more on top of Ivy Bridge.
    Okay, let's just say for a moment that you're right and somehow Haswell does manage to pull off a big boost (I don't trust that, given Intel's GPU history). Even so, it'll still be at a major performance deficit versus whatever the current midrange is at the time. Plus, the kinds of people who go for midrange discrete graphics in a laptop aren't going to go for Intel GPUs anyway.

  • #8
    Originally posted by bongmaster2
    Haswell will be a boost like the one from GMA to Sandy Bridge. Yeah, Haswell will be lower midrange, but that will be enough for all games if you run them at 1366x768 on high-to-max details. Trinity of course will provide high-midrange performance at least six months earlier.
    All games, or all Linux games? If we're talking just native Linux games, I'll concede that point, but all games in general... no.

  • #9
    Originally posted by bongmaster2
    Yeah, Ivy Bridge and Haswell will make middle-class laptop graphics cards obsolete. The only thing missing is default VA-API support for GStreamer, Flash, and HTML5/VP8.
    Haswell might, but Ivy Bridge is only going to knock out low-end cards.
    As for GStreamer, it drives me crazy as well. I know that splitted-desktop has the gstreamer-vaapi plugin, and that, along with the i965 driver and libva, should give you hardware acceleration, but I've never managed to get it to work on Fedora.
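    For anyone else fighting with this stack, a rough sketch of how the pieces fit on GStreamer 0.10-era systems; the file name is a placeholder and package availability varies by distro, so treat this as an assumption-laden example rather than a recipe:

    ```shell
    # Verify that libva and the i965 driver can talk to the hardware;
    # vainfo lists the decode profiles the driver exposes
    vainfo

    # Hypothetical playback pipeline using the gstreamer-vaapi elements
    # (vaapidecode/vaapisink); video.mp4 is a placeholder file
    gst-launch-0.10 filesrc location=video.mp4 ! qtdemux ! vaapidecode ! vaapisink
    ```

    If vainfo itself errors out, the problem sits below GStreamer, in libva or the i965 driver, which would match the Fedora experience described above.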

  • #10
    Originally posted by bongmaster2
    Haswell will be a boost like the one from GMA to Sandy Bridge. Yeah, Haswell will be lower midrange, but that will be enough for all games if you run them at 1366x768 on high-to-max details. Trinity of course will provide high-midrange performance at least six months earlier.
    I think you're right about the performance increase, but that won't mean it's able to run any game at high settings @768.

  • #11
    Originally posted by bongmaster2
    Yes, AMD will always be one generation ahead, but Haswell will be a mega step forward; they are focusing almost entirely on the GPU. While Ivy Bridge gives roughly a 20-50% performance boost, Haswell is going to add 50% and more on top of Ivy Bridge.
    Yep, with a mega price increase as well. Congrats!

    Originally posted by bongmaster2
    Haswell will be a boost like the one from GMA to Sandy Bridge. Yeah, Haswell will be lower midrange, but that will be enough for all games if you run them at 1366x768 on high-to-max details. Trinity of course will provide high-midrange performance at least six months earlier.
    Hm, with the likes of the GT 540M or 6650M, middle-class laptop GPUs that still struggle to run high details @768, you expect a Haswell GPU to run that fluidly? What are you drinking, sir? http://www.notebookcheck.net/NVIDIA-...M.41715.0.html

  • #12
    Originally posted by liam View Post
    I think you're right about the performance increase, but that won't mean it's able to run any game at high settings @768.
    Just a guess from my side, and please do not take this as an official comment about future GPU architectures.

    My opinion is that with ultrabooks, notebooks, and smartphones, the highest resolution that still matters is the one your notebook/ultrabook/phone supports, which lately is standardized at around 1366x768, 1600x900, or even full HD (1920x1080) for 90% of them. If you can run any 3D application at 60 fps, that should cover the essential needs of pretty much everyone out there.

    And if you want to run higher-resolution monitors with ultimate game settings, you are going to use a dedicated graphics card anyway, no matter how good integrated cards are.
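    The arithmetic behind that 60 fps target is easy to sketch. A small illustrative calculation, using the resolutions listed above; the Mpx/s figures are just pixels-per-frame times frame rate, not a real GPU benchmark:

    ```python
    # Raw pixel throughput implied by a 60 fps target at the common
    # laptop resolutions mentioned above (illustrative arithmetic only)
    resolutions = {
        "1366x768": (1366, 768),
        "1600x900": (1600, 900),
        "1920x1080": (1920, 1080),
    }

    for name, (w, h) in resolutions.items():
        pixels = w * h                 # pixels per frame
        mpx_per_s = pixels * 60 / 1e6  # megapixels per second at 60 fps
        print(f"{name}: {pixels:,} px/frame, {mpx_per_s:.1f} Mpx/s")
    ```

    Full HD pushes roughly twice the pixels of 1366x768 every frame, which is why "any 3D application at 60 fps" gets steadily harder as panel resolutions grow.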

  • #13
    Originally posted by eugeni_dodonov View Post
    Just a guess from my side, and please do not take this as an official comment about future GPU architectures.

    My opinion is that with ultrabooks, notebooks, and smartphones, the highest resolution that still matters is the one your notebook/ultrabook/phone supports, which lately is standardized at around 1366x768, 1600x900, or even full HD (1920x1080) for 90% of them. If you can run any 3D application at 60 fps, that should cover the essential needs of pretty much everyone out there.

    And if you want to run higher-resolution monitors with ultimate game settings, you are going to use a dedicated graphics card anyway, no matter how good integrated cards are.
    I've been hearing stirrings about "4K" screens (and of course the marketing types screwed up the only sane naming scheme there was for resolutions; the 4K refers to the horizontal dimension), and while I don't expect laptops will be getting those, it does point to a pixel-density increase in the near future.

  • #14
    Originally posted by Luke_Wolf View Post
    I've been hearing stirrings about "4K" screens (and of course the marketing types screwed up the only sane naming scheme there was for resolutions; the 4K refers to the horizontal dimension), and while I don't expect laptops will be getting those, it does point to a pixel-density increase in the near future.
    Ditto on the naming scheme.
    The 4K prototypes look amazing, but last year Sharp demoed an 8K display. So that, along with the new iPad and even the upcoming Transformer Prime Infinity, leads me to think that ~200+ DPI screens for laptops are very possible. The main problem will be the UI. GTK had (has?) a resolution-independent branch, but I don't know how well it works.

  • #15
    Originally posted by eugeni_dodonov View Post
    Just a guess from my side, and please do not take this as an official comment about future GPU architectures.

    My opinion is that with ultrabooks, notebooks, and smartphones, the highest resolution that still matters is the one your notebook/ultrabook/phone supports, which lately is standardized at around 1366x768, 1600x900, or even full HD (1920x1080) for 90% of them. If you can run any 3D application at 60 fps, that should cover the essential needs of pretty much everyone out there.

    And if you want to run higher-resolution monitors with ultimate game settings, you are going to use a dedicated graphics card anyway, no matter how good integrated cards are.
    Is there any slight probability of Intel delivering an IGP-capable Xeon whose graphics cores could be combined, SLI-style, on 2- or 4-socket mainboards?
    I understand this might sound unrealistic, but currently NVIDIA's SLI support is completely unoptimized, and AMD's CrossFire is N+1 pass-through.
