Linux 3.2 To 3.8 Kernels With Intel Ivy Bridge Graphics

  • Linux 3.2 To 3.8 Kernels With Intel Ivy Bridge Graphics

    Phoronix: Linux 3.2 To 3.8 Kernels With Intel Ivy Bridge Graphics

    With the Intel Haswell product launch coming up soon, here's a look at how the Intel "Ivy Bridge" HD 4000 graphics support has matured on the seven most recent Linux kernel releases. This benchmarking shows how the performance of the Intel DRM driver has changed between the Linux 3.2 kernel and the Linux 3.8 kernel that's presently under development when using the integrated graphics found on the latest-generation Core i7 CPU.


  • #2
    Kernel 3.8 has regressed in many games; can you find the bad commit, Michael?
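    For reference, hunting that down is usually a git bisect between the last good and first bad kernel (e.g. v3.7 as good and a 3.8-rc tag as bad) with an automated good/bad verdict. Here's a rough sketch of the verdict script such a bisect would use; the ./run-benchmark.sh wrapper and the 80 fps threshold are only placeholders.

    Code:
    #!/usr/bin/env python3
    # Verdict script sketch for a performance bisect: exit 0 means "good",
    # exit 1 means "bad". ./run-benchmark.sh is a hypothetical wrapper that
    # prints a single average-FPS number for the game being tested.
    import subprocess
    import sys

    THRESHOLD_FPS = 80.0  # assumed frame rate of the known-good kernel

    out = subprocess.run(["./run-benchmark.sh"], capture_output=True, text=True, check=True)
    fps = float(out.stdout.strip())

    sys.exit(0 if fps >= THRESHOLD_FPS else 1)

    The bisect itself is set up with git bisect start <first-bad-rc> v3.7 in the kernel tree; since each step needs a rebuilt kernel and a reboot, the verdict is easiest to give by hand with git bisect good / git bisect bad after running the script on each candidate kernel.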



    • #3
      Originally posted by BO$$
      It's good to know that in a year nothing has been done performance-wise. Even worse, it seems to be slower these days, but I think that is just a temporary regression.

      Man, you have read the labels wrong.

      3.2 was the oldest one :P

      OK, joking aside: it just means that for 3D content, Mesa matters more than the kernel.



      • #4
        I expect you'll see more difference with the Mesa benchmarks. The kernel code mostly exposes capabilities that userspace code can exercise.

        On the other hand, the whole exercise is complicated a bit by the simultaneous advances in GL levels... it's not unusual for game code to take advantage of new GL features, so you end up getting more work done by the graphics stack and a slicker-looking result, but performance stays constant or even drops a bit.

        If you really want to isolate performance from functionality, I don't think there is any general alternative to intercepting and fudging the capabilities the driver exposes to the application, so that the game is forced down the same code paths even if a later driver enables more GL extensions and hence more complex rendering code in the application. IIRC a few games can force specific rendering paths even when the driver stack allows better ones, but I'm not really sure about that.
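        One practical hook on the Mesa drivers is the MESA_EXTENSION_OVERRIDE environment variable (and, if I remember right, MESA_GL_VERSION_OVERRIDE), which lets you hide extensions so the game picks the same rendering path on every driver stack. A small sketch follows; the extension names and the ./openarena binary are just examples.

        Code:
        #!/usr/bin/env python3
        # Sketch: hide newer GL extensions from the game so every driver stack
        # advertises the same feature set. Extension names and the game binary
        # below are examples only.
        import os
        import subprocess

        env = dict(os.environ)
        # A leading '-' removes the extension from the advertised list.
        env["MESA_EXTENSION_OVERRIDE"] = "-GL_EXT_framebuffer_multisample -GL_ARB_texture_float"
        # Optionally pin the reported GL version as well.
        env["MESA_GL_VERSION_OVERRIDE"] = "2.1"

        subprocess.run(["./openarena"], env=env)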



        • #5
          I wonder: if the Intel driver used Gallium3D instead of the classic Mesa driver, would it be faster today (and would it be more or less stable)?



          • #6
            Any news about that regression in kernel 3.8?
