Sandy Bridge Becomes Quicker With Linux 3.3, 3.4


  • #11
    Originally posted by bongmaster2
    Yes, AMD will always be one generation ahead, but Haswell will be a mega step forward; they are almost exclusively focusing on the GPU. While Ivy Bridge gives a ca. 20-50% performance boost, Haswell is going to add 50% and more on top of Ivy Bridge.
    Yep, with a mega price increase as well. Congrats!

    Originally posted by bongmaster2
    Haswell will be a boost like the one from GMA to Sandy Bridge. Yes, Haswell will be lower midrange, but that will be enough for all games if you run them at 1366x768 on high-to-max details. Trinity, of course, will provide high-midrange performance at least six months earlier.
    Hm... with the likes of the GT 540M or 6650M, midrange laptop GPUs that still struggle to run high details at 1366x768, you expect a Haswell GPU to run that fluidly? What are you drinking, sir? http://www.notebookcheck.net/NVIDIA-...M.41715.0.html



    • #12
      Originally posted by liam
      I think you're right about the performance increase, but that won't mean it's able to run any game at high settings at 1366x768.
      Just a guess from my side, and please do not take this as an official comment about future GPU architectures.

      My opinion is that with Ultrabooks, notebooks, and smartphones, the highest resolution that still matters is the one your notebook/ultrabook/phone supports, which lately is standardized at around 1366x768, 1600x900, or even full HD (1920x1080) for 90% of them. If you could run any 3D application at 60 fps, that would cover the essential needs of pretty much everyone out there.

      And if you want to run higher-resolution monitors with ultimate game settings, you are going to use a dedicated graphics card anyway, no matter how good integrated cards get.
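
      As a rough back-of-the-envelope illustration (an editor's sketch, not part of the original post): the raw pixel throughput a 60 fps target implies at the resolutions above can be computed directly, and full HD pushes roughly twice the pixels per frame of 1366x768.

          # Rough pixel-throughput arithmetic for the resolutions mentioned above.
          # Illustrative only: real GPU load depends on shader complexity,
          # overdraw, and memory bandwidth, not just raw pixel counts.
          RESOLUTIONS = {
              "1366x768":  (1366, 768),
              "1600x900":  (1600, 900),
              "1920x1080": (1920, 1080),
          }
          FPS = 60
          for name, (w, h) in RESOLUTIONS.items():
              mpix_frame = w * h / 1e6
              mpix_sec = mpix_frame * FPS
              print(f"{name}: {mpix_frame:.2f} Mpix/frame, {mpix_sec:.0f} Mpix/s at {FPS} fps")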



      • #13
        Originally posted by eugeni_dodonov
        Just a guess from my side, and please do not take this as an official comment about future GPU architectures.

        My opinion is that with Ultrabooks, notebooks, and smartphones, the highest resolution that still matters is the one your notebook/ultrabook/phone supports, which lately is standardized at around 1366x768, 1600x900, or even full HD (1920x1080) for 90% of them. If you could run any 3D application at 60 fps, that would cover the essential needs of pretty much everyone out there.

        And if you want to run higher-resolution monitors with ultimate game settings, you are going to use a dedicated graphics card anyway, no matter how good integrated cards get.
        I've been hearing stirrings about "4K" screens (and of course the marketing types screwed up the only sane naming scheme there was for resolutions; "4K" refers to the horizontal pixel count), and while I don't expect laptops will be getting those, it does point to a pixel-density increase in the near future.
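
        For context on the naming complaint (an illustrative aside, not part of the original post): the older "p" designations count vertical lines, while "4K" counts horizontal pixels, e.g.:

            # "p" names count vertical lines; "4K" counts horizontal pixels.
            named_by_height = {"720p": (1280, 720), "1080p": (1920, 1080)}
            named_by_width = {"4K UHD": (3840, 2160)}  # ~4000 horizontal pixels
            for name, (w, h) in {**named_by_height, **named_by_width}.items():
                print(f"{name}: {w}x{h}")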



        • #14
          Originally posted by Luke_Wolf
          I've been hearing stirrings about "4K" screens (and of course the marketing types screwed up the only sane naming scheme there was for resolutions; "4K" refers to the horizontal pixel count), and while I don't expect laptops will be getting those, it does point to a pixel-density increase in the near future.
          Ditto on the naming scheme.
          The 4K prototypes look amazing, but last year Sharp demoed an 8K display. That, along with the new iPad or even the upcoming Transformer Prime Infinity, leads me to think that ~200+ dpi screens for laptops are very possible. The main problem will be the UI. GTK had (has?) a resolution-independent branch, but I don't know how well it works.
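
          To put the ~200+ dpi figure in context (an editor's sketch; the iPad figure is the shipped 2048x1536 9.7" panel, while the 13.3" full-HD laptop panel is a hypothetical comparison point), pixel density follows directly from resolution and diagonal size:

              import math

              # dpi = diagonal pixel count / diagonal size in inches
              def dpi(width_px, height_px, diagonal_in):
                  return math.hypot(width_px, height_px) / diagonal_in

              print(f'iPad (3rd gen), 2048x1536 @ 9.7": {dpi(2048, 1536, 9.7):.0f} dpi')  # ~264 dpi
              print(f'13.3" laptop, 1920x1080: {dpi(1920, 1080, 13.3):.0f} dpi')          # ~166 dpi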



          • #15
            Originally posted by eugeni_dodonov
            Just a guess from my side, and please do not take this as an official comment about future GPU architectures.

            My opinion is that with Ultrabooks, notebooks, and smartphones, the highest resolution that still matters is the one your notebook/ultrabook/phone supports, which lately is standardized at around 1366x768, 1600x900, or even full HD (1920x1080) for 90% of them. If you could run any 3D application at 60 fps, that would cover the essential needs of pretty much everyone out there.

            And if you want to run higher-resolution monitors with ultimate game settings, you are going to use a dedicated graphics card anyway, no matter how good integrated cards get.
            Is there a slight probability that Intel will deliver IGP-capable Xeons whose graphics cores could be combined, SLI-style, on 2- or 4-socket mainboards?
            I understand this might sound unrealistic, but currently NVIDIA's SLI support is completely unoptimized, and AMD CrossFire setups are N+1 pass-throughs.



            • #16
              Originally posted by bongmaster2
              Yes, AMD will always be one generation ahead, but Haswell will be a mega step forward; they are almost exclusively focusing on the GPU.
              While Ivy Bridge gives a ca. 20-50% performance boost, Haswell is going to add 50% and more on top of Ivy Bridge.

              Much more for the GT3 part, which comes with 40 EUs, stacked DRAM, and possibly further increased IPC. That is a much bigger step than Sandy to Ivy (12 vs. 16 EUs). In the mobile segment, I wouldn't be surprised if they can close the gap to AMD's Kaveri APU.
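
              Taking the EU counts from this post at face value (an editor's sketch; the GT2 labels for the 12- and 16-EU parts are an assumption, and real scaling will be lower than the raw ratios because clocks, bandwidth, and drivers also matter, which is exactly what the stacked DRAM targets):

                  # Naive EU-count scaling between the parts mentioned above.
                  eus = {"Sandy Bridge GT2": 12, "Ivy Bridge GT2": 16, "Haswell GT3": 40}
                  base = eus["Sandy Bridge GT2"]
                  for part, count in eus.items():
                      print(f"{part}: {count} EUs ({count / base:.2f}x Sandy Bridge GT2)")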

