The First Experience Of Intel Haswell On Linux


  • #31
    Originally posted by Sonadow View Post
    Chances are it will get picked up, especially by Apple. I can totally imagine them using those processors in their MacBook Pro lines, especially the 13" Retina series, if only to have the HD 5200 drive that ultra high-res display.
    You do realize you're wanting to put a minimum 37 W TDP part not just in a 13.3" laptop but in a MacBook Pro, right? And it's more like 47 W TDP if they don't pick the slightly lower-power one. Not only that, but these parts cost $468+, which, even if we ignored the thermal issue, would cut deeply into their profit margins.

    Now, Alienware I would expect to offer it as an option, but I doubt people would choose it over the cheaper variants based on the 4600, using the savings to get a better dGPU.

    • #32
      Just going to point it out since no one else has: people complaining about Haswell drawing more power don't know what they're talking about.
      The TDP is higher, but the TDP is *not* the total draw. If you were paying attention, you would have noticed that one of the biggest differences with Haswell is that it moves the voltage regulator on-die.
      For the past decade, AMD and Intel have been racing each other to incorporate more components into the CPU die. Memory controllers, integrated GPUs, northbridges, and southbridges have all been pulled toward a single package, known as a system-on-a-chip (SoC). Now, with Haswell, Intel is set to integrate another important piece of circuitry. When...

      As in, something extremely power-hungry that used to be on the motherboard is now on the CPU. Of course the TDP is going to go up. But TDP isn't power consumption. The power consumption of the hardware as a whole went down, as others have said, by as much as 50%.

      This is bad news for overclockers, since that's extra heat (that used to be dissipated elsewhere) they now have to remove to keep the CPU running, but it is good news for mobile.
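
      Since this trips people up, here's a rough back-of-the-envelope sketch in Python. Every wattage in it is hypothetical, purely to illustrate how package TDP can rise while total platform draw falls once the regulator's losses move on-package:

      # Illustrative only: all wattages below are made up, not measurements.
      # Moving the voltage regulator (VR) on-package shifts its losses into
      # the CPU's thermal budget without adding any new load to the platform.

      def platform_power(cpu_w, vr_loss_w, rest_of_board_w):
          # Wall-side draw is the same parts summed, wherever they sit.
          return cpu_w + vr_loss_w + rest_of_board_w

      # Ivy Bridge-style layout: VR on the motherboard, outside the package.
      ivb_package_heat = 35.0                          # what the CPU cooler sees
      ivb_platform = platform_power(35.0, 6.0, 10.0)   # 51.0 W total

      # Haswell-style layout: VR on-package, assumed slightly more efficient.
      hsw_package_heat = 35.0 + 4.0                    # core + VR share one TDP
      hsw_platform = platform_power(35.0, 4.0, 10.0)   # 49.0 W total

      print(f"Package heat: {ivb_package_heat} W -> {hsw_package_heat} W (TDP up)")
      print(f"Platform draw: {ivb_platform} W -> {hsw_platform} W (power down)")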

      • #33
        Originally posted by Luke_Wolf View Post
        You do realize you're wanting to put a minimum 37 W TDP part not just in a 13.3" laptop but in a MacBook Pro, right? And it's more like 47 W TDP if they don't pick the slightly lower-power one. Not only that, but these parts cost $468+, which, even if we ignored the thermal issue, would cut deeply into their profit margins.

        Now, Alienware I would expect to offer it as an option, but I doubt people would choose it over the cheaper variants based on the 4600, using the savings to get a better dGPU.
        They'll save power by going with this instead of a discrete GPU. Yes, it will be more expensive, which is why I doubt many PC vendors will pick it up, but that's never been a problem for Apple.
        Last edited by smitty3268; 02 June 2013, 05:08 PM.

        • #34
          Originally posted by tga.d View Post
          Just going to point it out since no one else has: people complaining about Haswell drawing more power don't know what they're talking about.
          The TDP is higher, but the TDP is *not* the total draw. If you were paying attention, you would have noticed that one of the biggest differences with Haswell is that it moves the voltage regulator on-die.
          For the past decade, AMD and Intel have been racing each other to incorporate more components into the CPU die. Memory controllers, integrated GPUs, northbridges, and southbridges have all been pulled toward a single package, known as a system-on-a-chip (SoC). Now, with Haswell, Intel is set to integrate another important piece of circuitry. When...

          As in, something extremely power-hungry that used to be on the motherboard is now on the CPU. Of course the TDP is going to go up. But TDP isn't power consumption. The power consumption of the hardware as a whole went down, as others have said, by as much as 50%.

          This is bad news for overclockers, since that's extra heat (that used to be dissipated elsewhere) they now have to remove to keep the CPU running, but it is good news for mobile.
          Correct: TDP has nothing directly to do with power consumption. TDP, as the acronym spells out, is Thermal Design Power, basically how much exhaust heat the CPU puts out. And ~40-50 watts of thermal energy in a tiny laptop case, particularly a MacBook Pro, where the cooling solution is particularly bad, is asking for some serious overheating issues.

          That all said, increased TDP usually goes hand in hand with increased power draw, and AnandTech, X-bit labs, and other such sites are all showing increased draw under load. So either you haven't paid any attention to the benchmarking sites or Intel really pulled one over on you.
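
          If anyone wants numbers from their own machine instead of the review sites, here's a minimal Python sketch that samples the CPU package's energy counter. It assumes a kernel that exposes the intel_rapl powercap interface, and the sysfs path is an assumption that can vary by system:

          # Sketch: average package power over a 5-second window.
          # Assumes /sys/class/powercap/intel-rapl:0 exists (intel_rapl driver);
          # counter wraparound is ignored for brevity.
          import time

          RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # microjoules

          def read_energy_uj():
              with open(RAPL) as f:
                  return int(f.read())

          e0 = read_energy_uj()
          time.sleep(5.0)          # run your workload during this window
          e1 = read_energy_uj()

          print(f"Average package draw: {(e1 - e0) / 1e6 / 5.0:.1f} W")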

          • #35
            Originally posted by smitty3268 View Post
            They'll save power by going with this instead of a discrete GPU. Yes, it will be more expensive, which is why I doubt many PC vendors will pick it up, but that's never been a problem for Apple.
            Here you may not have caught something: the 13.3" MacBook Pro, even with Retina, doesn't have a dGPU. It uses straight-up Ivy Bridge (http://www.apple.com/why-mac/compare/notebooks.html), and a dual-core i5 or i7 at that. Now, if they can get away with that, would you care to explain to me why they would go with the 5200 as opposed to the 4600? Furthermore, when they already have to use dual-core parts to stay within the TDP capacity of the design, why would they go with parts that dump that much more thermal energy?

            • #36
              Originally posted by Luke_Wolf View Post
              Correct: TDP has nothing directly to do with power consumption. TDP, as the acronym spells out, is Thermal Design Power, basically how much exhaust heat the CPU puts out. And ~40-50 watts of thermal energy in a tiny laptop case, particularly a MacBook Pro, where the cooling solution is particularly bad, is asking for some serious overheating issues.

              That all said, increased TDP usually goes hand in hand with increased power draw, and AnandTech, X-bit labs, and other such sites are all showing increased draw under load. So either you haven't paid any attention to the benchmarking sites or Intel really pulled one over on you.
              Tech Report's review has a page comparing what they call "task power" for an x264 encode. Haswell has a higher load draw, but it gets the job done sooner, so it uses less total energy to complete it.

              Recent x264 builds take advantage of AVX2 and FMA instructions, so the result may not apply broadly to all tasks, but it's still an interesting counterpoint to the "BUT THE LOAD POWER DRAW IS HIGHER!1!1!" complaints.
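
              The distinction is just energy = power x time. A quick sketch with hypothetical numbers (not Tech Report's measurements) to show how a higher load draw can still mean less total energy:

              # Hypothetical figures for illustration only.
              # Energy (what the battery or power bill sees) = avg power x time.

              def task_energy_wh(avg_watts, seconds):
                  return avg_watts * seconds / 3600.0

              old_chip = task_energy_wh(avg_watts=65.0, seconds=120.0)  # slower, lower draw
              new_chip = task_energy_wh(avg_watts=75.0, seconds=90.0)   # faster, higher draw

              # Prints 2.17 Wh vs 1.88 Wh: higher draw, but less energy per job.
              print(f"old: {old_chip:.2f} Wh, new: {new_chip:.2f} Wh")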

              • #37
                Originally posted by Luke_Wolf View Post
                Here you may not have caught something: the 13.3" MacBook Pro, even with Retina, doesn't have a dGPU. It uses straight-up Ivy Bridge (http://www.apple.com/why-mac/compare/notebooks.html), and a dual-core i5 or i7 at that. Now, if they can get away with that, would you care to explain to me why they would go with the 5200 as opposed to the 4600? Furthermore, when they already have to use dual-core parts to stay within the TDP capacity of the design, why would they go with parts that dump that much more thermal energy?
                Because Apple wants more GPU power. They would have used a discrete card instead of that Ivy Bridge HD 4000 if they could have, but they decided it would use too much power.

                With the 5200, they still won't get as much performance as a discrete card, but it won't take as much power either. The power/performance curve has shifted.

                I believe this is likely exactly what Apple was asking for; you don't think Intel did this without asking OEMs what they wanted, do you? I don't.

                That said, we'll see. Maybe it will just be for the all in one machines, or larger laptops.

                • #38
                  Originally posted by smitty3268 View Post
                  Because Apple wants more GPU power. They would have used a discrete card instead of that Ivy Bridge HD 4000 if they could have, but they decided it would use too much power.

                  With the 5200, they still won't get as much performance as a discrete card, but it won't take as much power either. The power/performance curve has shifted.

                  I believe this is likely exactly what Apple was asking for; you don't think Intel did this without asking OEMs what they wanted, do you? I don't.

                  That said, we'll see. Maybe it will just be for the all in one machines, or larger laptops.
                  Again, I believe the 5200 is purely for sites like AnandTech, not for consumers and not for OEMs. And sure, if Apple could have, they would have, but here's the problem: they didn't have the thermal headroom to put in a quad-core before, so why would they now? Unless you want to suggest that Intel build a special processor line just for them that isn't available to other customers, because none of the parts currently released are dual-core, and they run hotter and draw more power than the previous generation.

                  • #39
                    Originally posted by smitty3268 View Post
                    Because Apple wants more GPU power. They would have used a discrete card instead of that Ivy Bridge HD 4000 if they could have, but they decided it would use too much power.

                    With the 5200, they still won't get as much performance as a discrete card, but it won't take as much power either. The power/performance curve has shifted.

                    I believe this is likely exactly what Apple was asking for; you don't think Intel did this without asking OEMs what they wanted, do you? I don't.

                    That said, we'll see. Maybe it will just be for the all in one machines, or larger laptops.
                    I don't think we'll see it in Apple products except for the iMac, which is the perfect place for it. The smaller MacBooks simply can't take a quad-core: they are so thin that they can only fit small cooling solutions, and the heat dissipation doesn't work well with quad-core CPUs. There are 14-inch high-end quad-core units from other companies like Lenovo and HP, but they are thicker, with more internal space, and therefore able to handle bigger cooling solutions. That's why the quad cores are found in the 15-inch MacBooks.

                    • #40
                      Any test of Haswell is irrelevant without also testing it against the AMD A10-5800K and A10-6800K, and optionally the A8-3870K as well.

                      RAM speeds should of course include the recommended baseline speed and timings as well as overclocked states, to see if Intel's GPU architecture scales as well as AMD's with increased memory bandwidth (see the sketch below). Before you complain: 2x4GB of DDR3-2133 is dirt cheap, even with a default voltage as low as 1.5 V. It's not like I'd expect Larabel to kill his beer and beanbag chair budget on DDR3-2800.
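
                      For reference, the theoretical peaks the scaling question hinges on; a sketch only, since real-world bandwidth lands well below these:

                      # Theoretical peak for dual-channel DDR3 (64-bit, i.e. 8-byte channels).
                      def ddr3_peak_gbs(mt_per_s, channels=2, bytes_per_xfer=8):
                          return mt_per_s * 1e6 * channels * bytes_per_xfer / 1e9

                      for speed in (1333, 1600, 1866, 2133):
                          print(f"DDR3-{speed}: {ddr3_peak_gbs(speed):.1f} GB/s peak")

                      # DDR3-1600 -> 25.6 GB/s, DDR3-2133 -> 34.1 GB/s: roughly a third
                      # more headroom for an iGPU that shares that memory bus.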
                      Last edited by Kivada; 04 June 2013, 04:23 AM.
