The First Experience Of Intel Haswell On Linux


  • #31
    Originally posted by Sonadow View Post
    Chances are it will get picked up, especially by Apple. I can totally imagine them using those processors in their MacBook Pro lines, especially the 13" Retina series, if only to have the HD 5200 drive that ultra-high-res display.
    You do realize you're wanting to put a part with a minimum 37 W TDP into not just a 13.3" laptop but a MacBook Pro, right? And it's more like a 47 W TDP if they don't pick the slightly lower-power variant. Not only that, but these parts cost $468+, which, even ignoring the thermal issue, would cut deeply into their profit margins.

    Now, I would expect Alienware to offer it as an option, but I doubt people would choose it over the cheaper variants based on the HD 4600, putting the savings toward a better dGPU.



    • #32
      Just going to point it out since no one else has: People complaining about Haswell drawing more power don't know what they're talking about.
      The TDP is higher, but the TDP is *not* the total draw. If you were paying attention, you would have noticed that one of the biggest differences with Haswell is that it moves the voltage regulator on-die.
      http://hothardware.com/News/Haswell-...age-Regulator/
      As in, something extremely power-hungry that used to be on the motherboard is now on the CPU. Of course the TDP is going to go up. But TDP isn't power consumption. The power consumption of the hardware as a whole went down, as others have said, by as much as 50%.

      This is bad news for overclockers, since that's more heat (which used to be generated elsewhere) that you have to dissipate to keep the CPU running, but it's good for mobile.
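
      To make that concrete, here's a back-of-the-envelope sketch in Python. Every wattage figure below is made up for illustration; these are not Intel's actual numbers.

        # Sketch: package TDP can rise while total platform power falls.
        # All wattages are hypothetical, chosen only to show the shape of it.

        # Ivy Bridge-style layout: voltage regulator (VR) on the motherboard.
        ivb_package = 35.0   # W the CPU package must dissipate
        ivb_vr_loss = 8.0    # W lost in the motherboard VR (assumed)
        ivb_platform = ivb_package + ivb_vr_loss

        # Haswell-style layout: the VR moves on-die; assume the integrated
        # VR is also somewhat more efficient (again, an assumed figure).
        hsw_cores = 34.0     # W for the cores and GPU themselves (assumed)
        hsw_vr_loss = 5.0    # W, now dissipated inside the package
        hsw_package = hsw_cores + hsw_vr_loss  # the package carries both
        hsw_platform = hsw_package             # no VR loss left on the board

        print(f"Ivy Bridge-style: package {ivb_package:.0f} W, platform {ivb_platform:.0f} W")
        print(f"Haswell-style:    package {hsw_package:.0f} W, platform {hsw_platform:.0f} W")
        # Package heat rises (35 -> 39 W) while the platform drops (43 -> 39 W):
        # more heat concentrated in one spot, less power drawn overall.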



      • #33
        Originally posted by Luke_Wolf View Post
        You do realize you're wanting to put a part with a minimum 37 W TDP into not just a 13.3" laptop but a MacBook Pro, right? And it's more like a 47 W TDP if they don't pick the slightly lower-power variant. Not only that, but these parts cost $468+, which, even ignoring the thermal issue, would cut deeply into their profit margins.

        Now, I would expect Alienware to offer it as an option, but I doubt people would choose it over the cheaper variants based on the HD 4600, putting the savings toward a better dGPU.
        They'll save power by going with this instead of a discrete GPU. Yes, it will be more expensive, which is why I doubt many PC vendors will pick it up, but that's never been a problem for Apple.
        Last edited by smitty3268; 06-02-2013, 05:08 PM.



        • #34
          Originally posted by tga.d View Post
          Just going to point it out since no one else has: People complaining about Haswell drawing more power don't know what they're talking about.
          The TDP is higher, but the TDP is *not* the total draw. If you were paying attention, you would have noticed that one of the biggest differences with Haswell is that it moves the voltage regulator on-die.
          http://hothardware.com/News/Haswell-...age-Regulator/
          As in, something extremely power-hungry that used to be on the motherboard is now on the CPU. Of course the TDP is going to go up. But TDP isn't power consumption. The power consumption of the hardware as a whole went down, as others have said, by as much as 50%.

          This is bad news for overclockers, since that's more heat (which used to be generated elsewhere) that you have to dissipate to keep the CPU running, but it's good for mobile.
          Correct, TDP has nothing directly to do with power consumption. TDP, as the acronym spells out, is Thermal Design Power: basically, how much heat the CPU puts out. And ~40-50 W of thermal energy in a tiny laptop case, particularly a MacBook Pro, whose cooling solution is particularly weak, is asking for some serious overheating issues.

          That all said, increased TDP usually goes hand in hand with increased power draw, and AnandTech, X-bit labs, and other such sites are all showing increased draw under load. So either you haven't paid any attention to the benchmarking sites or Intel really pulled one over on you.



          • #35
            Originally posted by smitty3268 View Post
            They'll save power by going with this instead of a discrete GPU. Yes, it will be more expensive, which is why I doubt many PC vendors will pick it up, but that's never been a problem for Apple.
            Here's something you may not have caught: the 13.3" MacBook Pro, even with Retina, doesn't have a dGPU. It uses straight-up Ivy Bridge (http://www.apple.com/why-mac/compare/notebooks.html), and a dual-core i5 or i7 at that. Now, if they can get away with that, would you care to explain why they would go with the 5200 as opposed to the 4600? Furthermore, when they're already having to use dual-core parts to stay within the thermal capacity of the design, why would they go with parts that dump that much more thermal energy?



            • #36
              Originally posted by Luke_Wolf View Post
              Correct, TDP has nothing directly to do with power consumption. TDP, as the acronym spells out, is Thermal Design Power: basically, how much heat the CPU puts out. And ~40-50 W of thermal energy in a tiny laptop case, particularly a MacBook Pro, whose cooling solution is particularly weak, is asking for some serious overheating issues.

              That all said, increased TDP usually goes hand in hand with increased power draw, and AnandTech, X-bit labs, and other such sites are all showing increased draw under load. So either you haven't paid any attention to the benchmarking sites or Intel really pulled one over on you.
              Tech Report's review has a page comparing what they call "task power" for an x264 encode. Haswell has a higher load draw, but it gets the job done sooner, so it uses less total energy to complete it.

              http://techreport.com/review/24879/i...ors-reviewed/7

              Recent x264 builds take advantage of AVX2 and FMA instructions, so the result may not apply broadly to all tasks, but it's still an interesting counterpoint to the "BUT THE LOAD POWER DRAW IS HIGHER!1!1!" complaints.
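
              If you want to see the arithmetic behind "higher draw, less energy", here's a minimal sketch; the power and time figures are invented for illustration, not Tech Report's measurements.

                # Energy-to-completion: a chip can draw MORE power under load
                # yet use LESS energy per task by finishing sooner.

                def task_energy_joules(load_watts, seconds):
                    """Energy for one job = average load power * run time."""
                    return load_watts * seconds

                old_chip = task_energy_joules(load_watts=95.0, seconds=120.0)
                new_chip = task_energy_joules(load_watts=105.0, seconds=100.0)

                print(f"old chip: {old_chip:.0f} J")  # 11400 J
                print(f"new chip: {new_chip:.0f} J")  # 10500 J
                # ~10% higher load draw, yet ~8% less energy for the same job,
                # because the faster chip races back to idle sooner.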



              • #37
                Originally posted by Luke_Wolf View Post
                Here's something you may not have caught: the 13.3" MacBook Pro, even with Retina, doesn't have a dGPU. It uses straight-up Ivy Bridge (http://www.apple.com/why-mac/compare/notebooks.html), and a dual-core i5 or i7 at that. Now, if they can get away with that, would you care to explain why they would go with the 5200 as opposed to the 4600? Furthermore, when they're already having to use dual-core parts to stay within the thermal capacity of the design, why would they go with parts that dump that much more thermal energy?
                Because Apple wants more GPU power. They would have used a discrete card instead of that Ivy Bridge HD 4000 if they could have, but they decided it would use too much power.

                With the 5200, they still won't get as much performance as a discrete card, but it won't take as much power either. The power/performance curve has shifted.

                I believe it is likely exactly what Apple was asking for - you don't think Intel did this without asking OEMs what they wanted, do you? I don't.

                That said, we'll see. Maybe it will just be for the all-in-one machines, or larger laptops.



                • #38
                  Originally posted by smitty3268 View Post
                  Because Apple wants more GPU power. They would have used a discrete card instead of that Ivy Bridge HD 4000 if they could have, but they decided it would use too much power.

                  With the 5200, they still won't get as much performance as a discrete card, but it won't take as much power either. The power/performance curve has shifted.

                  I believe it is likely exactly what Apple was asking for - you don't think Intel did this without asking OEMs what they wanted, do you? I don't.

                  That said, we'll see. Maybe it will just be for the all-in-one machines, or larger laptops.
                  Again, I believe the 5200 is purely for sites like AnandTech, not for consumers and not for OEMs. And sure, if Apple could have, they would have, but here's the problem: they didn't have the thermal headroom to put in a quad-core before, so why would they now? Unless you want to suggest Intel build a special processor line just for them that isn't available to anyone else, because none of the 5200 parts currently released are dual-core, and they run hotter and draw more power than the previous generation.



                  • #39
                    Originally posted by smitty3268 View Post
                    Because Apple wants more GPU power. They would have used a discrete card instead of that Ivy Bridge HD 4000 if they could have, but they decided it would use too much power.

                    With the 5200, they still won't get as much performance as a discrete card, but it won't take as much power either. The power/performance curve has shifted.

                    I believe it is likely exactly what Apple was asking for - you don't think Intel did this without asking OEMs what they wanted, do you? I don't.

                    That said, we'll see. Maybe it will just be for the all-in-one machines, or larger laptops.
                    I don't think we'll see it in Apple products except for the iMac, which is the perfect place for it. The smaller MacBooks simply can't take a quad-core: they are so thin that they can only fit small cooling solutions, and that heat dissipation doesn't work well with quad-core CPUs. There are high-end 14-inch quad-core machines from companies like Lenovo and HP, but they are thicker, with more internal space, and can therefore handle bigger cooling solutions. That's why the quad cores are found in the 15-inch Apple MacBooks.



                    • #40
                      Any test of Haswell is irrelevant without also testing it against the AMD A10-5800K and A10-6800K, and optionally the A8-3870K as well.

                      RAM speeds should of course include the recommended baseline speed and timings as well as overclocked states, to see whether Intel's GPU architecture scales as well as AMD's with increased memory bandwidth. Before you complain: 2x4GB of DDR3-2133 is dirt cheap, even with a default voltage as low as 1.5 V. It's not like I'd expect Larabel to kill his beer and beanbag-chair budget on DDR3-2800.
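
                      For anyone wondering why memory speed matters so much here, the theoretical peak is simple arithmetic; a sketch assuming a standard dual-channel, 64-bit-per-channel DDR3 setup:

                        # Peak DDR3 bandwidth = megatransfers/s * 8 bytes per
                        # 64-bit channel * channels (dual channel assumed).

                        def ddr3_peak_gbs(megatransfers, channels=2, bytes_per_xfer=8):
                            return megatransfers * 1e6 * bytes_per_xfer * channels / 1e9

                        for speed in (1333, 1600, 1866, 2133):
                            print(f"DDR3-{speed}: {ddr3_peak_gbs(speed):.1f} GB/s")
                        # DDR3-1333 -> 21.3 GB/s, DDR3-2133 -> 34.1 GB/s: a ~60% jump
                        # in the bandwidth an iGPU shares with the CPU, which is why
                        # scaling runs across memory speeds are worth doing.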
                      Last edited by Kivada; 06-04-2013, 04:23 AM.



                      • #41
                        Originally posted by TheLexMachine View Post
                        I don't think we'll see it in Apple products except for the iMac, which is the perfect place for it. The smaller MacBooks simply can't take a quad-core: they are so thin that they can only fit small cooling solutions, and that heat dissipation doesn't work well with quad-core CPUs. There are high-end 14-inch quad-core machines from companies like Lenovo and HP, but they are thicker, with more internal space, and can therefore handle bigger cooling solutions. That's why the quad cores are found in the 15-inch Apple MacBooks.
                        It does help that the MBPs use metal housings. I don't know if they do, but it wouldn't surprise me if Apple uses thermal pads on the underside of the mobo to transfer excess heat to the housing. That would likely only drop idle temps by around 1°C and load temps by around 3-4°C, but it can make all the difference in the world between having a functional computer and having a brick.



                        • #42
                          Originally posted by krasnoglaz View Post
                          "Photo of Haswell cpu in a motherboard with three phases and MOSFETs without cooling"

                          Good testing platform...
                          Actually, knowing Apple, they will likely get a custom version of the chip made for them; it worked for the GeForce GTX 680MX in the iMac. That's the fastest single mobile GPU in the Kepler family, but it only exists in the iMac, since it's too hot and power-hungry for even the most elaborate 17" laptops.

                          No, you can't get one for your laptop: while the iMac uses MXM-like cards, they are not standard and won't work in any standard MXM slot even if they physically fit.



                          • #43
                            Should combining the onboard Haswell graphics with a GTX 780 work fine in Ubuntu 13.10? Any problems to expect?
                            Thanks



                            • #44
                              Originally posted by mike4 View Post
                              Should combining the onboard Haswell graphics with a GTX 780 work fine in Ubuntu 13.10? Any problems to expect?
                              Thanks
                              Massive heat and power consumption problems. The results in the linked articles were with the Thermalright U120E cooler, not the stock cooler. Save your money and use a different, non-Haswell CPU.

                              Also, the GTX 780 is a very minor improvement over the HD 7970 yet costs $200 more.

                              So yeah, go with the latest Intel/Nvidia stuff if you have cash to burn on showing off your e-peen; even the HD 7970 only makes a real-world difference if you're driving multiple 1080p screens, otherwise the extra power is mostly wasted.
                              Last edited by Kivada; 06-04-2013, 09:13 AM.



                              • #45
                                http://www.xbitlabs.com/articles/cpu...-4770k_13.html

                                It doesn't even seem worth upgrading my five-year-old Core Duo... I only wanted to speed up FSX and X-Plane... bah... same as Win8, not usable. Sigh.

