Intel Core i7 4790K: Devil's Canyon Benchmarks On Ubuntu Linux


  • curaga
    replied
    Glibc has had some work to use transactions (TSX-based lock elision for pthread mutexes).
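    For context, a minimal sketch of what using TSX directly looks like via GCC's RTM intrinsics (build with -mrtm). This is not glibc's code; the names are made up, and a real program would first check CPUID for RTM support before calling _xbegin():

    #include <immintrin.h>   /* _xbegin / _xend / _xabort */
    #include <stdatomic.h>

    /* Illustrative names only: a shared counter protected either by a
     * hardware transaction or, when that aborts, by a plain spinlock. */
    static atomic_int fallback_lock;   /* 0 = free, 1 = held */
    static long shared_counter;

    void increment(void)
    {
        unsigned status = _xbegin();          /* try to start a transaction */
        if (status == _XBEGIN_STARTED) {
            if (atomic_load(&fallback_lock))  /* lock joins our read set...   */
                _xabort(0xff);                /* ...abort if someone holds it */
            shared_counter++;                 /* executes transactionally     */
            _xend();                          /* commit                       */
            return;
        }
        /* Aborted (conflict, capacity, interrupt, ...): take the fallback lock. */
        while (atomic_exchange(&fallback_lock, 1))
            ;                                 /* spin until the lock is free  */
        shared_counter++;
        atomic_store(&fallback_lock, 0);
    }

    As far as I know, glibc's lock elision does roughly this transparently inside pthread_mutex_lock() when elision is enabled, so ordinary pthread programs can benefit without source changes.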



  • user82
    replied
    Does any software in the Linux world already take advantage of the multithreading enhancement TSX?
    After a quick search it looks like the kernel could use it, but I'm not sure.



  • chrisb
    replied
    Originally posted by yonux View Post
    You're right, but what other part of the computer is drawing power during these benchmarks?
    I don't think the graphics card suddenly draws power and then stops...
    We definitely need more information about this consumption.
    Every part uses power: motherboard, power supply, north and south bridge, RAM, SSD. http://johnsokol.blogspot.co.uk/2008...wer-usage.html

    I don't have the numbers in front of me, but if I recall, for a typical P4 3 GHz system we saw total average power consumption of something like 75 watts when idle and 150 watts or more under load. With ACPI enabled it would drop considerably when idle. (I am not including the monitors, which also draw 75 watts or so.)

    Again, we were more interested in where heat was generated, so we measured power dissipation per component, which for all practical purposes is directly equivalent to watts used.

    We found that the watts for a component were quite different from the watts on the power line.

    Why? Because in this breakdown almost 40 to 50% of the power was lost in the PC's power supplies: both the main supply and the motherboard's on-board regulators that feed the CPU and chipset.
    This was very high, since most PC power supplies were only about 60% efficient!
    So every watt of load inside the PC showed up as almost 2x on the 110-volt power line.

    So of the peak 150 W coming in, what's left after being stepped down is around 80 W.

    Hard drive: 12 watts, assuming one 80 GB Maxtor DiamondMax.
    North and south bridge: 1 to 6 watts.
    Support chips: about 1, maybe 2 watts; things like the NIC and other support components were insignificant.

    CPU: anywhere from 20 watts to 100 watts depending on its load.
    Running something like CPUburn or a CPU stress test would max out the CPU's power draw; again, with the power supply's low efficiency, an 80-watt increase in CPU power use results in a 160-watt increase on the 110 V power line! We did not expect this when we started.

    If you add a high-end graphics card (Nvidia/ATI), add on another 40 watts, times two, so 80 watts on the power line.

    Another interesting thing was 10 watts for the fans!

    Here is another unexpected result: the hotter the system ran, the more power each component drew. This could add another 10% or so, so a cold system, like one just after power-up, uses less than a hot one.
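    For what it's worth, a back-of-the-envelope sketch in C of that wall-power arithmetic (not from the quoted write-up; the 80 W delta and the efficiency figures are just the ones mentioned above):

    #include <stdio.h>

    /* Extra power drawn at the wall for extra power drawn inside the case,
     * given the power supply's efficiency (0.0 - 1.0). */
    static double wall_watts(double component_watts, double psu_efficiency)
    {
        return component_watts / psu_efficiency;
    }

    int main(void)
    {
        double cpu_delta = 80.0;  /* 80 W more CPU draw under a stress test */

        printf("50%% efficient PSU: %.0f W extra at the wall\n",
               wall_watts(cpu_delta, 0.50));  /* 160 W - the "almost 2x" case */
        printf("60%% efficient PSU: %.0f W extra at the wall\n",
               wall_watts(cpu_delta, 0.60));  /* about 133 W */
        return 0;
    }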



  • Luke
    replied
    For many uses vPro is actually DANGEROUS

    Originally posted by const View Post
    I wonder why Intel removed two security technologies from the i7-4790K (even compared to the i7-4790, not to mention their older CPUs), namely:

    * "Intel? vPro Technology":



    "Trusted Execution Technology":
    vPro allows remote access to the machine's firmware, below the OS. I don't know if it still does, but during the Core 2 era this required the enterprise chipset and Intel's networking. It could be disabled on such a machine by using an add-in network card and not connecting the Intel network port.

    vPro is meant for remote administration, but it also offers attack surface for an under-the-OS keylogger, encryption-key exporter, or rootkit. It is also a closed firmware application, meaning there is no way to verify that Microsoft or the NSA don't have access to it, unless (maybe) known and proven targets of the NSA care to run a behavioral test and repeat it many, many times.

    Leaving aside NSA or MS backdoors, a remote administration tool that is not being used is a lot of unnecessary attack surface and a totally unacceptable risk, no matter where it came from. If Coreboot came up with a remote administration module, I'm sure they would advise installing it only when it is actually going to be used.

    Trusted Execution Technology is for specialized "trusted computing" applications; you would NOT want Intel to be able to prevent your video player from running because you had installed a video card for which a DRM-cracking driver was available! Both of these tools are essentially for enterprise machines, and they are not only irrelevant but undesirable for standalone desktops.

    If you use vPro yourself, I would advise two LANs: one used only for vPro, not connected to the Internet if remote access can go over a dedicated line, otherwise connected only through some kind of firewall server reachable only by an SSH login with pre-shared keys; and a second network, on a different, non-vPro network card, for normal network activity.



  • yonux
    replied
    You're right, but what other part of the computer is drawing power during these benchmarks?
    I don't think the graphics card suddenly draws power and then stops...
    We definitely need more information about this consumption.



  • chrisb
    replied
    Originally posted by yonux View Post
    "215 Watts" !!!!!

    I read the article because I work in HPC and I'm always mindful of CPU performance, but wait: 215 Watts!
    How does Intel think we can cool a chip consuming 215 Watts? This is incredible.
    You are quoting total system power, not CPU TDP. From the very first page: the Core i7 4790K has an 88 Watt TDP versus 84 Watts for the Core i7 4770K.



  • grigi
    replied
    If Michael just ran a simple "time" command for each benchmark, couldn't you determine approximately how many cores it uses and how much the benchmark is affected by I/O?
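    For what it's worth, a rough C sketch of what "time" exposes (the benchmark path below is just a placeholder): compare the child's CPU time to wall-clock time. user+sys near N times real suggests roughly N busy cores; user+sys well below real suggests the run spent most of its time waiting on I/O.

    #include <stdio.h>
    #include <sys/resource.h>
    #include <sys/wait.h>
    #include <time.h>
    #include <unistd.h>

    int main(void)
    {
        char *argv[] = { "./some-benchmark", NULL };   /* placeholder command */
        struct timespec t0, t1;
        struct rusage ru;
        int status;

        clock_gettime(CLOCK_MONOTONIC, &t0);
        pid_t pid = fork();
        if (pid == 0) {
            execvp(argv[0], argv);
            _exit(127);                                /* exec failed */
        }
        waitpid(pid, &status, 0);
        clock_gettime(CLOCK_MONOTONIC, &t1);
        getrusage(RUSAGE_CHILDREN, &ru);               /* child's user/sys time */

        double real = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        double cpu  = ru.ru_utime.tv_sec + ru.ru_utime.tv_usec / 1e6
                    + ru.ru_stime.tv_sec + ru.ru_stime.tv_usec / 1e6;

        printf("real %.2f s, cpu %.2f s, approx. busy cores %.1f\n",
               real, cpu, cpu / real);
        return 0;
    }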



  • vmicho
    replied
    Originally posted by Rexilion View Post
    It would be nice if a test graph indicated whether the test is single-threaded or multi-threaded. I'm still pondering whether to purchase my first Intel chip because of its superior single-core performance over AMD.

    It's really, really sad that AMD has stopped its FX line. I'm not *that* keen on the APU chips.
    Exactly. I'd like to see some tests marked explicitly as single- or multi-core. I'd also like to see some more single-core tests (I suppose one test here is probably single-core, but I'm not sure). Single-core performance still matters a lot, even in 2014.



  • nightmarex
    replied
    Originally posted by Luke View Post
    It shows that the Intel processors able to defeat the AMD FX processors in video editing (the benchmark that matters for my systems) are the ones that cost $100 more for the CPU. My older FX-8120 overclocked to 4.4 GHz is probably a bit slower than an FX-8350 at the same clocks, but when I first got it, Ivy Bridge wasn't out yet either. Only the Core i7s beat the Piledriver in your libx264 test, and interestingly the same was true for Linux kernel compilation: about a minute for a job that seemed to take all night with an Athlon 500 MHz I had in 2004.

    True, the TDP is less for the Intels, but I suspect that idle/desktop power dissipation can't differ by more than 10 W or so. It doesn't really matter that the overclocked AMD might pull 200 W for 10 minutes rendering a video if it spends most of its time in the 50 W area and can drop into the low 40s when totally idle at the desktop. These tests tell me that buying an i5 setup to replace Bulldozer would always be money down the toilet if video editing is the intended function.

    Too bad AMD is dumping the FX line, though I suspect that at the rate they are going the Fusion chips will soon catch up, thanks to improvements in things like reducing branch-predictor misses. They can already match the Phenom II X4 from what I understand, and that's fast enough for most of my work.
    I may be missing something, but I don't see any architecture-specific optimization flags for those tests, even though the 8350 would likely still get beaten. I know what you mean; I wish AMD would come out with a monster again.



  • dungeon
    replied
    Originally posted by yonux View Post
    "215 Watts" !!!!!
    I read the article because I work in HPC and I'm always mindful of CPU performance, but wait: 215 Watts!
    How does Intel think we can cool a chip consuming 215 Watts? This is incredible.
    I will remember that: Devil's Canyon, 215 W.

