Whoops, sorry about that Michael. I had 2 articles opened side by side and ended up looking at the one and commenting on the other. Apologies!
Originally Posted by Michael
So the performance was pretty close for a lot of the tests. I believe the K1 has a lower power requirement (11 watts, I think), so that should put them pretty close, if not make the K1 perform better on a per-watt basis relative to the 5350. I think it all comes down to what you want to do. If you want to build a NAS, you'll want the AM1 because it has more SATA ports.
My experience using this board
Originally Posted by riklaunim
So far, it looks like the Jetson TK1 smokes the X2, at least on spot checks: C-Ray in under 120s vs 300s (at least 2.5x faster), LAME encoding in 15.5s vs 79.11s (about 5x faster), etc. Of course, the sample size is limited due to trouble porting the test suite. But so far, it looks pretty impressive. I've been using the Jetson TK1 as my primary desktop all weekend with lots of tinkering.
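For a quick sanity check on those ratios (numbers taken straight from the post, not re-measured; the C-Ray figure is an upper bound on the time, so its ratio is a lower bound on the speedup):

```python
# Speedup ratios from the benchmark times quoted above
# (times in seconds; lower is better).
cray_tk1_max = 120.0   # Jetson TK1 finished C-Ray in under 120s
cray_x2 = 300.0        # X2 C-Ray time
lame_tk1 = 15.5        # Jetson TK1 LAME encode
lame_x2 = 79.11        # X2 LAME encode

print(f"C-Ray: at least {cray_x2 / cray_tk1_max:.1f}x faster")  # at least 2.5x
print(f"LAME:  about {lame_x2 / lame_tk1:.1f}x faster")         # ~5.1x
```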
I removed Unity and replaced it with KDE, where I can turn off compositing.
Performance has been impressive, outperforming mid-range laptops from just a few years ago. I saw a GLMark2 score of 320 at 1920x1080. While this doesn't have the same controls as the Phoronix Test Suite, that number is 3.5x the performance of the Celeron J1900 @2.x GHz (which I believe has a higher power draw, too). Nexuiz at 1920x1080 with default detail settings ("normal") runs at 60-120fps (with perhaps an 80fps average) during single-player games. On the demo, it ran closer to 200fps.
So ... color me impressed. I am really most impressed by the quantity and quality of software I can run on ARM! KDE? Yep, and it works quite well. Gimp? Who wudda thunk? Guvcview? Yup! The only really annoying absences are Flash, Chrome (Chromium works very well, though), and the Google Talk plugin. I haven't tried Skype yet, but I don't have high hopes... although I assume it runs on the Surface, so maybe...
I do hope nVidia bumps their driver from beta to "production" level. It is prone to occasional random hiccups under heavy load that kick me back to the DM, or freeze the display and require a reboot.
But all in all, a great start for an SoC which might just have a great future!
The source isn't available, and MS's Linux binaries are only for x86 (not even x86-64). Porting to WinRT is completely irrelevant.
Originally Posted by deppman
In the same way, most FOSS stuff working on ARM is hardly surprising: GCC and most other popular compilers can target it, and since the source is available to everyone, recompilation is fairly trivial. The only things not easily recompilable should be those using x86 assembly, SSE, etc. directly (or assuming bytes are in the same order on all platforms!).
Wow. Now it all comes down to how much power they both use. If they use the same amount of power, AMD's main differentiator is only x86. If Nvidia uses more power, it fails.
I would like to see performance per watt and IPC for an A15 core and a Jaguar core.
The K1 is looking like a 3.5W-ish chip under "normal" workloads and a 7W-ish chip at full tilt, according to the published PDF. These estimates are from p13, looking at AP+DRAM. NVidia does claim that the actual numbers optimized for mobile will be lower, but I guess we'll see.
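Performance per watt (as requested above) is simple arithmetic once you have a benchmark time and a power figure for each chip. A minimal sketch; the inputs below are placeholders for illustration, not measured values:

```python
# Performance per watt = (work / time) / power, i.e. work per joule.
def perf_per_watt(time_s: float, watts: float) -> float:
    """Score one benchmark run; higher is better."""
    return (1.0 / time_s) / watts

# Hypothetical example: chip A finishes a run in 20s drawing 7W,
# chip B finishes in 15s drawing 25W. B is faster, but A wins per watt.
a = perf_per_watt(time_s=20.0, watts=7.0)
b = perf_per_watt(time_s=15.0, watts=25.0)
print(f"A vs B perf/W: {a / b:.2f}x")  # 2.68x
```

Note that TDP is a design limit, not measured draw, so real comparisons should use wall-power measurements during the run.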
Originally Posted by Ferdinand
If I am reading this right, all the AM1 chips have a 25W TDP, so they aren't even in the same league. The fact that the K1 meets or beats these procs on many of the benchmarks is really quite impressive.
On Android, where Intel's code morphing has proven to eat batteries and kill performance for x86, the K1 looks like an easy win over Mullins. Of course, Mullins benefits from the x86 installed base for Linux and Windows platforms.
Last edited by deppman; 05-05-2014 at 06:19 PM.
IIRC the highest power Mullins is 4.5W. Maybe you're thinking about Kabini - the AM1 chips that Michael is testing are Kabini, not Beema or Mullins.
Originally Posted by deppman
Yes, I'm sorry. All the tested AMD chips appear to have a 25W TDP. I mentioned Mullins knowing it claims 4.5W, and I didn't mean to misrepresent it [1]. I was just, well, mulling over how Mullins might compete with the K1.
Originally Posted by bridgman
[1] I wouldn't be surprised if the actual max is a bit higher, as AMD of late has been prone to "marketing over-zealousness" by doing things like quoting boost frequencies instead of base clocks on graphics cards. Not that they are by any means alone in that kind of behavior.
Last edited by dungeon; 05-05-2014 at 10:34 PM.
When the Athlon 5350 is overclocked to 2600MHz (if anyone is interested in that game), it uses 4-5W more; that utility says the maximum is around 21W. So if I'm reading that right (and assuming the utility is right), the 25W TDP is just the design maximum, and the Athlon 5350 alone never actually uses that much power.
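The arithmetic in that observation can be checked directly from the figures quoted (a sketch using only the numbers in the post):

```python
# Power figures quoted above for the overclocked Athlon 5350.
tdp = 25          # AM1 design TDP (W)
oc_max = 21       # reported max draw at the 2600MHz overclock (W)
oc_delta = (4, 5) # extra watts attributed to the overclock

# Subtracting the overclock delta implies a stock max draw
# of roughly 16-17W, well under the 25W design limit.
stock_lo, stock_hi = oc_max - oc_delta[1], oc_max - oc_delta[0]
print(f"implied stock max draw: {stock_lo}-{stock_hi}W, "
      f"headroom under TDP even when overclocked: {tdp - oc_max}W")
```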