NVIDIA Tegra K1 Compared To AMD AM1 APUs
So the performance was pretty close for a lot of the tests. I believe the K1 has a lower power requirement (11 watts, I think), so that should put them pretty close, if not make the K1 perform better, on a per-watt basis versus the 5350. I think it all comes down to what you want to do. If you want to build a NAS, you'll want the AM1, because it has more SATA ports.
Comment
My experience using this board
Originally posted by riklaunim: I have X2. It works quite well as a desktop, but it's still not a good x86 CPU with a GPU that has drivers built for X.org rather than Android (Mali). http://openbenchmarking.org/result/1...FO-1305102FO57
So far, it looks like the Jetson TK1 smokes the X2, at least on spot checks: a C-Ray time of <120s vs 300s (2.2x faster), LAME encoding of 15.5s vs 79.11s (5x faster), etc. Of course, the sample size is limited due to trouble with porting the test suite. But so far, it looks pretty impressive. I've been using the Jetson TK1 as my primary desktop all weekend, with lots of tinkering.
I removed Unity and replaced it with KDE, where I can turn off compositing.
Performance has been impressive, outperforming mid-range laptops from just a few years ago. I saw a GLMark2 score of 320 at 1920x1080. While this doesn't have the same controls as the Phoronix Test Suite, that number is 3.5x the performance of the Celeron J1900 at 2.x GHz (which I believe has a higher power draw, too). Nexuiz at 1920x1080 with default detail settings ("normal") runs at 60-120fps (with perhaps an 80fps average) during single-player games. In the demo, it ran closer to 200fps.
So ... color me impressed. I am really most impressed by the quantity and quality of software that I can run on ARM! KDE? Yep, and it works quite well. Gimp? Who wudda thunk? Guvcview? Yup! The only really annoying absences are Flash, Chrome (Chromium works very well, though), and the Google Talk plugin. I haven't tried Skype yet, but I don't have high hopes... although I assume it runs on the Surface, so maybe...
I do hope nVidia bumps their driver from beta to "production" level. It is prone to occasional random hiccups under heavy load that kick me back to the display manager or freeze the display and require a reboot.
But all in all, a great start for an SoC which might just have a great future!
Comment
Originally posted by deppman: I haven't tried Skype yet, but I don't have high hopes... although I assume it runs on the Surface, so maybe...
In the same way, most FOSS software working on ARM is hardly surprising: gcc and most other popular compilers can target it, and since the source is available to everyone, recompilation is fairly trivial. The only things not easily recompilable should be those using x86 assembly, SSE, etc. directly (or assuming bytes are in the same order on all platforms!).
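The byte-order caveat is worth making concrete. A minimal sketch (purely illustrative, not from any code discussed in this thread) of why a program that assumes native byte order can break when recompiled for a differently-endian platform, and how spelling the order out avoids it:

```python
import struct

# Four raw bytes as they might arrive from a file or the network.
raw = b'\x01\x00\x00\x00'

# Non-portable: '=' means the host CPU's native byte order. On a
# little-endian machine (x86, and most ARM Linux systems) this is 1;
# on a big-endian machine it would be 16777216 instead.
native = struct.unpack('=I', raw)[0]

# Portable: make the byte order explicit, so the result is the same
# no matter which CPU runs the code.
little = struct.unpack('<I', raw)[0]   # always 1
big = struct.unpack('>I', raw)[0]      # always 16777216

print(native, little, big)
```

Most ARM Linux systems run little-endian like x86, which is one reason so much code recompiles cleanly; the explicit format characters just make that hidden assumption visible.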
Comment
Wow. Now it all comes down to how much power they both use. If they use the same amount of power, x86 is AMD's only real differentiator. If Nvidia uses more power, it fails.
I would like to see performance per watt and IPC for an A15 core and a Jaguar core.
Comment
Originally posted by Ferdinand: Wow. Now it all comes down to how much power they both use. If they use the same amount of power, x86 is AMD's only real differentiator. If Nvidia uses more power, it fails.
I would like to see performance per watt and IPC for an A15 core and a Jaguar core.
If I am reading this right, all the AM1 chips have a 25W TDP, so they aren't even in the same league. The fact that the K1 meets or beats these processors on many of the benchmarks is really quite impressive.
On Android, where Intel's code morphing has proven to eat batteries and kill performance for x86, the K1 looks like an easy win over Mullins. Of course, Mullins benefits from the x86 installed base on Linux and Windows platforms.
Last edited by deppman; 05 May 2014, 05:19 PM.
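The "not in the same league" claim can be put into rough numbers. A back-of-the-envelope sketch (my own arithmetic, using only the figures quoted in this thread: ~7W for the K1 at full tilt per the NVIDIA PDF, 25W TDP for the AM1 parts, and the article's rough finding of benchmark parity; TDP is a design ceiling, not measured draw, so this overstates the gap):

```python
# Rough performance-per-watt comparison from this thread's figures.
k1_power_w = 7.0     # K1 full-tilt AP+DRAM power, per the quoted PDF
am1_power_w = 25.0   # AM1 Kabini TDP (a ceiling, not measured draw)

relative_perf = 1.0  # assumption: roughly equal benchmark results

k1_perf_per_watt = relative_perf / k1_power_w
am1_perf_per_watt = relative_perf / am1_power_w

advantage = k1_perf_per_watt / am1_perf_per_watt
print(f"K1 perf/W advantage at equal performance: {advantage:.1f}x")
```

With these numbers the ratio is 25/7, about 3.6x; measured wall-socket figures for both chips would narrow it, since the 25W is a worst-case envelope.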
Comment
Originally posted by deppman: The K1 is looking like a 3.5W-ish chip under "normal" workloads, and a 7W-ish chip at full tilt, according to the published PDF. These estimates are from p. 13, looking at AP+DRAM. NVidia does claim that the actual numbers optimized for mobile will be lower, but I guess we'll see.
If I am reading this right, all the AM1 chips have a 25W TDP, so they aren't even in the same league. The fact that the K1 meets or beats these processors on many of the benchmarks is really quite impressive.
On Android, where Intel's code morphing has proven to eat batteries and kill performance for x86, the K1 looks like an easy win over Mullins. Of course, Mullins benefits from the x86 installed base on Linux and Windows platforms.
Comment
Originally posted by bridgman: IIRC the highest-power Mullins is 4.5W. Maybe you're thinking of Kabini; the AM1 chips that Michael is testing are Kabini, not Beema or Mullins.
Footnotes:
1. I wouldn't be surprised if the actual max is a bit higher, as AMD of late has been prone to "marketing over-zealousness", doing things like quoting boost frequencies instead of base clocks on graphics cards. Not that they are by any means alone in that kind of behavior.
Comment
Originally posted by deppman: Yes, I'm sorry. All the tested AMD chips appear to have a 25W TDP.
I read those values from the Asus utility; otherwise I don't know how to measure wattage for just the processor, since that sensor does not work on Linux.
Last edited by dungeon; 05 May 2014, 09:34 PM.
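For anyone wanting to try this on Linux: the kernel's hwmon interface sometimes exposes power sensors in sysfs, though, as noted above, many boards simply don't have a CPU-power sensor wired up, in which case nothing shows. A hedged little sketch (my own, not from the thread; paths and availability vary by board and driver) that just lists whatever power sensors the kernel exposes:

```python
import glob
import os

def list_power_sensors():
    """Return (chip_name, sensor_file, microwatts) tuples for any
    hwmon power sensors the kernel exposes; empty list on boards
    without one (which appears to be the case for this AM1 board)."""
    readings = []
    for power_file in glob.glob('/sys/class/hwmon/hwmon*/power*_input'):
        hwmon_dir = os.path.dirname(power_file)
        try:
            with open(os.path.join(hwmon_dir, 'name')) as f:
                chip = f.read().strip()
            with open(power_file) as f:
                microwatts = int(f.read().strip())
        except (OSError, ValueError):
            continue  # sensor unreadable; skip it
        readings.append((chip, os.path.basename(power_file), microwatts))
    return readings

if __name__ == '__main__':
    sensors = list_power_sensors()
    if not sensors:
        print("no hwmon power sensors exposed on this board")
    for chip, name, uw in sensors:
        # hwmon power values are reported in microwatts
        print(f"{chip} {name}: {uw / 1_000_000:.2f} W")
```

If this prints nothing, the board's sensor chip either has no power channel or no driver for it, and a vendor utility (or a wall meter) is the only option, which matches dungeon's experience.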
Comment
When the Athlon 5350 is overclocked (if someone is interested in that game) to 2600MHz, it uses 4-5W more; that utility says it maxes out at around 21W. So if I am reading that right (and assuming the utility is right), the 25W TDP is just a design maximum, and the Athlon 5350 alone actually never uses that much power.
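Those utility readings are at least self-consistent with the classic dynamic-power rule P ≈ C·V²·f. A rough sanity-check sketch (my own arithmetic, not from the thread; the 2.05GHz stock clock of the Athlon 5350 and an unchanged core voltage are my assumptions):

```python
# Frequency-only dynamic power scaling: P_new ≈ P_old * (f_new / f_old),
# valid when voltage stays fixed (from P ≈ C * V^2 * f).
stock_freq_mhz = 2050   # Athlon 5350 stock clock (assumption)
oc_freq_mhz = 2600      # overclock quoted in the thread

# If the utility's ~21 W max already includes the reported 4-5 W
# increase, stock max draw would be roughly 16-17 W; use the midpoint.
stock_power_w = 21.0 - 4.5

predicted_oc_power_w = stock_power_w * (oc_freq_mhz / stock_freq_mhz)
delta_w = predicted_oc_power_w - stock_power_w
print(f"predicted OC power: {predicted_oc_power_w:.1f} W (+{delta_w:.1f} W)")
```

The predicted increase lands within the 4-5W range reported above, so the numbers hang together; any voltage bump to stabilize the overclock would push the real delta higher, since power scales with the square of voltage.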
Comment