NVIDIA Tegra K1 Compared To AMD AM1 APUs


  • #31
    Originally posted by Ferdinand View Post
    How many games does Steam have that have an ARM version?
    Zero, I think, but it could be used as a streaming box in the living room. That would need Steam on Linux to support ARM itself, though.


    • #32
      Originally posted by tuke81 View Post
      Zero, I think, but it could be used as a streaming box in the living room. That would need Steam on Linux to support ARM itself, though.
      Yes, zero. But that could change. With Unreal Engine on Android/ARM (as well as a number of other engines), recompiling could be a comparatively small investment that could pay off big. A number of free games compile quite well - the toolchain is there, including OpenGL 4.4. So NVIDIA has built it; will they come?

      Also, remember this is a dev board, so it will stay Linux. But a commercial variant might not be so flexible. SteamOS seems like the best choice, but NVIDIA has definitely shown excellent Android support and even (shudder) Windows RT. They might even hedge their bets and release multiple flavors, or maybe all three (although I think that unlikely). Obviously this is all speculation, except for the commercial-variant part. There I got a pretty good tip, and the implication was that a Linux variant would ship.

      I do think they've done a bang-up job with Ubuntu, and I would definitely prefer that for a consumer box. But SteamOS certainly has buzz in the right community.


      • #33
        Need GPU benchmarks...

        I am dying to see the OpenGL/CL/CUDA benchmarks!


        • #34
          Originally posted by deppman View Post
          Yes, zero. But that could change. With Unreal Engine on Android/ARM (as well as a number of other engines), recompiling could be a comparatively small investment
          I think this is a bit too optimistic; it will take more than just recompiling.
          Also, I looked at the pictures: that thing has a fan? Wow. Okay, yes, relatively high clocks and four cores or so, but somehow ARM chips have always been "passive" to me.

          And working I/O is of course nice (but also expected); I was referring to the sheer amount of it, e.g. SATA ports and so on.

          But it's good to see that e.g. LibreOffice works there. My last ARM experience was a BeagleBoard-based AI Touchbook, which had OpenOffice 2 specially ported to it, but that was never really updated software-wise.

          Also, gaming: well, there are games for the Pandora, which IIRC is also based on some Beagle-something board, but "big" games will probably require good GPU drivers and demand a lot of GPU and CPU power, as well as hard disk/SSD space. ARM hasn't grown that far yet.


          On other software/drivers:
          > Oracle Java
          OMG! You poor soul.
          Drivers in the kernel are expected to just work. Though I admit some things can't be selected in the config on amd64 because they only make sense on x86 (UMC CPUs, ISA cards and such). Still, 90% should just work. HPLIP is userspace, so it's nice to see it was ported. What GPU driver is in use for that board?
          Last edited by Adarion; 05-07-2014, 01:13 PM.


          • #35
            Originally posted by Adarion View Post
            I think this is a bit too optimistic; it will take more than just recompiling ...

            Remember, I am referring to a commercial variant that would have the requisite I/O. Let me ask you this:

            If NVIDIA and Steam were to partner to put a $150-$200 box in the living room that plays the top Steam titles (with "more coming all the time"), do you think that would be a compelling alternative to the Xbone/PS4? I don't know the answer. What if Steam subsidized the cost and got it down to $99?

            Originally posted by Adarion View Post
            Also, gaming: well, there are games for the Pandora, which IIRC is also based on some Beagle-something board, but "big" games will probably require good GPU drivers and demand a lot of GPU and CPU power, as well as hard disk/SSD space. ARM hasn't grown that far yet.
            My initial impression is that its performance is comparable to the Intel HD 4400 graphics in my i7 laptop, which I think is one of the most popular GPUs in Steam's hardware survey. So Steam at 720p, or even 1080p for some games, seems likely. Also, IIRC this chip can be clocked up to 3GHz, so with some work performance could be even better, since on a set-top box we don't need to worry about drawing, say, 10W.


            Originally posted by Adarion View Post
            What GPU driver is in use for that board?
            It's a custom NVIDIA 310.x beta driver. Hopefully it will get aligned with the regular x86 releases. It freezes or restarts occasionally - I guess that's why they call it "beta". But performance, again, is quite nice.


            • #36
              More Reasons for Optimism

              Originally posted by Adarion View Post
              I think this is a bit too optimistic...
              A couple of reasons to be optimistic.

              1. The K1 beats Intel HD 4400 graphics, and probably beats HD 5500 too if allowed to reach 10W.
              2. Valve's Source engine has been ported to Android ARM Linux. It works well on a Tegra 4; expect it to work really well on a K1.


              • #37
                Originally posted by deppman View Post
                2. Valve's Source engine has been ported to Android ARM Linux. It works well on a Tegra 4; expect it to work really well on a K1.
                More: Serious Sam 3 runs at medium details on the K1, and Trine 2 runs on the K1 too.


                • #38
                  Originally posted by deppman View Post
                  A couple of reasons to be optimistic.

                  1. The K1 beats Intel HD 4400 graphics, and probably beats HD 5500 too if allowed to reach 10W.
                  2. Valve's Source engine has been ported to Android ARM Linux. It works well on a Tegra 4; expect it to work really well on a K1.
                  Nice! I hope Steam will become almost CPU-architecture independent.

                  Originally posted by deppman View Post
                  If NVIDIA and Steam were to partner to put a $150-$200 box in the living room that plays the top Steam titles (with "more coming all the time"), do you think that would be a compelling alternative to the Xbone/PS4? I don't know the answer. What if Steam subsidized the cost and got it down to $99?
                  Why would a K1 box cost more than $99? You don't need a battery, screen, speakers, touch, LTE, etc., which I thought were the main things that make a smartphone expensive.


                  • #39
                    Originally posted by deppman View Post
                    A couple of reasons to be optimistic.

                    1. The K1 beats Intel HD 4400 graphics, and probably beats HD 5500 too if allowed to reach 10W.
                    2. Valve's Source engine has been ported to Android ARM Linux. It works well on a Tegra 4; expect it to work really well on a K1.
                    Great point!

                    NV could make a K1 mini console with this SoC priced at probably $129, with an optional wireless controller at $39 each - almost Fire TV pricing, but more gaming-focused. It needs a SATA port for an SSD or HDD; a couple of mSATA ports would be great! If a K1 rev2 with 384 CUDA cores comes into the picture, that could pull $199 pricing.


                    • #40
                      Originally posted by deppman View Post
                      A couple of reasons to be optimistic.

                      1. The K1 beats Intel HD 4400 graphics, and probably beats HD 5500 too if allowed to reach 10W.
                      2. Valve's Source engine has been ported to Android ARM Linux. It works well on a Tegra 4; expect it to work really well on a K1.
                      And now more good news: it looks like the Source engine is going to be used to run Half-Life 2 on Android. How long before the entire Source library makes the jump? Portal 2 would be awesome!

                      I am almost convinced that the commercial device will be a Steam box; it plays to NV's strengths so well. I also expect NV to become the "Apple of Android": they already provide the best Android support outside of Google (or better, in some respects), and they have experience across the entire supply chain, from designing chips to supplier relationships. Therefore, I expect more NV devices like the Tegra Note and Shield. And the brand name carries a lot of power associated with "top-notch graphics and gaming".

                      Oh, FYI, the dev board does have a SATA connector.


                      • #41
                        Originally posted by deppman View Post
                        The K1 is looking like a 3.5W-ish chip under "normal" workloads and a 7W-ish chip at full tilt, according to the published PDF. These estimates are from p. 13, looking at AP+DRAM. NVIDIA does claim that the actual numbers optimized for mobile will be lower, but I guess we'll see.

                        If I am reading this right, all the AM1 chips have a 25W TDP, so they aren't even in the same league. The fact that the K1 meets or beats these processors on many of the benchmarks is really quite impressive.

                        On Android, where Intel's code morphing has proven to eat batteries and kill performance for x86, the K1 looks like an easy win over Mullins. Of course, Mullins benefits from the x86 installed base on Linux and Windows platforms.
                        I've yet to find any third-party power-draw benchmarks for the TK1. SemiAccurate thought NVIDIA was being deceptive (http://semiaccurate.com/2014/03/05/n...-number-watts/), and that it has a history of such behavior (http://semiaccurate.com/2014/01/20/w...eally-perform/). They estimated a power draw of around 35-40W for the board on display.
                        I'm really anxious for Michael to show us the power-draw numbers (assuming he has a Kill A Watt it's easy, but hopefully the firmware exports readings to /sys).
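                        If the firmware does export something, reading it would be trivial. A minimal sketch, assuming the board exposes a standard hwmon power sensor under sysfs (whether the TK1 firmware actually does is exactly the open question):

                            from pathlib import Path

                            # Standard hwmon sysfs layout; power*_input nodes report microwatts.
                            for node in Path("/sys/class/hwmon").glob("hwmon*/power1_input"):
                                name = (node.parent / "name").read_text().strip()
                                watts = int(node.read_text()) / 1_000_000
                                print(f"{name}: {watts:.2f} W")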


                        • #42
                          Charlie and Power Draw

                          Originally posted by liam View Post
                          I've yet to find any third-party power-draw benchmarks for the TK1. SemiAccurate thought NVIDIA was being deceptive (http://semiaccurate.com/2014/03/05/n...-number-watts/), and that it has a history of such behavior (http://semiaccurate.com/2014/01/20/w...eally-perform/). They estimated a power draw of around 35-40W for the board on display.
                          I'm really anxious for Michael to show us the power-draw numbers (assuming he has a Kill A Watt it's easy, but hopefully the firmware exports readings to /sys).
                          Charlie is embarrassingly wrong and needs to hand off writing about NVIDIA to someone more objective; his writings about NV are at best semi-lucid. The 30-40W estimate from his "analysis" is absolutely absurd - especially when you consider a GTX 750 runs on about the same amount of power. The TK1 has less than 1/3 the CUDA core count and is optimized for power efficiency.

                          What Charlie was looking at was a T3-powered automotive ECU development board. Since ECUs encounter much harsher conditions than consumer electronics, more robust cooling solutions are required. And justifying his conclusions because NV used an off-the-shelf power adapter instead of spending tens of thousands to build a custom one for demonstrations ... come on, man!

                          NVIDIA had these chips running all day in a 7" tablet chassis on numerous outings. If these chips were drawing 30-40W, the chassis would literally have melted (imagine putting a 30-40W light bulb in a 7" tablet chassis with no active cooling - can you say "fire hazard"?).

                          So I am willing to put these claims to the test. I am not an EE, but would a multimeter do to measure the power draw?


                          • #43
                            Originally posted by deppman View Post
                            Charlie is embarrassingly wrong and needs to hand off writing about NVIDIA to someone more objective; his writings about NV are at best semi-lucid. The 30-40W estimate from his "analysis" is absolutely absurd - especially when you consider a GTX 750 runs on about the same amount of power. The TK1 has less than 1/3 the CUDA core count and is optimized for power efficiency.
                            A GTX 750 uses at least 105 watts in Crysis 3.
                            Originally posted by deppman View Post
                            NVIDIA had these chips running all day in a 7" tablet chassis on numerous outings. If these chips were drawing 30-40W, the chassis would literally have melted (imagine putting a 30-40W light bulb in a 7" tablet chassis with no active cooling - can you say "fire hazard"?).
                            What Charlie said was that the benchmarks and the power measurements were not done at the same time, and maybe not even on the same hardware. So you have a tablet to show that it runs inside a tablet, and a motherboard with a big cooler to show that it is really fast in benchmarks. Phoronix has benchmarks of AMD's SoCs, and those computers use 30-40 watts.
                            Last edited by Ferdinand; 05-10-2014, 01:24 PM.


                            • #44
                              Originally posted by Ferdinand View Post
                              What Charlie said was that the benchmarks and the power measurements were not done at the same time, and maybe not even on the same hardware. So you have a tablet to show that it runs inside a tablet, and a motherboard with a big cooler to show that it is really fast in benchmarks. Phoronix has benchmarks of AMD's SoCs, and those computers use 30-40 watts.
                              There is no defense for his completely amateur analysis. Judging power draw from how warm an AC adapter gets? Pure speculation. If my Jetson TK1 is drawing 40 watts, I will eat my hat.

                              In any event, we will have all that settled tonight: I will be purchasing a multimeter and measuring volts and amps on either the DC or the AC line. Or both.
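                              The arithmetic itself is trivial - on the DC line, real power is just P = V x I, while on the AC line V x I only gives apparent power, since the real draw also depends on the adapter's power factor. A sketch with placeholder numbers (assumptions, not measurements):

                                  # Placeholder readings, purely to illustrate the math.
                                  volts_dc, amps_dc = 12.0, 1.5        # hypothetical DC-line readings
                                  print(f"DC power: {volts_dc * amps_dc:.1f} W")

                                  volts_ac, amps_ac = 120.0, 0.2       # hypothetical AC-line RMS readings
                                  power_factor = 0.6                   # assumed; small adapters are often poor here
                                  print(f"Apparent power: {volts_ac * amps_ac:.1f} VA")
                                  print(f"Real power at assumed PF: {volts_ac * amps_ac * power_factor:.1f} W")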


                              • #45
                                Use a "good brand" power usage monitor - it will tell you how much power the whole PC, including its power adapter (which is not 100% efficient), is actually drawing, and it's safe. My PC built around a 10W J1900 can use up to 37.5W at the wall (the record, during the Xonotic benchmark) with 1 SSD, two USB devices, two SODIMMs, the motherboard, and a power brick; in a fanless mesh case the CPU can reach ~65°C.
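                                Converting a wall reading back to the DC-side load is then just a multiplication by the brick's efficiency. A rough sketch (37.5W is my wall figure from above; the efficiency values are assumptions):

                                    wall_watts = 37.5                      # measured at the wall during the Xonotic run
                                    for efficiency in (0.80, 0.85, 0.90):  # assumed brick efficiencies
                                        print(f"{efficiency:.0%} brick -> ~{wall_watts * efficiency:.1f} W DC load")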
