Intel Details Lakefield With Hybrid Technology


  • #11
    Originally posted by WolfpackN64 View Post

    Isn't this from the new Battlestar Galactica? I don't remember Cylon-Human hybrids from the Original BSG.
    Yes. The new one. The Hybrids were living navigation computers for their Basestars. I figured the new BSG would be pretty well known among this crowd.

    Comment


    • #12
      Originally posted by skeevy420 View Post
      Yes. The new one. The Hybrids were living navigation computers for their Basestars. I figured the new BSG would be pretty well known among this crowd.
      I was barely aware that there was a "new" BSG at all. Spank me daddy.

      Comment


      • #13
        Originally posted by starshipeleven View Post
        Also known as "big.LITTLE" in ARM design, and it's a pretty old concept as well.
        They'd have had it much earlier if they moved to newer processes earlier.

        This was from 2005: https://www.anandtech.com/show/1636/2
        (third picture named Evolutionary Configurable Architecture)

        starshipeleven I watched all of it. Very good show. You should watch it!

        Comment


        • #14
          Originally posted by DavidC1 View Post
          They'd have had it much earlier if they moved to newer processes earlier.

          This was from 2005: https://www.anandtech.com/show/1636/2
          (third picture named Evolutionary Configurable Architecture)
          I know they have always been aware of the concept; my point was that they never made a real product like that.

          ARM designs with big.LITTLE exist and have been in use in many mobile and embedded devices, even during Intel's (eventually failed) forays into mobile, when the Atom x5 and x7 were a thing (and would really have benefited a lot from an additional strong core to complement the 2-4 crappy ones).

          But no, Intel decided that just blatantly lying about the TDP of these chips was the better choice. As if no one would have figured that out pretty quickly in a battery-powered device.

          Same level of dumb choice as WD deciding to use drive-managed SMR drives in their WD Red NAS drives; within a few months they upset a LOT of people and the little game was figured out pretty quickly.

          starshipeleven I watched all of it. Very good show. You should watch it!
          Not my thing, especially the newer one. The whole human vs machine philosophy they rely on enrages me to no end.
          Last edited by starshipeleven; 10 June 2020, 05:46 PM.

          Comment


          • #15
            Originally posted by starshipeleven View Post
            But no, Intel decided that just blatantly lying about the TDP of these chips was the better choice. As if no one would have figured that out pretty quickly in a battery-powered device.
            I have a Bay Trail-based tablet, the Dell Venue 8 Pro. The experience was pretty good. I'd play games like SimCity 4 on the touchscreen; the challenge and the small form factor made it fun. SC4 is quite demanding on the CPU due to its isometric 3D graphics and lack of GPU acceleration. The battery would last 6-8 hours.

            Smartphones may have been an issue, but they got the thermals/battery life right on those chips for tablets.

            Intel had been screwing up for some time. The abrupt departure of the CEO in 2013 was a sign of serious internal struggles, and problems were likely brewing well before that.

            Not my thing, especially the newer one. The whole human vs machine philosophy they rely on enrages me to no end.
            Hmm, I'm not sure if you're thinking along the same lines, but I don't like it when movies/entertainment portray AI reaching human intelligence as a sure thing, or as something that happens just because there is enough compute power. You cannot create something you don't have a firm understanding of. Most of today's "AI" is doable only because of the massive amounts of data available.

            Comment


            • #16
              Originally posted by DavidC1 View Post
              Smartphones may have been an issue but they got the thermal/battery life right on those chips for Tablets.
              Their main target was smartphones, so yes, it was a big issue. They really tried to get on the Android bandwagon (kind of late, but at least not as late as Microsoft). Tablets are secondary in the mobile market since they sell much, much less (even iPad sales look poor compared to iPhones); you either get into smartphones or you are out of the mobile market completely.

              Intel lied about the TDP, so all the parts that were supposed to go into smartphones either had issues when put into smartphones or didn't sell. Intel tried to pivot into tablets and mini-PCs, but apart from a few select models that had half-decent internal storage, most devices got the lower end of eMMC, and Windows really ran like garbage on that.

              Hmm, I'm not sure if you're thinking along the same lines
              No, that's not the thing that enrages me. No offense, but I don't want to ruin something you like, so I'm not going into details. It's also off-topic.

              Comment


              • #17
                Originally posted by starshipeleven View Post
                Not my thing, especially the newer one. The whole human vs machine philosophy they rely on enrages me to no end.
                Uhh, not to spoil things, but it sounds like you never watched the later parts of the show. And large portions of it are far more interested in human-vs-human conflict than in the machines.
                Last edited by smitty3268; 10 June 2020, 08:59 PM.

                Comment


                • #18
                  Originally posted by programmerjake View Post
                  800MHz base clock? that's slow enough that it might be comparable to Libre-SOC's quadcore processor in multithreaded performance! lkcl
                  Core i5-1030G4 has a base clock of 700MHz. The power of 10nm+!

                  Comment


                  • #19
                    I don't understand one thing. How are they going to deal with ISA differences? Tremont cores most likely won't support AVX-class instructions, for example (and perhaps several more). Say an application has AVX compiled in; will the OS scheduler somehow know this and choose a core accordingly? Will developers have to support this explicitly in applications? How does ARM deal with it? (I guess they use one ISA per family.) Could someone explain this to me? Thanks.

                    Comment


                    • #20
                      Intel hybrid, a.k.a. twice as many (and different) bugs and vulnerabilities... Also pretty useless. The ULV CPUs were already low-power enough; I got 14h or so of battery life from an X1 Carbon: https://www.youtube.com/watch?v=0ZfnAsJmkzU and over 48h of suspend-to-RAM. I'd rather see them fix the architectural bugs and use the logic gates for something more useful...

                      Comment
