
Intel Announces Loihi 2, Lava Software Framework For Advancing Neuromorphic Computing


  • Intel Announces Loihi 2, Lava Software Framework For Advancing Neuromorphic Computing

    Phoronix: Intel Announces Loihi 2, Lava Software Framework For Advancing Neuromorphic Computing

    Intel has some new announcements around their neuromorphic computing research...


  • #2
    "Lava will allow researchers and application developers to build on each other’s progress and converge on a common set of tools, methods, and libraries. Lava runs seamlessly on heterogeneous architectures across conventional and neuromorphic processors, enabling cross-platform execution and interoperability with a variety of Artificial Intelligence, neuromorphic and robotics frameworks. Developers can begin building neuromorphic applications without access to specialized neuromorphic hardware and can contribute to the Lava code base, including porting it to run on other platforms."
    Make up your mind about the Oxford comma (hint: use it). The first sentence has one, the second doesn't; one of the commas in the second sentence should be either the word "while" or a semicolon; and the comma in the last sentence could go either way.

    Y'all might be thinking something like, "Why is this moron all up in arms about commas?" Well, the reason is simple: if they can't get the commas right in their press release, the first thing people will ever see of this project, did they get them right in their code? Two sentences right next to each other don't follow the same formatting standard. Billions of dollars and they can't hire a proofreader to catch the simplest of things, like consistency? Does that mean their code is inconsistent as well? Is that why they offer "you can port it for us" as a selling point?
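
    Setting the grammar aside, the one concrete claim in that quote, that developers "can begin building neuromorphic applications without access to specialized neuromorphic hardware", is at least testable: Lava ships a CPU simulation backend. Here's a minimal sketch of what that looks like, pieced together from the public Lava tutorials; the module paths, class names, and signatures below are my reading of those tutorials, not something verified against the release:

        import numpy as np

        # Module paths as shown in the public Lava tutorials (assumption).
        from lava.proc.lif.process import LIF
        from lava.proc.dense.process import Dense
        from lava.magma.core.run_conditions import RunSteps
        from lava.magma.core.run_configs import Loihi1SimCfg

        # Two leaky integrate-and-fire populations joined by dense synapses.
        pre = LIF(shape=(64,))
        conn = Dense(weights=np.random.rand(10, 64))
        post = LIF(shape=(10,))

        pre.s_out.connect(conn.s_in)   # spikes out of 'pre' feed the synapses
        conn.a_out.connect(post.a_in)  # synaptic current feeds 'post'

        # Loihi1SimCfg runs the whole graph on a plain CPU; no Loihi chip needed.
        post.run(condition=RunSteps(num_steps=100), run_cfg=Loihi1SimCfg())
        post.stop()

    The pitch, as far as I can tell, is that you later retarget the same process graph at real Loihi silicon by swapping the run config, which is the "cross-platform execution" the quote is selling.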

    This new intelligence ranges from recommendation systems, automated call centers, and gaming systems in the data center to autonomous vehicles and robots to more intuitive and predictive interfacing with our personal computing devices to smart city and road infrastructure that immediately responds to emergencies.
    One, that's just funny. Gaming systems in the data center. Explains a lot of the 7nm productivity issues.

    Two: this and that and the other and over yonder and... hire a proofreader. See, I tried to find where Michael copy/pasted the above from, and, well, it's the second sentence of the Loihi press release. I couldn't read past it due to a combination of laughter and some WTF-ness over that run-on chain.

    I came into that article trying to find a "floor is lava" joke. My apologies for the rant about grammar. I expect a Fortune 500 company to hold itself to a higher standard than that.

    As an American, I don't like the idea of a stop sign that'll call an ambulance. Is Intel gonna cover the cost of the ambulance ride their AI called for me?



    • #3
      Originally posted by skeevy420:

      I came into that article trying to find a "floor is lava" joke. My apologies for the rant about grammar. I expect a Fortune 500 company to hold itself to a higher standard than that.

      As an American, I don't like the idea of a stop sign that'll call an ambulance. Is Intel gonna cover the cost of the ambulance ride their AI called for me?
      This reminded me of the marketing fail earlier this year.



      PS: I'm usually the one who rants!



      • #4
        So how long till they make a PCI Express card full of neuromorphic processors?
        Or integrate a small one into an Intel x86 processor?
        Or put a small one on the SoC of a smartphone?
        Or put one on the GPU?



        • #5
          Originally posted by uid313:
          So how long till they make a PCI Express card full of neuromorphic processors?
          Or integrate a small one into an Intel x86 processor?
          Or put a small one on the SoC of a smartphone?
          Or put one on the GPU?
          Or put one in the Intel Management Engine?

          Edit: Actually, they did announce a PCIe card.



          • #6
            "...Neuromorophic computing remains one of the hot areas for research at Intel Labs and eventually will work its way to commercialization..."

            ...just like "quantum computing". Just like "artificial intelligence". Just like "machine learning".

            Just as with "quantum computing", where the Artificial Intelligentia canNOT even define for you WHAT IT IS, let alone describe how to build such a machine: how, exactly, does one build a "neuromorphic computer"? And don't try to get around the question by telling me about all the software you've got ready to do... what?
            [Just as an aside, and to give you a hearty laugh: Phoronix's spell-checker, here, does not even recognize the word "neuromorphic". Really a mainstream topic, huh?]

            Here's the real reason Intel's research keeps coming up with yet another half-baked, can't-be-defined technological deus ex machina: diversion. Deflection. They (just like the other purveyors of these magical solutions) are perilously close to having to actually demonstrate these "technologies" working and providing viable solutions that cannot be achieved any other way.

            Got to keep coming up with these meaningless buzz-technologies to keep investor cash flowing, don't we?
            Last edited by danmcgrew; 30 September 2021, 09:09 PM.



            • #7
              Typo: Neuromorophic computing remains one of the hot areas for research at Intel Labs and eventually will work its way to commercialization.



              • #8
                Originally posted by danmcgrew:
                "...Neuromorophic computing remains one of the hot areas for research at Intel Labs and eventually will work its way to commercialization..."

                ...just like "quantum computing". Just like "artificial intelligence". Just like "machine learning".

                Just as with "quantum computing", where the Artificial Intelligentia canNOT even define for you WHAT IT IS, let alone describe how to build such a machine: how, exactly, does one build a "neuromorphic computer"? And don't try to get around the question by telling me about all the software you've got ready to do... what?
                [Just as an aside, and to give you a hearty laugh: Phoronix's spell-checker, here, does not even recognize the word "neuromorphic". Really a mainstream topic, huh?]

                Here's the real reason Intel's research keeps coming up with yet another half-baked, can't-be-defined technological deus ex machina: diversion. Deflection. They (just like the other purveyors of these magical solutions) are perilously close to having to actually demonstrate these "technologies" working and providing viable solutions that cannot be achieved any other way.

                Got to keep coming up with these meaningless buzz-technologies to keep investor cash flowing, don't we?
                Machine learning is used everywhere, and it has been a game changer, enabling things like self-driving cars and image recognition, just to name a few. Its broad-spectrum impact on science is already fairly profound, and it is still in its infancy.

                Just because you don't know what something is or how it works doesn't make it any less real.



                • #9
                  Originally posted by partcyborg:
                  Machine learning is used everywhere, and it has been a game changer, enabling things like self-driving cars and image recognition, just to name a few. Its broad-spectrum impact on science is already fairly profound, and it is still in its infancy.

                  Just because you don't know what something is or how it works doesn't make it any less real.
                  To name just one: AlphaFold 2.



                  • #10
                    Will a sufficiently advanced neuromorphic architecture enable the creation of a thinking/sentient entity at some point, or is it just a way of doing standard machine learning with much better power efficiency (because of the spiking neural network model)?
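
                    For anyone wondering what "spiking" actually buys: in a spiking neural network, a neuron integrates weighted input over time and only emits a binary spike when its membrane potential crosses a threshold, so between spikes there is in principle no work to do, which is where the power-efficiency argument comes from. A toy leaky integrate-and-fire update in plain NumPy, purely illustrative (parameter values are arbitrary, and this is not Lava's or Loihi's actual implementation):

                        import numpy as np

                        rng = np.random.default_rng(0)

                        def lif_step(v, spikes_in, weights, leak=0.9, v_th=1.0):
                            # Leak the membrane potential, then integrate weighted input spikes.
                            v = leak * v + weights @ spikes_in
                            # Neurons that cross the threshold fire and reset to zero.
                            fired = v >= v_th
                            v = np.where(fired, 0.0, v)
                            return v, fired

                        # 3 inputs driving 2 LIF neurons with sparse random input spikes.
                        weights = rng.random((2, 3))
                        v = np.zeros(2)
                        for t in range(20):
                            spikes_in = (rng.random(3) < 0.3).astype(float)
                            v, fired = lif_step(v, spikes_in, weights)
                            if fired.any():
                                print(f"t={t}: neuron(s) {np.flatnonzero(fired)} spiked")

                    Whether that model scales up to anything you could call "thinking" is exactly the open question; for now it is mostly pitched as the efficiency play.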

