
Intel Announces Loihi 2, Lava Software Framework For Advancing Neuromorphic Computing


  • danmcgrew
    replied
    Originally posted by jaxa View Post

    Your neural network seems faulty. Get it fixed and then get back to us.

    Reading comprehension is not your strong suit, is it, Room-Temp-IQ?

    Now, follow along closely; get some three-year-old to read this for you, as you apparently need the help of someone with more skills than you have, and then see if you can answer the following, from the original post:

    "And would you mind telling us all---what, exactly and precisely, is this new "neuromorphic architecture"? And what it might be used for? And, most importantly: how does one go about designing a 'neuromorphic computer'?"

    Once again (not that it will make any difference):
    There are quite a few of us ---with Physics, Electrical Engineering, and Electrical and Computer Engineering degrees---who have been designing computers for many years, who have never even HEARD the term; but, then again, we are obviously nowhere near as brilliant, nor instantaneously up to date on whatever latest buzzword-technology comes from the mouths of PR-types, as you are.

    "Your enlightenment is eagerly awaited."

    ---not that it will make any difference. Asking for "enlightenment" from you is, quite obviously, a fool's errand.

    ...and, silly me; I almost forgot:

    ...IN WHAT DISCIPLINES DID YOU EARN ALL YOUR DEGREES?
    Last edited by danmcgrew; 04 October 2021, 09:31 AM.



  • jaxa
    replied
    Originally posted by danmcgrew View Post

    "Asking if [no; THINKING that]'Computers Can Think' is as relevant as asking if [no; THINKING that] 'Submarines Can Swim'."---Edsger W. Dijkstra

    And would you mind telling us all---what, exactly and precisely, is this new "neuromorphic architecture"? And what it might be used for? And, most importantly: how does one go about designing a 'neuromorphic computer'?
    There are quite a few of us who have been designing computers for many, many years who have never even HEARD the term; quite obviously we have been short-changing those who have been depending on us to deliver the latest state-of-the-art hardware.

    Your enlightenment is eagerly awaited; we, obviously, cannot wallow in our stupidity and backward ways any longer---to say nothing about asking for refunds for all the Physics, Electrical Engineering, and Electrical and Computer Engineering degrees.
    Your neural network seems faulty. Get it fixed and then get back to us.



  • danmcgrew
    replied
    Originally posted by jaxa View Post
    Will a sufficiently advanced neuromorphic architecture enable the creation of a thinking/sentient entity at some point, or is it just a way of doing standard machine learning with much better power efficiency (because of the spiking neural network model)?
    "Asking if [no; THINKING that]'Computers Can Think' is as relevant as asking if [no; THINKING that] 'Submarines Can Swim'."---Edsger W. Dijkstra

    And would you mind telling us all---what, exactly and precisely, is this new "neuromorphic architecture"? And what it might be used for? And, most importantly: how does one go about designing a 'neuromorphic computer'?
    There are quite a few of us who have been designing computers for many, many years who have never even HEARD the term; quite obviously we have been short-changing those who have been depending on us to deliver the latest state-of-the-art hardware.

    Your enlightenment is eagerly awaited; we, obviously, cannot wallow in our stupidity and backward ways any longer---to say nothing about asking for refunds for all the Physics, Electrical Engineering, and Electrical and Computer Engineering degrees.



  • danmcgrew
    replied
    Originally posted by tildearrow View Post
    Typo: Neuromorophic computing remains one of the hot areas for research at Intel Labs and eventually will work its way to commercialization.
    Leave "neuromorophic" alone, except for for replacing the "ph" with an "n".

    And then use the result as a very valid replacement for the idiotic term, "neuromorphic".



  • jaxa
    replied
    Will a sufficiently advanced neuromorphic architecture enable the creation of a thinking/sentient entity at some point, or is it just a way of doing standard machine learning with much better power efficiency (because of the spiking neural network model)?
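
    A rough sketch of the "spiking" idea behind this question may help readers who have not met the term: in a spiking neural network, a neuron such as a leaky integrate-and-fire (LIF) unit integrates its input and emits an event only when its membrane potential crosses a threshold, so most time steps produce no downstream work at all; that sparsity is where the claimed power-efficiency advantage comes from. The Python below is a generic, illustrative sketch only, not Intel's Lava API; every function name and parameter value here is an assumption made for the example.

    ```python
    # Minimal, illustrative leaky integrate-and-fire (LIF) neuron.
    # Not the Lava API; names and constants are assumptions for this sketch.
    import numpy as np

    def lif_neuron(input_current, dt=1.0, tau=20.0, v_threshold=1.0, v_reset=0.0):
        """Simulate one LIF neuron; return the membrane trace and spike train."""
        v = 0.0
        potentials, spikes = [], []
        for i in input_current:
            v += dt * (-v / tau + i)   # leak toward rest, then integrate the input
            if v >= v_threshold:       # emit a spike only on a threshold crossing
                spikes.append(1)
                v = v_reset            # reset the membrane potential after firing
            else:
                spikes.append(0)
            potentials.append(v)
        return np.array(potentials), np.array(spikes)

    # A weak, noisy input drives only occasional spikes, so most time steps
    # generate no downstream events: the sparsity that spiking hardware exploits.
    rng = np.random.default_rng(0)
    currents = 0.08 + 0.02 * rng.standard_normal(100)
    _, spike_train = lif_neuron(currents)
    print(f"spikes emitted: {int(spike_train.sum())} / {len(spike_train)} steps")
    ```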



  • boboviz
    replied
    Originally posted by partcyborg View Post
    Machine learning is used everywhere, and it has been a game changer enabling things like self-driving cars and image recognition, just to name a few. The broad-spectrum impact on science is already fairly profound, and it is still in its infancy.

    Just because you don't know what something is or how it works does not make it any less real.
    Just to name one: AlphaFold 2.



  • partcyborg
    replied
    Originally posted by danmcgrew View Post
    "...Neuromorophic computing remains one of the hot areas for research at Intel Labs and eventually will work its way to commercialization..."

    ...just like "quantum computing". Just like "artificial intelligence". Just like "machine learning".

    Just as with the case of "quantum computing", where the Artificial Intelligentia canNOT even define for you WHAT IT IS, let alone describe how to build such a machine---how, exactly, does one build a "neuromorphic computer"? And don't try to get around this question by telling me about all the software you've got ready to do...what?
    [Just as an aside, and to give you a hearty laugh: Phoronix's spell-checker, here, does not even recognize the word, "neuromorphic". Really a main-stream topic, huh?]

    Here's the real reason for "Intel's Research" coming up with yet another half-baked, can't-be-defined "technological deus-ex-machina": diversion. Deflection. They (just as the other purveyors of these magical solutions) are perilously close to having to actually demonstrate these "technologies" actually working and providing viable solutions which can not be achieved any other way.

    Got to keep coming up with these meaningless buzz-technologies to keep investor cash flowing, don't we?
    Machine learning is used everywhere, and it has been a game changer enabling things like self-driving cars and image recognition, just to name a few. The broad-spectrum impact on science is already fairly profound, and it is still in its infancy.

    Just because you don't know what something is or how it works does not make it any less real.



  • tildearrow
    replied
    Typo: Neuromorophic computing remains one of the hot areas for research at Intel Labs and eventually will work its way to commercialization.



  • danmcgrew
    replied
    "...Neuromorophic computing remains one of the hot areas for research at Intel Labs and eventually will work its way to commercialization..."

    ...just like "quantum computing". Just like "artificial intelligence". Just like "machine learning".

    Just as with the case of "quantum computing", where the Artificial Intelligentia canNOT even define for you WHAT IT IS, let alone describe how to build such a machine---how, exactly, does one build a "neuromorphic computer"? And don't try to get around this question by telling me about all the software you've got ready to do...what?
    [Just as an aside, and to give you a hearty laugh: Phoronix's spell-checker, here, does not even recognize the word, "neuromorphic". Really a main-stream topic, huh?]

    Here's the real reason for "Intel's Research" coming up with yet another half-baked, can't-be-defined "technological deus-ex-machina": diversion. Deflection. They (just as the other purveyors of these magical solutions) are perilously close to having to actually demonstrate these "technologies" actually working and providing viable solutions which can not be achieved any other way.

    Got to keep coming up with these meaningless buzz-technologies to keep investor cash flowing, don't we?
    Last edited by danmcgrew; 30 September 2021, 09:09 PM.



  • numacross
    replied
    Originally posted by uid313 View Post
    So how long until they make a PCI Express card full of neuromorphic processors?
    Or integrate a small one on an Intel x86 processor?
    Or put a small one on the SoC of a smartphone?
    Or put one on the GPU?
    Or put one in Intel Management Engine?

    Edit: actually they did announce a PCIe card.

