Gneural Network: GNU Gets Into Programmable Neural Networks


  • Gneural Network: GNU Gets Into Programmable Neural Networks

    Phoronix: Gneural Network: GNU Gets Into Programmable Neural Networks

    The inaugural release of Gneural Network is now available, a new GNU Project to implement programmable neural networks...


  • #2
    Nice. We can also beat Lee Sedol with open source in the future!

    Comment


    • #3
      Wait, are they really using CVS to distribute source tars? What is this, 1990?

      Please, someone tell me there is a Git repo for this project.

      Comment


      • #4
        What are the best neural network libraries?

        Comment


        • #5
          That's great. I was writing my diploma project on neural networks and got stuck on the fact that the neural-network ecosystem is in an absolutely bad state. I must say, it is a surprising fact: if you search a bit about ANNs, you will find thousands of papers, articles, and blog posts about neural networks, as if everything were okay. Check, for example, that great fun with ANNs.

          The reality is: there are no working LSTM networks in Haskell. Okay, let's create bindings to another library; so what do we have in C or C++?
          • RNNLIB. The first library with LSTM, the reference implementation. It turns out nobody maintains it, and it depends on a particular version of Boost, and on a particular version of… GCC! Yeah, it won't compile with newer GCC releases.
          • Currennt. CUDA-accelerated code. I don't like CUDA for obvious reasons, and I don't even have an NVIDIA GPU, but I doubted the creators would have shipped the code without the ability to run on a CPU. And… its state is even worse than RNNLIB's! It requires the CUDA compiler just to build the library, and the CUDA compiler developers were probably either students, or saboteurs, or just idiots. What other explanation is there for every CUDA compiler being tied to a particular GCC version and refusing to work with a newer one (yeah, and the most upvoted answer in the link didn't work for me)?
          • CLSTM. It probably works; I didn't manage to compile it because of some dependencies that would require me to upgrade my system, which I have reasons not to do for the next few months.


          That's all I managed to find. All those «great articles and blog posts» you may find, every essay, and everything else, were written in… Python! Yeah, there's a bunch of source code and tutorials for Python, and it suits those little jobs very nicely, like a small one-off research project or a blog post. I do like some features of Python, like simultaneous variable updates, but I hope everyone understands that a language without static typing is going to turn into a maintenance hell once the project grows beyond a «little research». I don't mean to say anything against dynamic languages in general; I mean languages with a completely missing static type check. E.g. C# is a dynamic language, but it at least has the bare minimum that can be checked at compile time.
          Last edited by Hi-Angel; 12 March 2016, 02:47 PM. Reason: dynamic langs not that bad, that's not what I mean
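
          For anyone wondering what the «simultaneous variable updates» above refer to, here is a tiny Python illustration of tuple assignment (the variable names are purely made up for the example):

          # The right-hand side is evaluated as a tuple first, then unpacked,
          # so several variables can be updated in one step without temporaries.
          a, b = 1, 2
          a, b = b, a            # swap: a == 2, b == 1

          # The same pattern keeps iterative updates readable, e.g. Fibonacci:
          prev, cur = 0, 1
          for _ in range(10):
              prev, cur = cur, prev + cur
          print(a, b, cur)       # 2 1 89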

          Comment


          • #6
            What do you mean neural networks? Like the one in our brain? Wut?

            Comment


            • #7
              Originally posted by rabcor View Post
              What do you mean neural networks? Like the one in our brain? Wut?
              Well, originally based on animal brains.

              Comment


              • #8
                Deep Learning is where the majority of breakthroughs are being made at the moment. It's not clear whether this neural network library currently supports deep feed-forward networks and/or the various layers popular in the Deep Learning field (i.e. convolutional, ReLU, etc.).

                Another key problem with current Deep Learning libraries is that they almost exclusively support CUDA instead of OpenCL. I'm hopeful this new GNU library can support deep neural networks and LSTM with OpenCL and the CPU instead of CUDA.
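
                To make the jargon concrete: the layers mentioned above are mostly matrix multiplications followed by simple element-wise functions, and nothing stops them from running on a plain CPU. Below is a minimal NumPy sketch of a dense feed-forward layer with a ReLU activation; it is only an illustration of the idea, not code from Gneural Network or any of the libraries discussed here, and all names and sizes are made up:

                import numpy as np

                def relu(x):
                    # ReLU simply clamps negative values to zero.
                    return np.maximum(0.0, x)

                def dense(x, weights, bias):
                    # A fully connected ("dense") layer: y = W @ x + b.
                    return weights @ x + bias

                rng = np.random.default_rng(0)
                x = rng.normal(size=4)                           # a toy 4-feature input
                W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)    # hidden layer, 8 units
                W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)    # output layer, 3 units

                hidden = relu(dense(x, W1, b1))                  # hidden activations
                output = dense(hidden, W2, b2)                   # raw output scores
                print(output)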

                Comment


                • #9
                  I started writing a guide on neural networks because I really believe that many cool ideas are just not explained well enough to reach the widest possible understanding. I think a 15 year old could understand simple neural networks completely.

                  Anyway - I'm not advertising my book here. Instead, my blog has some interesting posts, such as:

                  * simple ways to boost performance - in this case to 98% for the handwritten number recognition data set with super simple ideas and code only http://makeyourownneuralnetwork.blog...rmance-to.html

                  * reversing a neural network - after training, ask it what the question (not the answer) should be, to get an insight into the "mind" of a neural network (a rough sketch of this back-query idea follows this list) http://makeyourownneuralnetwork.blog...l-network.html

                  * I've put out a free early draft with the section that explains the concepts, seeking feedback http://makeyourownneuralnetwork.blog...ck-wanted.html
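
                  A rough sketch of the back-query idea from the second bullet above, assuming a plain 3-layer network with sigmoid activations. This is only an illustration of the technique, not the blog's code, and the weight matrices here are random placeholders rather than trained ones:

                  import numpy as np

                  def logit(y):
                      # Inverse of the sigmoid; values are clipped away from 0 and 1.
                      y = np.clip(y, 0.01, 0.99)
                      return np.log(y / (1.0 - y))

                  def rescale(x):
                      # Squash an arbitrary vector back into (0.01, 0.99).
                      x = (x - x.min()) / (x.max() - x.min())
                      return 0.01 + 0.98 * x

                  def backquery(w_input_hidden, w_hidden_output, target):
                      # Run the network "backwards": undo the output sigmoid, push the
                      # signal through the transposed weights, and repeat for the hidden
                      # layer, ending with an image-shaped "idealised" input.
                      hidden = rescale(w_hidden_output.T @ logit(target))
                      return rescale(w_input_hidden.T @ logit(hidden))

                  rng = np.random.default_rng(0)
                  w_ih = rng.normal(scale=0.5, size=(100, 784))   # input -> hidden
                  w_ho = rng.normal(scale=0.5, size=(10, 100))    # hidden -> output
                  label = np.full(10, 0.01)
                  label[3] = 0.99                                 # "what does a 3 look like?"
                  print(backquery(w_ih, w_ho, label).shape)       # (784,) pseudo-image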


                  All the code is on github (GPLv2) https://github.com/makeyourownneural...nneuralnetwork


                  It's amazing - 30+ lines of simple, unoptimised (optimised for readability) Python code and you can achieve industry-leading performance learning to classify human handwritten numbers. You gotta love computer science!
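
                  For a flavour of what such a short network can look like, here is a minimal sketch in the same spirit: a 3-layer network with sigmoid activations, trained by backpropagation on one sample at a time. It is not the book's actual code; the layer sizes, learning rate, and names are placeholders, and real use would loop over the MNIST training set instead of a single fake image:

                  import numpy as np

                  class TinyNet:
                      def __init__(self, n_in, n_hidden, n_out, lr=0.1):
                          rng = np.random.default_rng(42)
                          self.w_ih = rng.normal(0.0, n_in ** -0.5, (n_hidden, n_in))
                          self.w_ho = rng.normal(0.0, n_hidden ** -0.5, (n_out, n_hidden))
                          self.lr = lr

                      @staticmethod
                      def sigmoid(x):
                          return 1.0 / (1.0 + np.exp(-x))

                      def query(self, inputs):
                          hidden = self.sigmoid(self.w_ih @ inputs)
                          return self.sigmoid(self.w_ho @ hidden), hidden

                      def train(self, inputs, targets):
                          # Forward pass, then backpropagate the error and nudge the weights.
                          outputs, hidden = self.query(inputs)
                          out_err = targets - outputs
                          hid_err = self.w_ho.T @ out_err
                          self.w_ho += self.lr * np.outer(out_err * outputs * (1 - outputs), hidden)
                          self.w_ih += self.lr * np.outer(hid_err * hidden * (1 - hidden), inputs)

                  # Toy usage: one fake 28x28 "image" and a one-hot-ish label for the digit 7.
                  net = TinyNet(784, 100, 10)
                  image = np.random.default_rng(0).random(784)
                  label = np.full(10, 0.01)
                  label[7] = 0.99
                  for _ in range(20):
                      net.train(image, label)
                  print(net.query(image)[0].argmax())   # should settle on 7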

                  Coming up soon will be a little diversion to explore complex-valued neural networks, an implementation in Julia, and also proof that you can get it working on a Raspberry Pi Zero!

                  Comment


                  • #10
                    Originally posted by Hi-Angel View Post
                    E.g. C# is a dynamic language, but it at least has the bare minimum that can be checked at compile time.
                    It really depends on what you mean by "dynamic," but I think it's a bit of a stretch to call C# dynamic. It is almost completely statically typed.

                    Comment
