W3C Publishes Working Draft For Web Neural Network API


  • W3C Publishes Working Draft For Web Neural Network API

    Phoronix: W3C Publishes Working Draft For Web Neural Network API

    The latest JavaScript API to see a public working draft out of the W3C is for a Web Neural Network API...

    https://www.phoronix.com/scan.php?pa...work-API-Draft

  • #2
    Do we really need this?
    Isn't this already possible by compiling an existing neural network library such as TensorFlow down to WebAssembly with Emscripten?

    Aren't feature creep and the ever-expanding API of web browsers a problem?



    • #3
      Originally posted by uid313 View Post
      Do we really need this?
      Isn't this already possible by compiling an existing neural network library such as TensorFlow down to WebAssembly with Emscripten?

      Aren't feature creep and the ever-expanding API of web browsers a problem?
      My guess is that WebAssembly simply isn’t efficient enough for this kind of bulk calculation.

      The computation required for ML is huge, and video/audio-related ML may require real-time processing.

      Also, the existing WebGL API is probably only usable for gaming, not for ML.

      So they decided to provide a native implementation that can take advantage of the GPU, or whatever accelerator (such as Google's TPU) comes along in the future, and use resources in the way most efficient for the native platform.
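      To give a sense of that bulk calculation, here is a minimal plain-JavaScript sketch of the per-layer arithmetic an inference pass repeats across many large layers (the function name and the toy sizes are illustrative only, not part of any proposed API):

```javascript
// One dense layer with ReLU activation:
// output[i] = max(0, sum_j weights[i][j] * input[j] + bias[i])
function denseRelu(weights, bias, input) {
  return weights.map((row, i) => {
    let sum = bias[i];
    for (let j = 0; j < row.length; j++) sum += row[j] * input[j];
    return Math.max(0, sum); // ReLU clamps negatives to zero
  });
}

// Toy 2x3 layer; real models repeat this over matrices with
// thousands of rows, many layers deep, often per video frame.
const weights = [[1, 0, -1], [0.5, 0.5, 0.5]];
const bias = [0, -1];
const output = denseRelu(weights, bias, [1, 2, 3]);
// row 0: relu(1 - 3 + 0) = 0, row 1: relu(0.5 + 1 + 1.5 - 1) = 2
```

      Running that many multiply-accumulates in JavaScript or WebAssembly keeps everything on the CPU, which is exactly the overhead a native API that can dispatch to a GPU or NPU is meant to avoid.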



      • #4
        Originally posted by uid313 View Post
        Do we really need this?
        Isn't this already possible by compiling an existing neural network library such as TensorFlow down to WebAssembly with Emscripten?

        Aren't feature creep and the ever-expanding API of web browsers a problem?
        Optimization.

        Remember that most programs written in WebAssembly run at only 50-70% of native speed.



        • #5
          Originally posted by uid313 View Post
          Isn't this already possible by compiling an existing neural network library such as TensorFlow down to WebAssembly with Emscripten?
          A lot of phone SoCs now have special-purpose AI accelerator engines that aren't programmable via anything like WebAsm or GPU APIs. So there needs to be special-purpose support if web apps are to use them. This also has the potential for better interoperability and efficiency than an ad hoc solution using either of those methods.

          As for whether web apps really need to use them, I like having the option to do something with a web app, rather than being forced to use a native app. I'm glad they're disabled by default, however, as I wouldn't like web pages to harness my device to do some random cloud inferencing workload for someone else.



          • #6
            Things like NoScript will just keep getting more and more important. The API surface of the web is huge and ever growing. Soon there will be nothing you can't do inside a web browser.

