Intel NPU Library v1.2 Adds Int4 Support & Performance Optimizations


  • Intel NPU Library v1.2 Adds Int4 Support & Performance Optimizations

    Phoronix: Intel NPU Library v1.2 Adds Int4 Support & Performance Optimizations

    Intel released a new version of its NPU Acceleration Library, the user-space Python library for leveraging the Neural Processing Unit (NPU) found within their Core Ultra "Meteor Lake" laptops and upcoming Lunar Lake and Arrow Lake hardware as well...
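    The headline feature, Int4 support, refers to 4-bit weight quantization: weights are mapped to integers in [-8, 7] plus a scale factor, roughly halving memory traffic versus Int8. A minimal sketch of symmetric per-tensor int4 quantization in plain Python (illustrative only, not the library's actual implementation):

    ```python
    def quantize_int4(weights):
        """Symmetric per-tensor int4 quantization: map floats to [-8, 7]."""
        # 7 is the largest positive int4 value; guard against an all-zero tensor.
        scale = max(abs(w) for w in weights) / 7.0 or 1.0
        return [max(-8, min(7, round(w / scale))) for w in weights], scale

    def dequantize_int4(q, scale):
        """Recover approximate float weights from int4 codes."""
        return [v * scale for v in q]

    weights = [0.12, -0.5, 0.33, 0.7, -0.07]
    q, scale = quantize_int4(weights)
    approx = dequantize_int4(q, scale)  # close to the originals, within one scale step
    ```

    In the library itself this is exposed through its model-compilation entry point; check the project's README on GitHub for the exact dtype argument, since the API has been evolving between releases.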

  • #2
    It's weird to me to advertise performance improvements for a product that practically nobody has got their hands on. I know there are some 3rd party developers perhaps given engineering samples or just early release products but that's still a small group of people.

    Not that I'm complaining, it's just odd.


    • #3
      Originally posted by schmidtbag View Post
      It's weird to me to advertise performance improvements for a product that practically nobody has got their hands on. I know there are some 3rd party developers perhaps given engineering samples or just early release products but that's still a small group of people.

      Not that I'm complaining, it's just odd.
      Products already exist; at the same time there are already notebooks shipping with the Qualcomm AI Engine.

      Intel Meteor Lake notebooks are the first products on the market with an Intel NPU (formerly known as the Intel VPU). I really prefer the name VPU, whether as Versatile Processing Unit or the original Vision Processing Unit.


      • #4
        I noticed in the wccftech coverage of Battlemage that the Key Peak Metrics slide shows Int2 support in XMX.


        • #5
          Might be useful; waiting for onnxruntime to implement a backend for it.

          I wonder how it will work inside Docker?
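          On Linux the NPU is handled by the intel_vpu (ivpu) kernel driver, which exposes the device through the accel subsystem, so in principle container use comes down to passing the device node through. A hedged sketch (the device path, image name, and script are assumptions; verify the node with `ls /dev/accel` on the host):

          ```shell
          # Pass the NPU's accel device node into the container.
          # /dev/accel/accel0 is the typical node created by the intel_vpu driver;
          # "my-npu-image" and "run_inference.py" are hypothetical placeholders.
          docker run --rm \
            --device /dev/accel/accel0 \
            my-npu-image python3 run_inference.py
          ```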
