Habana Labs Opens Up The Code To Their AI Compiler, SynapseAI Core
Intel-owned Habana Labs now has the most open software stack among AI accelerators! While Habana Labs has long provided an open-source, upstream kernel driver for their Gaudi AI training and Goya AI inference accelerators, the user-space portions including their compiler and run-time library have been closed-source. This has been a thorn in the side of upstream kernel developers and their standards, but now Habana Labs has open-sourced their user-space components too.
Habana Labs has now open-sourced their Tensor Processing Core "TPC" LLVM compiler as well as SynapseAI Core as a reference implementation of their SynapseAI API. SynapseAI Core has all the functionality needed to run deep learning training workloads on Gaudi. However, Habana Labs does note that their closed-source library remains more optimized, at least for now.
Oded Gabbay of Habana Labs, who serves as their Linux kernel driver maintainer, went on to add, "It is important to note we provided all the necessary APIs to connect this library to any Deep Learning frameworks by writing appropriate backends in the frameworks and by writing more TPC kernels to implement the different operators."
The initial "v1.0" release of their TPC LLVM code compiler was just released this morning for their public code drop. Similarly there is also a v1.0 cut of their initial SynapseAI Core via GitHub.
Oded announced this open-source user-space achievement today to the upstream kernel developers.
Having a working open-source user-space (previously they offered just an open-source thunk library) is great news and should allow more of their Linux kernel changes to be upstreamed. Habana Labs had recently been taking a lot of heat over proposed changes to their open-source kernel driver that lacked any open-source user-space client, bypassing some of the upstream standards and requirements. Hopefully that will all be resolved moving forward thanks to this open-source user-space code.