Toshiba Looks To Upstream DNN Image Processing Accelerator Driver For Their Visconti SoC
Toshiba's Visconti SoC provides an optimized image recognition processor geared toward advanced driver assistance system (ADAS) solutions for automobiles and similar modern use-cases. Toshiba engineers are now publishing patches for their DNN image processing accelerator driver with hopes of getting the code upstreamed into the mainline Linux kernel.
Toshiba has begun posting their DNN image processing accelerator driver code for the Visconti SoC with an eye toward upstreaming it into the Linux kernel. The Visconti DNN kernel driver in its initial form is just 1.8k lines of code.
As of writing, though, I haven't yet seen any open-source user-space patches for making use of this driver for DNN inference workloads. The kernel documentation does mention an offline compiler for generating binaries from Caffe/ONNX-compatible CNN models as well as a "libdrvutil" user-space library for building a descriptor for neural network inference. So we'll see if this all comes out as open-source, since without an open user-space this driver wouldn't be accepted into the mainline kernel.
This driver is currently geared for their Visconti 5 hardware IP. Besides their in-house DNN accelerator, the Visconti 5 SoC makes use of Cortex-A53 and Cortex-R4 cores.
In any event, this Toshiba Visconti DNN driver is now yet another accelerator looking toward the mainline kernel. The current code for this Toshiba DNN image processing accelerator driver can be found for review on the kernel mailing list.