Google Moves Ahead With Contributing The MLIR Machine Learning IR To LLVM
Back in April we wrote about MLIR as Google's new IR designed for machine learning. This intermediate representation was designed for use by any machine learning framework and now this common format is being contributed to LLVM.
As noted back then, LLVM founder Chris Lattner was among those at Google involved in the development of MLIR. As such, it was just a matter of time before this common IR for machine learning was ready to become part of LLVM.
In a Google blog post, Chris Lattner and TensorFlow product manager Tim Davis announced the contribution of this IR to the LLVM Foundation.
MLIR aims to be the new standard in ML infrastructure and comes with strong support from global hardware and software partners including AMD, ARM, Cerebras, Graphcore, Habana, IBM, Intel, Mediatek, NVIDIA, Qualcomm Technologies, Inc., SambaNova Systems, Samsung, Xiaomi, and Xilinx, together making up more than 95 percent of the world's data-center accelerator hardware, more than 4 billion mobile phones, and countless IoT devices. At Google, MLIR is being incorporated and used across all our server and mobile hardware efforts.
Machine learning has come a long way, but it's still incredibly early. With MLIR, AI will advance faster by empowering researchers to train and deploy models at larger scale, with more consistency, velocity and simplicity on different hardware. These innovations can then quickly make their way into products that you use every day and run smoothly on all the devices you have—ultimately leading to AI being more helpful and more useful to everyone on the planet.
More details at blog.google.