PyTorch 2.6 Delivers FP16 Support For x86 CPUs, Better Intel GPU Experience

Written by Michael Larabel in Programming on 29 January 2025 at 12:49 PM EST.
PyTorch 2.6 is out today as the newest feature release of this widely-used machine learning library.

With PyTorch 2.6 there is now FP16 (Float16) support for x86 CPUs in both eager mode and the Inductor compile mode. On Intel Xeon 6 P-core "Granite Rapids" processors this means taking advantage of Float16 with the Advanced Matrix Extensions (AMX). In the prior PyTorch 2.5 release the Float16 CPU support was considered prototype-level; with PyTorch 2.6 it is promoted to beta-level, with better performance and its functionality verified across a range of workloads.
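
For those wanting to try it, below is a minimal, hypothetical sketch of what FP16 inference on an x86 CPU can look like in both eager mode and through the Inductor backend via torch.compile; the toy model and tensor shapes are illustrative only:

```python
import torch
import torch.nn as nn

# Hypothetical toy model purely for illustrating Float16 inference on the CPU.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 128))
model = model.half().eval()                    # cast weights to FP16
x = torch.randn(32, 512, dtype=torch.float16)  # FP16 input tensor on the CPU

with torch.no_grad():
    eager_out = model(x)               # eager-mode FP16 execution on x86

    compiled = torch.compile(model)    # Inductor is the default backend
    inductor_out = compiled(x)         # FP16 code paths generated by Inductor
```

Assuming a capable build, the AMX FP16 instructions on Granite Rapids should be dispatched automatically by the underlying CPU kernels rather than requiring any opt-in from user code.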

[Image: PyTorch 2.6 on Intel Granite Rapids and Arc Graphics]

Meanwhile appearing in prototype form with PyTorch 2.6 is an improved user experience on Intel graphics. Both Intel discrete and integrated GPUs see better support with PyTorch 2.6, especially on Microsoft Windows. There is an easier software setup experience, improved Windows binaries, and enhanced coverage of ATen operators on Intel GPUs with SYCL kernel implementations.
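
As a rough illustration of the prototype Intel GPU path, the sketch below assumes a system where PyTorch's "xpu" backend is available (for example an Arc discrete GPU with the Intel drivers installed) and simply falls back to the CPU otherwise; the matmul workload is just an example:

```python
import torch

# Use the Intel GPU ("xpu") device when available, otherwise fall back to CPU.
device = "xpu" if torch.xpu.is_available() else "cpu"

x = torch.randn(1024, 1024, device=device)
w = torch.randn(1024, 1024, device=device)
y = x @ w                              # dispatched to SYCL kernels on Intel GPUs

# torch.compile can also target the xpu device.
matmul = torch.compile(lambda a, b: a @ b)
y2 = matmul(x, w)
```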

PyTorch 2.6 also brings a number of enhancements to the PT2 compilation stack, FlexAttention support on x86 CPUs for large language models (LLMs), and various other additions.
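
FlexAttention is exposed through torch.nn.attention.flex_attention. As a hedged sketch of what the new x86 CPU support enables, the example below applies a causal score modification of the kind used for LLM decoding and compiles it with torch.compile; the tensor shapes and the causal helper are purely illustrative:

```python
import torch
from torch.nn.attention.flex_attention import flex_attention

# Causal masking expressed as a score modification: future positions get -inf.
def causal(score, batch, head, q_idx, kv_idx):
    return torch.where(q_idx >= kv_idx, score, -float("inf"))

B, H, S, D = 1, 8, 128, 64            # illustrative batch/head/sequence/dim sizes
q = torch.randn(B, H, S, D)           # CPU tensors by default
k = torch.randn(B, H, S, D)
v = torch.randn(B, H, S, D)

# Compiling flex_attention lets Inductor generate the fused CPU attention code.
flex_compiled = torch.compile(flex_attention)
out = flex_compiled(q, k, v, score_mod=causal)
```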

Downloads and more details on today's PyTorch 2.6 release via GitHub.