FreeTensor: A Free-Form DSL with Holistic Optimizations for Irregular Tensor Programs

Abstract

Tensor programs are of critical use in many domains. Existing frameworks, such as PyTorch, TensorFlow, and JAX, adopt operator-based programming to ease programming, increase performance, and perform automatic differentiation. However, with the rapid development of tensor programs, operator-based programming shows significant limitations for irregular patterns, since it introduces a large amount of redundant computation or memory access. In this work, we propose FreeTensor, a free-form domain-specific language that supports redundancy-avoiding programming by introducing fine-grained control flow. With optimizations including partial evaluation, dependence-aware transformations, and fine-grained automatic differentiation, FreeTensor is able to generate high-performance tensor programs on both CPU and GPU. Experiments show speedups over existing tensor programming frameworks of up to 5.10× (2.08× on average) without differentiation, and up to 127.74× (36.26× on average) after differentiation, for typical irregular tensor programs. Source code is available at https://github.com/roastduck/FreeTensor.
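To make the redundancy concrete, below is a minimal sketch in plain PyTorch (not FreeTensor code; the sizes and the per-edge scoring task are hypothetical). For an irregular pattern such as scoring only a sparse set of index pairs, an operator-based formulation materializes a dense n×n intermediate and then gathers the few entries it needs; this is the kind of redundant computation and memory access that fine-grained control flow avoids.

    import torch

    n, d, num_edges = 1024, 64, 500
    x = torch.randn(n, d)
    # Irregular pattern: scores are needed only for a sparse set of pairs.
    edges = torch.randint(0, n, (num_edges, 2))

    # Operator-based formulation: compute all n*n pairwise scores
    # (O(n^2 * d) work plus an n*n intermediate), then keep only num_edges.
    dense = x @ x.T
    scores_dense = dense[edges[:, 0], edges[:, 1]]

    # Computing only the needed pairs costs O(num_edges * d). A free-form
    # DSL like FreeTensor lets such fine-grained per-element computation be
    # written as ordinary loops while still being optimized and differentiated.
    scores_sparse = (x[edges[:, 0]] * x[edges[:, 1]]).sum(dim=-1)

    assert torch.allclose(scores_dense, scores_sparse, atol=1e-4)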

Publication
Proceedings of the 43rd ACM SIGPLAN International Conference on Programming Language Design and Implementation (PLDI ’22)
Shizhi Tang
Ph.D. Student
Jidong Zhai
Associate Professor
(Special Research Fellow, Doctoral Advisor)
Liyan Zheng
Ph.D. Student

Chen Zhang
Ph.D. Student