MLIR Learning Resources
I'm very interested in diving into the weeds of ML compilers such as Modular and Triton. The overarching motivation -- beyond the fact that ML compilers are super-interesting -- is that in a world of increasing demand for ML training / inference but limited GPU (NVIDIA) supply, the ability to write code that is both performant and device-agnostic is ever more important.
Are you aware of any resources for learning MLIR incrementally, ideally building from the basics to something like a "toy" Modular / Triton compiler and, more ambitiously, to contributing new optimization passes to such compilers? I've walked through the official MLIR "toy" language tutorial and am looking for a hackable set of guides to bridge the gap between the basics and real-world use.
I realize it's not strictly necessary to understand MLIR -- as the purpose of Modular is to abstract away these details -- but I think it helps to understand how things work under the hood.
My background is in ML research / engineering, so I have limited experience with compilers (ML or otherwise), but I'm eager to learn so I can fully leverage languages such as Mojo, Triton, etc.
