Mojo for heterogeneous compute
Hello! The Mojo slides (from the LLVM Dev Conf '23) talk about managing heterogeneous compute. Does that mean the runtime can distribute the load across heterogeneous compute elements? My question is not from an AI/ML angle, so it's not just about GPUs but also multi-core ASICs with a variety of cores. Thank you in advance
It means you can compile Mojo to run on CPU, GPU, and any exotic accelerator that comes along in the future.
And yes, you could create a runtime that distributes work among these; Mojo itself does not ship such a runtime.
Thank you @Melody Daniel . Does that mean I can have a single source, and Mojo lowers it to IR which then needs to be further lowered for a specific processing element by some other backend (like LLVM)? Something like the way SYCL does it?
Where can I find more information on this? Thank you in advance
Yeah, Mojo compiles into MLIR, which then handles compiling down to the metal. MLIR docs/videos would be the thing to check out, but there's not a ton of info out there because MLIR is still relatively new. Mojo is the first language built on top of it.
Indeed. Mojo will ship with a compiler that allows you to compile for CPU and GPU; the idea is that other hardware makers can plug into MLIR and use Mojo to lower to their own hardware.
I'm trying to understand what would be required from the hardware side to adopt Mojo. What would this lowering amount to for a custom silicon vendor? Does the runtime need to be designed by the silicon vendor in this case, or is there something from Modular we can refer to? It appears to me that all of this will come later with Mojo, but I'd like to get some idea if possible.
I'm guessing an MLIR dialect for your hardware is the place to start. It's easy to create Mojo types that wrap around this dialect
Thank you Kennit. Is it possible for me to have a look at some example or sample code that can guide me?
Low-level IR in Mojo | Modular Docs
Learn how to use low-level primitives to define your own boolean type in Mojo.
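To give a flavor of what that doc covers, here is a rough sketch in its spirit (treat it as illustrative rather than authoritative; the exact spelling of these primitives has changed across Mojo releases): a custom boolean type whose storage is a raw MLIR `i1` value, initialized by calling an op from MLIR's `index` dialect directly.

```mojo
# Sketch of a custom Bool-like type built on MLIR primitives, in the
# spirit of the "Low-level IR in Mojo" doc. Syntax is approximate and
# may not match the current Mojo release.
struct OurBool:
    # Storage is a raw MLIR i1 value rather than a library type.
    var value: __mlir_type.i1

    # Default-construct to `false` via an op from MLIR's `index` dialect.
    fn __init__(inout self):
        self.value = __mlir_op.`index.bool.constant`[
            value=__mlir_attr.false,
        ]()

    # Construct from an existing i1 value.
    fn __init__(inout self, value: __mlir_type.i1):
        self.value = value
```

The same wrapping pattern is presumably how a hardware vendor would expose their own dialect's types and ops as ordinary Mojo structs and methods.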
The full MLIR infrastructure has not been built out yet, though.
Thank you so much!