Modular · 12mo ago
david

Will Mojo Metaprogramming support Lisp/Julia Style Macros?

As Paul Graham describes in http://www.paulgraham.com/avg.html, Lisp lets you abstract things with macros that still can't be easily abstracted in today's popular languages. If this were realized in Mojo, we could write less code and be even more productive.
30 Replies
david · 12mo ago
Then, as far as I understand, app developers could create their own syntactic sugar with macros if the Mojo team doesn't have the time to implement it.
david · 12mo ago
It states that Mojo supports Python-like metaprogramming at compile time. Since Lisp/Racket/Julia metaprogramming is not the same as Python metaprogramming, I am asking whether they will support this in the future. I mean, for example, telling GPT-4: "Use Julia metaprogramming to make the singleton design pattern part of the language syntax and compare it to regular code." Instead of writing singleton boilerplate for every new class, you would have a new singleton keyword, so a one-liner. I think this would be great for Mojo, since the compiler team could focus on the important core stuff and keep the core system simpler, while syntactic sugar and language extensions like async/await could be implemented easily by regular developers and shared.
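For illustration, a minimal Python sketch of the "singleton as a one-liner" idea; the `singleton` decorator below is a hypothetical plain-Python stand-in, not an actual Mojo or Julia feature:

```python
# Hypothetical illustration in plain Python: a reusable `singleton`
# decorator replaces per-class boilerplate with a one-line annotation.
def singleton(cls):
    instances = {}

    def get_instance(*args, **kwargs):
        # Create the instance on first use, then always return the same one.
        if cls not in instances:
            instances[cls] = cls(*args, **kwargs)
        return instances[cls]

    return get_instance


@singleton
class Config:
    def __init__(self):
        self.debug = False


assert Config() is Config()  # both calls return the same instance
```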
DanteOz · 12mo ago
GitHub: [Feature Request] Syntactic Macros and First-Class Metaprogramming ...
Review Mojo's priorities: I have read the roadmap and priorities and I believe this request falls within the priorities. What is your request? First-class metaprogramming with syntactic macros (...
david · 12mo ago
I agree with @Chris Lattner that language features should be motivated by problems, but in this case it is, in my opinion, already motivated by the biggest programmer problem: managing complexity. For example, we agree that OOP helps to manage complexity, and languages like C can emulate OOP. So I assume that if C supported Julia-like metaprogramming, a macro library could add OOP syntax to C to manage complexity, without the longer process of the language designers adding features and reaching agreement first, and without increasing the complexity of the core language, in contrast to C++ and all its features. New language constructs could first be quickly tested empirically through user libraries and, if validated, be added to the core language. Maybe it would also help in the language implementation, because more patterns could be abstracted faster to manage complexity. Sorry if I said something wrong, because I am not an expert.
david · 12mo ago
OK, only 0.4% of the Julia stdlib's code consists of macros, so it is probably something that shouldn't be used very often anyway.
ksandvik · 12mo ago
Just my opinion, but there's no need to have exactly the same or a similar metaprogramming model as Julia or Lisp, just the same methodology. A lot of those cases are solved at compile time, hence Mojo's focus on this aspect in the documentation mentioned above. The benefits of compile time are performance-related.
Chris Lattner · 12mo ago
We'll have to explore this over time. I'm interested in allowing library authors to make things that look like builtin control flow statements ("for" is not enough, let's get "parallel_for"), and supporting things like patterns can (afaik) only realistically be done with a macro-like system. That said, as you probably know, there is a huge design space for macro systems. We'll need to see if and what actually makes sense for various use cases. This is all to say: "yeah, we'll likely do it, but we want to make sure it is done the right way, and maximal power isn't really the goal".
david · 12mo ago
Afaik, in the case of abstracting patterns of behavior there is an overlap between macros and higher-order functions. So you could have a higher-order function that implements the for loop and the parallelization logic, and inject a function as an argument that is executed in parallel for every element. Macros are more general and act similarly to code generators. In theory, a programmer could design a new language for a problem domain in Mojo, and a non-technical domain expert could "code" apps with it, reducing the technical barriers for businesses.

Here is a code example for circuit analysis in Julia: c = Parallel(Resistor(5), Serial(Capacitor(1), Inductor(3))) with macros becomes c = @circuit R(5) // (C(1) --> I(3)). Because Julia internally treats code as data, these code transformations are easier and more natural to write. Would you consider a macro system where you could have syntax similar to first-order logic in Mojo too powerful? It would be cool if you could just take the paper math and that would already be the program to a large extent. I mean, Mojo tries to solve the N-language problem; having a good macro system solves it partly, since you otherwise still have to map from the math or business world to the imperative programming world.
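As a rough illustration of the higher-order-function alternative described above (plain Python, not Mojo's actual parallelize API; the `parallel_for` name and signature are hypothetical):

```python
# Illustrative only: a parallel "for" expressed as a higher-order function.
# The loop body is passed in as a function and applied to every index.
from concurrent.futures import ThreadPoolExecutor

def parallel_for(n, body):
    # Apply `body` to every index 0..n-1, distributing the work over threads.
    with ThreadPoolExecutor() as pool:
        list(pool.map(body, range(n)))


results = [0] * 8

def work(i):
    results[i] = i * i

parallel_for(8, work)
print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```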
PriNova · 12mo ago
I agree that OOP helps to manage complexity, but do not focus too much on Clean Code or OOP, because then your code takes a big performance hit. For performance-critical code, focus more on data-oriented design, as is done in game development, and put OOP on the outer shell. For more reference, search for Casey Muratori or Richard Fabian.
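A small Python/NumPy sketch of the layout contrast data-oriented design is about (hypothetical example, not taken from the thread):

```python
# Hypothetical contrast between an object-per-entity layout and a
# data-oriented (structure-of-arrays) layout for a hot update loop.
import numpy as np

# OOP-style: one Python object per particle; the update loop walks objects one by one.
class Particle:
    def __init__(self, x, v):
        self.x = x
        self.v = v

particles = [Particle(float(i), 1.0) for i in range(100_000)]
for p in particles:
    p.x += p.v * 0.016

# Data-oriented: positions and velocities live in contiguous arrays, so the
# same update becomes a single vectorized operation over cache-friendly memory.
xs = np.arange(100_000, dtype=np.float64)
vs = np.ones(100_000)
xs += vs * 0.016
```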
Chris Lattner · 12mo ago
We haven't designed a macro system, so I can't speak to specific design points. We'll have to see as other more fundamental pieces of the type system come together.
david · 11mo ago
Actually, I found a simpler use case which could save a lot of time: porting Python libraries to Mojo. For example, in the matrix multiplication example in the docs, the optimized Mojo implementation looks like this:

```
# Use the above tile function to perform tiled matmul.
fn matmul_tiled_parallelized(C: Matrix, A: Matrix, B: Matrix):
    @parameter
    fn calc_row(m: Int):
        @parameter
        fn calc_tile[tile_x: Int, tile_y: Int](x: Int, y: Int):
            for k in range(y, y + tile_y):
                @parameter
                fn dot[nelts: Int](n: Int):
                    C.store[nelts](m, n + x, C.load[nelts](m, n + x) + A[m, k] * B.load[nelts](k, n + x))
                vectorize[nelts, dot](tile_x)

        # We hardcode the tile factor to be 4.
        alias tile_size = 4
        tile[calc_tile, nelts * tile_size, tile_size](A.cols, C.cols)

    parallelize[calc_row](C.rows, C.rows)
```

It would be extraordinary if you could keep the performance boost while maintaining the readability of the pseudocode or naive implementation, like:

```
# Note that C, A, and B have types.
fn matmul_naive(C: Matrix, A: Matrix, B: Matrix):
    @parallelize
    for m in range(C.rows):
        @tile
        for k in range(A.cols):
            @vectorize
            for n in range(C.cols):
                C[m, n] += A[m, k] * B[k, n]
```

Maybe the macros could even generate more optimized code that is harder to read and engineer by hand. In my opinion this would also benefit the adoption of Mojo, because Python library maintainers would have to rewrite less to get the best out of Mojo. With this approach you could also probably keep the debugging experience of non-macro code: in debug mode you would basically ignore the macros and debug it as regular code, while in release mode the macros are expanded and give the performance boost. This approach would also benefit compilation speed, since most compilation happens in debug mode.

```
def matmul_untyped(C, A, B):
    @parallelized
    for m in range(C.rows):
        @tiled
        for k in range(A.cols):
            @vectorized
            for n in range(C.cols):
                C[m, n] += A[m, k] * B[k, n]
```

This is more consistent with existing syntax like @parameter.
Maximum Limelihood Estimator
It sounds like you're reinventing LoopVec/PythonSyntax.jl here 😆 Relatedly, I'd certainly love it if there were some way to get compatibility with Julia out of Mojo, so people could migrate from Julia to Mojo easily as well. There's tons of open-source code written for performance in Julia that could be ported over easily. (Unlike Python, where most code relies heavily on CPython internals that slow it down.)
sora · 11mo ago
@Maximum Limelihood Estimator I think there is nothing particularly Julia-specific here, as most languages/compiler toolchains that offer, say, manual loop vectorization require you to use some form of pragma/annotation/decorator. Though I would really like to use the decorator form of vectorize etc., especially once we make the for statement a library feature instead of a builtin, I do wonder if it is as capable as the function form.
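A rough Python sketch of the two shapes being compared; the names and signatures here are illustrative, not Mojo's actual vectorize API:

```python
# Illustrative sketch (not Mojo's real API): the "function form" of a
# vectorize-style helper takes the loop body as an argument, while the
# "decorator form" annotates the body instead.
def vectorize(body, width, size):
    # Call body(start, count) on each full chunk of `width` elements,
    # then once more on the ragged tail.
    i = 0
    while i + width <= size:
        body(i, width)
        i += width
    if i < size:
        body(i, size - i)


out = [0] * 10

def double_chunk(start, count):
    for j in range(start, start + count):
        out[j] = j * 2

# Function form: the body is passed explicitly to the helper.
vectorize(double_chunk, 4, 10)


def vectorized(width, size):
    # Decorator form: wraps the body and immediately drives the same chunked loop.
    def apply(body):
        vectorize(body, width, size)
        return body
    return apply

@vectorized(width=4, size=10)
def double_chunk_again(start, count):
    for j in range(start, start + count):
        out[j] = j * 2
```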