Mengbo_Zhou
Created by Mengbo_Zhou on 11/1/2023 in #questions
Mojo in LLMs?
Hello, everyone! I'm currently investigating the capabilities of Mojo, especially in the context of pretraining large-scale models. Does anyone know whether Mojo supports this kind of extensive pretraining? I'm also looking for recent research or case studies on using Mojo to pretrain large models; benchmarks or comparisons with other frameworks would be particularly helpful. If you have experience with Mojo in this area, or know of resources that might point me in the right direction, I'd greatly appreciate your insights. 👀