Matt — 10mo ago

llama2.mojo to safetensors?

Very much a non-traditional programmer here. I've really enjoyed the community and the innovation! I'm probably asking something out of scope, but is it possible to convert the llama2.mojo made by tairov to a safetensors format? Or is the llama2.mojo file only for inference, and we are not at the point where Mojo can be converted to safetensors for loading onto GPUs?

https://github.com/tairov/llama2.mojo?tab=readme-ov-file - original Mojo repo from Tairov
https://github.com/tairov/llama2.py - llama2.py repo from Tairov
https://github.com/karpathy/llama2.c - Karpathy's llama2.c
Ehsan M. Kermani (Modular)
Do you mean using a model stored in safetensors format with the llama2.mojo code? Right now llama2.mojo expects a bin file, but in general, if you have Hugging Face safetensors model artifacts such as weights, you can convert them to bin via torch.save(safetensors.torch.load_file(...))
Matt (OP) — 10mo ago
@Ehsan M. Kermani yes, thank you so much. My understanding was that currently only tinyllama-1.1B-Chat-V0.2 and the stories 260K, 15M, 42M, and 110M models from Tairov's post were supported. My understanding of your response is that now any model can be converted to a bin to run Mojo inference. Is that correct?
Ehsan M. Kermani (Modular)
It needs to have a supported implementation. The conversion above is general enough, but it won't work if the model architecture differs from the loaded weights, no matter whether they're in bin, safetensors, or any other format.