Unable to use PyTorch from Mojo

I just wanted to try PyTorch and created a project using the nightly build. I then installed PyTorch using `magic add "pytorch"`. This is the code I tried:

```mojo
from python import Python

fn main() raises:
    torch = Python.import_module("torch")
    x = torch.tensor([1, 2, 3])
    print(x)
```

It resulted in the following error:

```
Unhandled exception caught during execution: libmkl_intel_lp64.so.2: cannot open shared object file: No such file or directory
mojo: error: execution exited with a non-zero result: 1
```

I can see `libmkl_intel_lp64.so.2` in the `.magic\envs\default\lib` folder, so I do not understand what the problem is.
18 Replies
Darkmatter (6d ago)
Did you pull in Intel Python? Pytorch usually links to OpenBLAS.
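For reference, a quick way to see which BLAS a given torch package was built against is to print its build configuration from the environment's Python (a generic sketch, not from the thread; if the import itself fails with the same libmkl error, that already tells you the package was built against MKL):

```sh
# Print torch's build settings; the BLAS_INFO / MKL lines show which BLAS
# the package was compiled against.
python -c "import torch; print(torch.__config__.show())"
```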
Gennadiy (OP, 6d ago)
How do I do that?
Darkmatter (6d ago)
What does your pixi.toml/mojoproject.toml file look like?
Gennadiy (OP, 6d ago)
This is the content of mojoproject.toml:

```toml
[project]
channels = ["conda-forge", "https://conda.modular.com/max-nightly"]
description = "Add a short description here"
name = "mojo_by_example-nightly"
platforms = ["linux-64"]
version = "0.1.0"

[tasks]

[dependencies]
max = ">=25.1.0.dev2024121305,<26"
numpy = ">=1.26.4,<2"
jax = ">=0.4.35,<0.5"
pydicom = ">=3.0.1,<4"
torchvision = ">=0.20.1,<0.21"
pytorch = ">=2.5.1,<3"
```
Darkmatter (6d ago)
And you are inside of the magic environment when compiling and running?
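For anyone following along, the two usual ways of working "inside" the environment look roughly like this (sketch only; `main.mojo` is a hypothetical file name):

```sh
# Option 1: drop into the environment's shell, then compile/run as usual.
magic shell
mojo main.mojo

# Option 2: run the command through magic without entering the shell.
magic run mojo main.mojo
```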
Gennadiy (OP, 6d ago)
Yes, I made sure to double-check.
Darkmatter (6d ago)
One second, let me run it locally.
Ok, so I have oneAPI installed and it got picked up as my system BLAS. Let me try that in a container.
I can reproduce it, so it's a Mojo issue.
@Gennadiy It's probably easiest to add the mkl package to get the missing dependency while we sort out what exactly pytorch did, since it seems like this is their fault.
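A minimal sketch of that workaround, assuming the package is conda-forge's `mkl` package (check the docs page for the exact name):

```sh
# Pull MKL into the magic environment so that libmkl_intel_lp64.so.2 and the
# other MKL shared libraries end up on the environment's library path.
magic add mkl
```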
Gennadiy (OP, 6d ago)
Thanks. I will read through that page.
Darkmatter (6d ago)
As a note, MKL is quite a bit faster than OpenBLAS, which is what pytorch and numpy used before. So this is an upgrade, just one that appears to have had some consequences for the ecosystem.
Gennadiy (OP, 6d ago)
Is there any performance-related issue given that I am on an AMD system?
Darkmatter (6d ago)
What generation?
Gennadiy (OP, 6d ago)
It's an AMD Ryzen 9 4900HS.
Darkmatter (6d ago)
MKL's alleged "crippling of AMD CPUs" was actually about servers that didn't support AVX-512, since MKL only has AVX-512 and scalar fallbacks; Zen 4 and Zen 5 support AVX-512 and see the same performance as Intel CPUs. The 4900HS (Zen 2) will probably be a bit slower with MKL than with OpenBLAS, since IIRC it only has AVX2. But this is not going to be a gigantic difference for most workloads unless you are doing AI on the CPU.
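If you want to confirm what the chip actually exposes, the CPU flags reported by the kernel are the quickest check (Linux-only sketch):

```sh
# List the AVX-family extensions the CPU reports; a 4900HS (Zen 2) should show
# avx and avx2 but no avx512* flags.
grep -o 'avx[a-z0-9_]*' /proc/cpuinfo | sort -u
```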
Gennadiy (OP, 6d ago)
Ah, that's good to know. One last question: can I add the dependency for openblas using magic, and how do I do it? I am still finding my way around Mojo.
Darkmatter (6d ago)
The basic syntax is in that docs page I sent. How exactly pixi, the underlying tool behind magic, translates that into TOML I don't know; you may have to play around a bit.
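As a rough sketch of what that might look like (the literal package add is straightforward; the libblas pin is the conda-forge convention for selecting a BLAS implementation, and the exact match-spec syntax magic/pixi accepts here is an assumption):

```sh
# Literally adding openblas as a dependency:
magic add openblas

# Selecting OpenBLAS as the BLAS that numpy/pytorch link against usually goes
# through conda-forge's libblas metapackage (syntax assumed, may need tweaking):
magic add "libblas=*=*openblas"
```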
Gennadiy (OP, 6d ago)
OK, thanks. I will have a play around now.