Ollama tips?
There are several ways to install ollama for local LLM use. Here is my experience so far:
1) Using brew
Pros: The brew formula keeps packages up to date, and since ollama gets updated quite frequently that makes staying current a breeze
Cons:
- need to set up systemd services manually
- for some reason it runs SIGNIFICANTLY slower on my device compared to other methods
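To avoid setting up system services by hand for the brew route, a user-level systemd unit is one option. This is only a sketch: the binary path (/home/linuxbrew/.linuxbrew/bin/ollama) is an assumption that depends on your brew prefix.

```ini
# ~/.config/systemd/user/ollama.service
# Hypothetical user service for a brew-installed ollama;
# the ExecStart path is an assumption, check `which ollama`.
[Unit]
Description=Ollama (brew) server
After=network-online.target

[Service]
ExecStart=/home/linuxbrew/.linuxbrew/bin/ollama serve
Restart=on-failure

[Install]
WantedBy=default.target
```

Then `systemctl --user daemon-reload && systemctl --user enable --now ollama` should start it without touching /etc.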
2) Using alpaca flatpak as recommended by bluefin docs https://docs.projectbluefin.io/ai
Pros:
- easy install: flatpak install flathub com.jeffser.Alpaca
- Much faster than brew package!
- also keeps the ollama version up to date
Cons:
- Can only interact with the models through its simple (but capable) GUI; CLI access is not possible
- This is not desirable for my use cases!
3) Using official install instructions: curl -fsSL https://ollama.com/install.sh | sh
Pros:
- Official method which supposedly also sets up systemd services
- Enables CLI use
Cons:
- The install script does not work on Fedora Atomic when run from the CLI
Question HERE! Any thoughts on how to install ollama through this method? Perhaps from the Bluebuild side?
4) Inside a docker container
Pros: N/A
Cons:
- Extremely clunky to set up and maintain...
Any thoughts on how to make #3 work? How are other users using LLMs locally? Thanks
I believe that official script should work in build-time
Just not sure if their check for Nvidia using nvidia-smi works in build-time
oh, nvm, that's for wsl2
adding users might be the problem in build-time
thanks, that's a good overview
could it be possible to run the ollama binary included in the Alpaca flatpak manually?
flatpak run --command might work for that
another option could be reading through the install script and seeing what works and doesn't work
there at least used to be a ujust ollama command on Bluefin, so looking at the source code of that would probably help
all the docs for it were removed in this commit https://github.com/ublue-os/bluefin-docs/commit/6e6247a13c1f1938a9d3bcc8b4c4595052b21897
Here is my podman quadlet for ollama, maybe this helps?
llm.pod
ollama-app.container
openwebui-app.container
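For anyone unfamiliar with quadlets: a .container file like Virt-io's ollama-app.container gets turned into a systemd service by podman's generator. A minimal sketch along those lines might look like this (the image tag, volume name, and port are assumptions, not the actual file):

```ini
# ~/.config/containers/systemd/ollama-app.container
# Hypothetical quadlet sketch; names, image tag, and port are assumptions.
[Unit]
Description=Ollama server container

[Container]
Image=docker.io/ollama/ollama:latest
ContainerName=ollama
# Persist pulled models across container restarts
Volume=ollama-models:/root/.ollama
PublishPort=11434:11434

[Service]
Restart=always

[Install]
WantedBy=default.target
```

After `systemctl --user daemon-reload`, it should show up as ollama-app.service.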
Thank you for sharing @Virt-io. In fact a user on the ollama thread recommended a similar approach, and made me aware of https://containrrr.dev/watchtower/ to keep the docker containers up to date (I had complained this approach was annoying to maintain)
So @xyny you're actually right! You can access the ollama binary WITHIN the Alpaca flatpak by doing:
flatpak run --command=ollama com.jeffser.Alpaca serve
and then from another terminal window:
flatpak run --command=ollama com.jeffser.Alpaca list
but strangely this internal ollama server is NOT the same as the one used by the Alpaca GUI - it's a separate instance. This means that the models you pull from the Alpaca GUI do not appear in this ollama instance, and the reverse is also true...
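One thing that might bridge the two: the ollama CLI honors the OLLAMA_HOST environment variable, so if the GUI's instance listens on the default port (an assumption - Alpaca may well use a different port or a socket), you could try pointing the CLI at it instead of spawning a second server:

```shell
# Assumption: the Alpaca GUI's ollama instance listens on the default
# 127.0.0.1:11434. If so, OLLAMA_HOST makes the CLI talk to that
# instance rather than starting its own.
flatpak run --env=OLLAMA_HOST=127.0.0.1:11434 --command=ollama com.jeffser.Alpaca list
```

If the list comes back empty, the GUI instance is probably not on that address and this sketch doesn't apply.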
Alpaca makes it dead simple to access ollama. On paper it's basically the same as setting up a docker/distrobox container manually. This is indeed a good alternative in terms of simplicity... but not the best option for power users.
I will attempt the scripts module approach. The script will draw from the official ollama install.sh script, along with a usermod script and a systemd service file that also gets created. Drawing maximum inspiration from the SDDM method shown by wayblue:
https://github.com/wayblueorg/wayblue/blob/live/files/systemd/system/sddm-boot.service
https://github.com/wayblueorg/wayblue/blob/live/files/system/etc/sddm/sddm-useradd
Should get to that next week. Will keep you in the loop, and close this issue when I get it working.
First attempt at making this work as described above: https://github.com/mecattaf/zen/commit/1cba65a3f8e65f744ab7a67c63a3a60f0cc70065
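For context, the system service the official install.sh sets up is roughly the following; this is reconstructed from reading the script, so treat the exact paths and options as assumptions rather than the verbatim upstream file:

```ini
# /etc/systemd/system/ollama.service
# Sketch of the unit the official installer creates; details are assumptions.
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
```

The user-creation half is something like `useradd -r -s /bin/false -m -d /usr/share/ollama ollama`, which is the part most likely to need moving out of build-time on an Atomic image.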
Might be a good idea to follow any new changes to the upstream file with this
https://app.github-file-watcher.com/
Nice tool. Indeed I improved the install-ollama.sh file since that commit: https://github.com/mecattaf/zen/blob/main/files/scripts/install-ollama.sh
Apparently, there is a better & native solution:
https://stackoverflow.com/questions/9732779/how-to-watch-only-a-directory-in-a-github-repository/25315476#25315476