Ollama installation
Home > Local system based AI tools > Ollama > Ollama installation
Installing ollama on Rocky 9.x
To install Ollama on the local system, use the following steps:
- Download the Ollama archive (.tgz) from the Ollama site:
- curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
- Extract the Ollama binaries and start the Ollama server on the local machine (verification commands are shown after this list):
- sudo tar -C /usr -xzf ollama-linux-amd64.tgz
- export OLLAMA_ORIGINS="moz-extension://*"
- ollama serve
- After this, run Ollama on the local system with the command below and test it (a scripted alternative is sketched after this list):
- ollama run llama3.2
- The above command may take considerable time when run for the first time, as it downloads the entire model. The llama3.2 model in this example is about 2GB in size, so the command first downloads the 2GB model and only then shows the ">>>" prompt where queries can be typed.
- To exit the session use:
- /bye
- Between any two unrelated queries, the context can be cleared with:
- /clear
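Before extracting, you can sanity-check that the archive downloaded completely, and once 'ollama serve' is up you can confirm the server responds. A minimal check, assuming the default port 11434:
- tar -tzf ollama-linux-amd64.tgz | head   # list archive contents without extracting; errors suggest a truncated download
- curl http://localhost:11434/             # should print "Ollama is running"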
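For scripted use, the model can also be prompted non-interactively instead of through the ">>>" prompt. A minimal sketch, assuming the llama3.2 model from the steps above is already downloaded:
- ollama run llama3.2 "Summarize what Ollama does in one sentence."
- curl http://localhost:11434/api/generate -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'
The first form prompts the model once from the shell and exits; the second uses Ollama's REST API, which is what the OLLAMA_ORIGINS setting above controls browser-extension access to.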
Note that models are downloaded into the current user's home folder, inside the .ollama folder. Ensure you have enough space in this partition/folder for the models to be downloaded and stored.
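To check how much space the downloaded models occupy, assuming the default storage location under the home folder:
- du -sh ~/.ollama   # total size of downloaded models and metadata
- ollama list        # names and sizes of locally available models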
Configure ollama to automatically run on system boot
- To configure Ollama as a service, create the file '/etc/systemd/system/ollama.service' with the following contents (replace saurabh with the user the service should run as):

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=saurabh
Group=saurabh
Restart=always
RestartSec=3
Environment="PATH=$PATH"
Environment="OLLAMA_ORIGINS=moz-extension://*"

[Install]
WantedBy=default.target
- Reload the systemd configuration and configure Ollama to run at system boot:
- sudo systemctl daemon-reload
- sudo systemctl enable ollama
- sudo systemctl start ollama
- sudo systemctl status ollama
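If the service fails to start or misbehaves, its logs can be inspected with journalctl, and the server can be probed directly; for example:
- journalctl -u ollama -f        # follow the ollama service logs
- curl http://localhost:11434/   # should print "Ollama is running" once the service is up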
Refer:
- https://github.com/ollama/ollama/blob/main/docs/linux.md
- https://adasci.org/hands-on-guide-to-running-llms-locally-using-ollama/