Ollama installation

From Notes_Wiki
Revision as of 04:51, 8 January 2025 by Saurabh

Home > Local system based AI tools > Ollama > Ollama installation

Installing ollama on Rocky 9.x

To install Ollama on a local system, use the following steps:

  1. Download the Ollama archive (.tgz) from the Ollama site:
    curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
  2. Extract the Ollama archive and start the Ollama server on the local machine. The OLLAMA_ORIGINS variable allows requests from browser extensions (here, Firefox add-ons) to reach the server:
    sudo tar -C /usr -xzf ollama-linux-amd64.tgz
    export OLLAMA_ORIGINS="moz-extension://*"
    ollama serve
  3. After this, run Ollama on the local system and test it with:
    ollama run llama3.2
    The above command may take considerable time on the first run, as it downloads the entire model. The llama3.2 model in the example above is about 2 GB in size, so the command first downloads the model and only then shows the ">>>" prompt where queries can be typed.
  4. To exit the interactive session, use:
    /bye
  5. Between two unrelated queries, the context can be cleared with:
    /clear
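Besides the interactive prompt, the server started in step 2 can also be queried over its REST API. A minimal sketch, assuming the default listen address 127.0.0.1:11434 and the llama3.2 model from step 3 (only the Python standard library is used; the function names are illustrative):

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434/api/generate"  # default listen address


def build_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for a non-streaming /api/generate request."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def ask(model: str, prompt: str) -> str:
    """Send one prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server running and a model pulled, a single call such as `ask("llama3.2", "Say hello in one word.")` returns the model's reply as a string.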


Note that models are downloaded to the .ollama folder inside the current user's home directory. Ensure there is enough space in that partition/folder for the models to be downloaded and stored.
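Since model files run to several gigabytes, it helps to check current usage and free space before pulling more models. A small sketch, assuming the default model location under ~/.ollama:

```shell
# Size of downloaded models (default location; OLLAMA_MODELS can override it)
du -sh ~/.ollama/models 2>/dev/null || echo "no models downloaded yet"

# Free space on the partition holding the home directory
df -h ~
```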


Configure Ollama to run automatically at system boot

  1. To configure Ollama as a service, create the file '/etc/systemd/system/ollama.service' with the following contents (replace 'saurabh' with the user that should run Ollama):
    [Unit]
    Description=Ollama Service
    After=network-online.target
    
    [Service]
    ExecStart=/usr/bin/ollama serve
    User=saurabh
    Group=saurabh
    Restart=always
    RestartSec=3
    # systemd does not expand $PATH; give an explicit value
    Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin"
    Environment="OLLAMA_ORIGINS=moz-extension://*"
    
    [Install]
    WantedBy=default.target
  2. Reload the systemd configuration and enable Ollama to start at system boot:
    sudo systemctl daemon-reload
    sudo systemctl enable ollama
    sudo systemctl start ollama
    sudo systemctl status ollama
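After enabling the service, it is worth confirming that the server actually answers on its port. A quick check, assuming the default listen address 127.0.0.1:11434 (the /api/version endpoint is part of Ollama's REST API):

```shell
# Confirm the systemd unit is running
sudo systemctl is-active ollama

# Confirm the HTTP API answers; prints the server version as JSON
curl -s http://127.0.0.1:11434/api/version
```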

