Create deployOllama.sh #1620

Merged · 1 commit · Jun 12, 2024
36 changes: 36 additions & 0 deletions scripts/usecases/llm/deployOllama.sh
@@ -0,0 +1,36 @@
#!/bin/bash

# Exit immediately if a command exits with a non-zero status.
set -e

# Function to display a highlighted progress message
function echo_message() {
    echo -e "# $1"
}

# Check the NVIDIA driver; because of `set -e`, the script aborts here
# if nvidia-smi is missing or no GPU driver is loaded.
echo_message "Checking NVIDIA driver with nvidia-smi"
nvidia-smi

# Install Ollama
echo_message "Installing Ollama"
curl -fsSL https://ollama.com/install.sh | sh

# Modify the Ollama service file: bind the server to all interfaces on
# port 3000 (instead of the default 127.0.0.1:11434) and run it as root
# so it can access GPU devices without extra group configuration.
echo_message "Modifying Ollama service file"
sudo sed -i '/\[Service\]/a Environment="OLLAMA_HOST=0.0.0.0:3000"' /etc/systemd/system/ollama.service
sudo sed -i 's/User=ollama/User=root/' /etc/systemd/system/ollama.service
sudo sed -i 's/Group=ollama/Group=root/' /etc/systemd/system/ollama.service

# Reload and restart Ollama service
echo_message "Reloading and restarting Ollama service"
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Check Ollama service status
echo_message "Checking Ollama service status"
sudo systemctl status ollama --no-pager

# List models on Ollama; OLLAMA_HOST must be set because the server now
# listens on the non-default address configured above.
echo_message "Listing models on Ollama"
OLLAMA_HOST=0.0.0.0:3000 ollama list
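
The effect of the three `sed` edits can be sketched against a minimal stand-in unit file (the real `/etc/systemd/system/ollama.service` contains more keys; the stand-in below is only for illustration):

```shell
#!/bin/bash
set -e

# Create a minimal stand-in for the Ollama unit file (assumption: the
# real file has additional keys, but these are the ones the script edits).
svc=$(mktemp)
printf '[Service]\nUser=ollama\nGroup=ollama\n' > "$svc"

# Same edits as in deployOllama.sh, without sudo, against the stand-in.
sed -i '/\[Service\]/a Environment="OLLAMA_HOST=0.0.0.0:3000"' "$svc"
sed -i 's/User=ollama/User=root/' "$svc"
sed -i 's/Group=ollama/Group=root/' "$svc"

cat "$svc"
# [Service]
# Environment="OLLAMA_HOST=0.0.0.0:3000"
# User=root
# Group=root

rm -f "$svc"
```

Note that the `a` (append) command inserts the `Environment=` line immediately after the `[Service]` header, which is why the override takes effect for the service section. This syntax relies on GNU sed, which is the default on the Linux hosts this script targets.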