AI Town is a virtual town where AI characters live, chat, and socialize. This guide will help you set up AI Town on a Windows machine using the Windows Subsystem for Linux (WSL). You will connect to Ollama running on your Windows machine and use Convex for the backend.
- Windows 10/11 with WSL2 installed
- Ollama installed and running on Windows
- Internet connection
First, install WSL2 on your Windows machine; Microsoft's official WSL installation guide walks through the steps. We recommend using Ubuntu as your Linux distribution.
Open your WSL terminal (Ubuntu) and update your packages:
sudo apt update
NVM (Node Version Manager) lets you install and switch between multiple versions of Node.js. Install NVM, then Node.js 18 (an LTS release):
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.2/install.sh | bash
export NVM_DIR="$([ -z "${XDG_CONFIG_HOME-}" ] && printf %s "${HOME}/.nvm" || printf %s "${XDG_CONFIG_HOME}/nvm")"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
source ~/.bashrc
nvm install 18
nvm use 18
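To confirm the switch took effect, check node --version and make sure it reports an 18.x release. The snippet below shows that check against a sample version string (the version shown is illustrative; on your machine, use version=$(node --version) instead):

```shell
# Sample output of `node --version`; on your machine use:
#   version=$(node --version)
version="v18.20.4"

# Warn if the active Node.js is not in the 18.x line
case "$version" in
  v18.*) echo "OK: Node 18 is active ($version)" ;;
  *)     echo "WARNING: expected Node 18, got $version" >&2 ;;
esac
```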
Python is required for some dependencies. Install Python and Pip:
sudo apt-get install python3 python3-pip
sudo ln -s /usr/bin/python3 /usr/bin/python
Install unzip and socat:
sudo apt install unzip socat
Cargo is the Rust package manager. Install Rust and Cargo:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source $HOME/.cargo/env
just is a command runner used to invoke project tasks. Install it with Cargo:
cargo install just
export PATH="$HOME/.cargo/bin:$PATH"
just --version
Run the following command to forward port 11434, letting Convex inside WSL reach the Ollama API on your Windows host:
socat TCP-LISTEN:11434,fork TCP:$(grep nameserver /etc/resolv.conf | awk '{print $2}'):11434 &
Test if it's working:
curl http://127.0.0.1:11434
If it responds with "Ollama is running", the Ollama API is reachable from WSL.
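The $(...) substitution in the socat command extracts your Windows host's IP address from the resolv.conf file that WSL2 auto-generates. You can run that pipeline in isolation to see what address it resolves to; here it is applied to a sample file (the IP shown is illustrative):

```shell
# A sample of what WSL2 auto-generates in /etc/resolv.conf
cat > /tmp/resolv.conf.sample <<'EOF'
# This file was automatically generated by WSL.
nameserver 172.22.64.1
EOF

# The same pipeline the socat command uses, pointed at the sample file;
# on a real WSL install, read /etc/resolv.conf instead.
host_ip=$(grep nameserver /tmp/resolv.conf.sample | awk '{print $2}')
echo "$host_ip"
# → 172.22.64.1
```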
Clone the AI Town repository from GitHub:
git clone https://github.com/a16z-infra/ai-town.git
cd ai-town
Install the necessary npm packages:
npm install
Download and install the precompiled version of Convex:
curl -L -O https://github.com/get-convex/convex-backend/releases/download/precompiled-2024-06-28-91981ab/convex-local-backend-x86_64-unknown-linux-gnu.zip
unzip convex-local-backend-x86_64-unknown-linux-gnu.zip
rm convex-local-backend-x86_64-unknown-linux-gnu.zip
chmod +x convex-local-backend
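Before launching, it's worth a quick guard that the unzip actually produced an executable binary:

```shell
# Sanity check: the unzipped Convex backend should exist and be executable
if [ -x ./convex-local-backend ]; then
  echo "convex-local-backend is ready to launch"
else
  echo "convex-local-backend is missing or not executable" >&2
fi
```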
In a separate terminal, launch Convex:
./convex-local-backend
Set the Ollama host in Convex (localhost:11434 works because of the socat bridge):
just convex env set OLLAMA_HOST http://localhost:11434
Finally, launch AI Town:
npm run dev
Visit http://localhost:5173 in your browser to see AI Town in action.
If you need to restart the services:
- Ensure socat is running:
socat TCP-LISTEN:11434,fork TCP:$(grep nameserver /etc/resolv.conf | awk '{print $2}'):11434 &
- Launch Convex:
./convex-local-backend
- Launch AI Town:
npm run dev
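The three restart steps can be collected into a small helper script. This is a sketch: the filename restart-ai-town.sh and the pgrep guard are my own additions, not part of the repo. The block below writes the script (run it from the ai-town directory) and syntax-checks it with bash -n without executing it:

```shell
# Write the restart steps to a helper script (hypothetical name),
# then verify its syntax without executing it.
cat > restart-ai-town.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail

# 1. Re-bridge port 11434 to the Windows host unless socat is already listening
pgrep -f 'TCP-LISTEN:11434' > /dev/null || \
  socat TCP-LISTEN:11434,fork \
    TCP:$(grep nameserver /etc/resolv.conf | awk '{print $2}'):11434 &

# 2. Start the Convex backend in the background
./convex-local-backend &

# 3. Start the AI Town frontend in the foreground
npm run dev
EOF
chmod +x restart-ai-town.sh
bash -n restart-ai-town.sh && echo "syntax OK"
```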
Enjoy your AI Town experience! If you encounter any issues, feel free to reach out for support. 🌟