Can't connect to Ollama on Windows #80
Checking the file pull_model.Dockerfile, I see the below. Then it's a question of making sure the path used to call Ollama is recognized by Windows.
@mchiang0610 either we need to figure out a way to generalize this, or at least make a clear call-out in the README?
So does Ollama run on Windows inside the WSL Linux subsystem that Docker runs in?
Ollama can run on both:
How to fix the issue with replacing the
Well, addressed already. All
.env:
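For reference, a minimal sketch of the .env entry being discussed, assuming the stack reads the Ollama endpoint from an OLLAMA_BASE_URL variable and that Docker Desktop's host.docker.internal host alias is available (both are assumptions about this setup, not confirmed by the thread):

```
# Hypothetical .env fragment: point the containers at Ollama running on the
# Windows host. Inside Docker Desktop containers, host.docker.internal
# resolves to the host machine, whereas localhost would be the container itself.
OLLAMA_BASE_URL=http://host.docker.internal:11434
```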
Using the GenAI stack from Docker, and having built Ollama on Windows, I tried to run the stack and got this message.
But my Ollama is running: I can use it on the command line, I can pull llama2 on the command line, so everything seems OK on the Ollama side (except that it's Windows, which isn't really supported by Ollama yet):
2023/11/01 17:38:54 routes.go:678: Listening on 127.0.0.1:11434 (version 0.0.0)
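Note that the log above shows Ollama listening on 127.0.0.1, which is only reachable from the Windows host itself; a container's localhost is the container, so it cannot reach that address. A hedged configuration sketch, assuming this Ollama build honors the documented OLLAMA_HOST variable for the bind address:

```
REM Configuration sketch (Windows cmd; assumes OLLAMA_HOST is honored):
REM bind to all interfaces so Docker containers can reach the server.
set OLLAMA_HOST=0.0.0.0:11434
ollama serve
```

In PowerShell the equivalent would be `$env:OLLAMA_HOST = "0.0.0.0:11434"`. Binding to 0.0.0.0 also exposes the server to the local network, so firewall rules may be worth checking.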