📖 docs: Note on 'host.docker.internal' for Ollama Config (#2274)
* docs: update URL to access ollama and comment on 'host.docker.internal'
* Update ai_endpoints.md

Co-authored-by: Danny Avila <danacordially@gmail.com>
````diff
@@ -322,7 +322,8 @@ Some of the endpoints are marked as **Known,** which means they might have speci
 ```yaml
 - name: "Ollama"
   apiKey: "ollama"
-  baseURL: "http://localhost:11434/v1/"
+  # use 'host.docker.internal' instead of localhost if running LibreChat in a docker container
+  baseURL: "http://localhost:11434/v1/chat/completions"
   models:
     default: [
       "llama2",
````
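For context, a complete endpoint entry for a Dockerized LibreChat might look like the sketch below. This is an illustration based on the snippet above, not the exact file contents; the `custom:` nesting and the `"llama2"` model name are assumptions carried over from the surrounding diff.

```yaml
# Sketch of a librechat.yaml custom endpoint for Ollama when LibreChat
# itself runs inside a Docker container (assumed structure, not verbatim docs):
custom:
  - name: "Ollama"
    apiKey: "ollama"
    # 'host.docker.internal' resolves to the host machine from inside the
    # container, so the container can reach an Ollama server on the host
    baseURL: "http://host.docker.internal:11434/v1/chat/completions"
    models:
      default: [
        "llama2",
        ]
```

Note that on Linux, `host.docker.internal` is not available by default; it typically needs to be mapped explicitly, e.g. with Docker's `--add-host=host.docker.internal:host-gateway` or an equivalent `extra_hosts` entry in Compose.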