📖 docs: Note on 'host.docker.internal' for Ollama Config (#2274)

* docs: update URL to access ollama and comment on 'host.docker.internal'

* Update ai_endpoints.md

---------

Co-authored-by: Danny Avila <danacordially@gmail.com>
Till Zoppke authored 2024-04-02 09:25:15 +02:00, committed by GitHub
parent 30d084e696
commit ed17e17a73
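As context for the change below (a sketch, not part of this commit): inside a container, `localhost` resolves to the container itself, so LibreChat cannot reach an Ollama server running on the host that way. On Docker Desktop, `host.docker.internal` resolves to the host automatically; on plain Docker on Linux it does not, and one common workaround is to map it via `extra_hosts`. The service name and image below are assumptions for illustration:

```yaml
# sketch: docker-compose service for LibreChat on Linux,
# where 'host.docker.internal' is not defined by default
services:
  api:
    image: ghcr.io/danny-avila/librechat:latest  # assumed image name
    extra_hosts:
      # map 'host.docker.internal' to the host's gateway address
      - "host.docker.internal:host-gateway"
```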

@@ -322,7 +322,8 @@ Some of the endpoints are marked as **Known,** which means they might have speci
```diff
 ```yaml
 - name: "Ollama"
   apiKey: "ollama"
-  baseURL: "http://localhost:11434/v1/"
+  # use 'host.docker.internal' instead of localhost if running LibreChat in a docker container
+  baseURL: "http://localhost:11434/v1/chat/completions"
   models:
     default: [
       "llama2",