- Update llama-index to >=0.14.0
- Add llama-index-llms-openai dependency
- Ensure compatibility with openai 2.x
- Update pyproject.toml
- Update offline requirements files
# LightRAG Offline Dependencies - LLM Providers
# Install with: pip install -r requirements-offline-llm.txt
#
# For offline installation:
#   pip download -r requirements-offline-llm.txt -d ./packages
#   pip install --no-index --find-links=./packages -r requirements-offline-llm.txt
#
# Recommended: Use pip install lightrag-hku[offline-llm] for the same effect
# Or use constraints: pip install --constraint constraints-offline.txt -r requirements-offline-llm.txt

# LLM provider dependencies (with version constraints matching pyproject.toml)
aioboto3>=12.0.0,<16.0.0
anthropic>=0.18.0,<1.0.0
google-api-core>=2.0.0,<3.0.0
google-genai>=1.0.0,<2.0.0
llama-index>=0.14.0,<1.0.0
llama-index-llms-openai>=0.6.12
ollama>=0.1.0,<1.0.0
openai>=2.0.0,<3.0.0
voyageai>=0.2.0,<1.0.0
zhipuai>=2.0.0,<3.0.0
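After an offline install it can be worth verifying that the environment actually satisfies these pins, since `pip install --no-index` will happily resolve to whatever is in `./packages`. A minimal sketch using `importlib.metadata` and the third-party `packaging` library (a pip dependency, so normally present); the constraint subset below is illustrative, not the full file:

```python
from importlib import metadata
from packaging.specifiers import SpecifierSet

# A few of the pins from requirements-offline-llm.txt (illustrative subset)
CONSTRAINTS = {
    "llama-index": ">=0.14.0,<1.0.0",
    "llama-index-llms-openai": ">=0.6.12",
    "openai": ">=2.0.0,<3.0.0",
}

def check_installed(constraints):
    """Return a dict of packages whose installed version (or absence)
    violates the given specifier string."""
    violations = {}
    for name, spec in constraints.items():
        try:
            version = metadata.version(name)
        except metadata.PackageNotFoundError:
            violations[name] = "not installed"
            continue
        if version not in SpecifierSet(spec):
            violations[name] = version
    return violations

if __name__ == "__main__":
    for name, problem in check_installed(CONSTRAINTS).items():
        print(f"{name}: {problem}")
```

An empty result means every listed package is installed at a compatible version; this mirrors how the `>=x,<y` specifiers in the requirements file are evaluated.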