lightrag/requirements-offline-llm.txt

# LightRAG Offline Dependencies - LLM Providers
# Install with: pip install -r requirements-offline-llm.txt
# For offline installation:
# pip download -r requirements-offline-llm.txt -d ./packages
# pip install --no-index --find-links=./packages -r requirements-offline-llm.txt
#
# Recommended: pip install lightrag-hku[offline-llm] installs the same dependency set
# Or pin versions explicitly: pip install --constraint constraints-offline.txt -r requirements-offline-llm.txt
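#
# If the machine running pip download differs from the machine that installs
# offline (different OS, CPU architecture, or Python version), pass matching
# platform flags so the downloaded wheels are installable on the target.
# A sketch only; adjust the tags to your target environment:
# pip download -r requirements-offline-llm.txt -d ./packages --only-binary=:all: --platform manylinux2014_x86_64 --python-version 3.10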

# LLM provider dependencies (version constraints match pyproject.toml)
aioboto3>=12.0.0,<16.0.0
anthropic>=0.18.0,<1.0.0
google-api-core>=2.0.0,<3.0.0
google-genai>=1.0.0,<2.0.0
llama-index>=0.9.0,<1.0.0
ollama>=0.1.0,<1.0.0
openai>=1.0.0,<3.0.0
voyageai>=0.2.0,<1.0.0
zhipuai>=2.0.0,<3.0.0
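
# Optional post-install smoke test (a sketch: it only checks that the provider
# SDKs import in the offline environment, not that credentials or endpoints
# work; module names are assumed from the package names above):
# python -c "import aioboto3, anthropic, ollama, openai, voyageai, zhipuai; from google import genai; from google.api_core import exceptions"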