chore: bump llama_index version for fastapi template #156
Conversation
Walkthrough

The recent changes introduce support for LlamaCloud indexes within the FastAPI application.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant FastAPI
    participant LlamaCloudIndex
    participant VectorStore
    User->>FastAPI: Request to get an index
    FastAPI->>FastAPI: Check USE_LLAMA_CLOUD variable
    alt LlamaCloud enabled
        FastAPI->>LlamaCloudIndex: Create LlamaCloud index using get_llama_cloud_index()
        LlamaCloudIndex-->>FastAPI: LlamaCloud index instance
    else LlamaCloud disabled
        FastAPI->>VectorStore: Create VectorStore index
        VectorStore-->>FastAPI: VectorStore index instance
    end
    FastAPI-->>User: Return index instance
```
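The selection flow in the diagram can be sketched in plain Python. The constructor stubs below are placeholders for illustration only, not the template's real implementations:

```python
import os

# Hypothetical stand-ins for the template's two index constructors.
def get_llama_cloud_index():
    return "llama-cloud-index"

def build_vector_store_index():
    return "vector-store-index"

def get_index():
    # The template branches on the USE_LLAMA_CLOUD environment variable.
    use_llama_cloud = os.getenv("USE_LLAMA_CLOUD", "false").lower() == "true"
    if use_llama_cloud:
        return get_llama_cloud_index()
    return build_vector_store_index()
```

Keeping the branch inside a single `get_index()` entry point means callers never need to know which backend is active.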
Actionable comments posted: 0
Outside diff range and nitpick comments (4)
templates/types/streaming/fastapi/app/engine/index.py (2)
11-21: Improve configuration management and error message clarity.

Consider using a configuration management library like `pydantic` or `dynaconf` for handling environment variables. This can provide better validation and default value management. Also, clarify the error message to guide the user on how to set the environment variables.

```python
# Consider using pydantic for configuration management
from pydantic import BaseSettings

class Settings(BaseSettings):
    llama_cloud_index_name: str
    llama_cloud_project_name: str
    llama_cloud_api_key: str
    llama_cloud_base_url: str = None

    class Config:
        env_prefix = 'LLAMA_CLOUD_'

settings = Settings()

# Use settings in the function
def get_llama_cloud_index():
    if not settings.llama_cloud_index_name or not settings.llama_cloud_project_name or not settings.llama_cloud_api_key:
        raise ValueError(
            "LLAMA_CLOUD_INDEX_NAME, LLAMA_CLOUD_PROJECT_NAME, and LLAMA_CLOUD_API_KEY are required environment variables."
        )
    index = LlamaCloudIndex(
        name=settings.llama_cloud_index_name,
        project_name=settings.llama_cloud_project_name,
        api_key=settings.llama_cloud_api_key,
        base_url=settings.llama_cloud_base_url,
    )
    return index
```
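For comparison, the same required-variable check can be written with only the standard library; this is a minimal sketch, and `load_llama_cloud_config` is a hypothetical helper name, not part of the template:

```python
import os

# Environment variables the LlamaCloud index requires (from the review above).
REQUIRED_VARS = (
    "LLAMA_CLOUD_INDEX_NAME",
    "LLAMA_CLOUD_PROJECT_NAME",
    "LLAMA_CLOUD_API_KEY",
)

def load_llama_cloud_config():
    # Report every missing variable at once, rather than failing one at a time.
    missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
    if missing:
        raise ValueError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
    return {name: os.environ[name] for name in REQUIRED_VARS}
```

Listing all missing variables in a single error message spares users several fix-and-retry cycles.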
32-45: Improve configuration management.

Consider using a configuration management library like `pydantic` or `dynaconf` for handling environment variables. This can provide better validation and default value management.

```python
# Use settings in the function
def get_index():
    use_llama_cloud = settings.use_llama_cloud.lower() == "true"
    if use_llama_cloud:
        logger.info("Connecting to LlamaCloud...")
        return get_llama_cloud_index()
    else:
        logger.info("Connecting vector store...")
        store = get_vector_store()
        index = VectorStoreIndex.from_vector_store(store)
        logger.info("Finished load index from vector store.")
        return index
```

templates/components/vectordbs/python/none/index.py (2)
13-23: Improve configuration management and error message clarity.

Consider using a configuration management library like `pydantic` or `dynaconf` for handling environment variables. This can provide better validation and default value management. Also, clarify the error message to guide the user on how to set the environment variables.

```python
# Consider using pydantic for configuration management
from pydantic import BaseSettings

class Settings(BaseSettings):
    llama_cloud_index_name: str
    llama_cloud_project_name: str
    llama_cloud_api_key: str
    llama_cloud_base_url: str = None

    class Config:
        env_prefix = 'LLAMA_CLOUD_'

settings = Settings()

# Use settings in the function
def get_llama_cloud_index():
    if not settings.llama_cloud_index_name or not settings.llama_cloud_project_name or not settings.llama_cloud_api_key:
        raise ValueError(
            "LLAMA_CLOUD_INDEX_NAME, LLAMA_CLOUD_PROJECT_NAME, and LLAMA_CLOUD_API_KEY are required environment variables."
        )
    index = LlamaCloudIndex(
        name=settings.llama_cloud_index_name,
        project_name=settings.llama_cloud_project_name,
        api_key=settings.llama_cloud_api_key,
        base_url=settings.llama_cloud_base_url,
    )
    return index
```
43-47: Improve configuration management.

Consider using a configuration management library like `pydantic` or `dynaconf` for handling environment variables. This can provide better validation and default value management.

```python
# Use settings in the function
def get_index():
    use_llama_cloud = settings.use_llama_cloud.lower() == "true"
    if use_llama_cloud:
        logger.info("Connecting to LlamaCloud...")
        return get_llama_cloud_index()
    else:
        storage_dir = settings.storage_dir
        # check if storage already exists
        if not os.path.exists(storage_dir):
            return None
        # load the existing index
        logger.info(f"Loading index from {storage_dir}...")
        storage_context = get_storage_context(storage_dir)
        index = load_index_from_storage(storage_context)
        logger.info(f"Finished loading index from {storage_dir}")
        return index
```
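The persisted-storage branch above follows a common load-if-exists pattern: return `None` when nothing has been persisted yet, otherwise load from disk. A generic sketch of that pattern, using a placeholder `loader` callable rather than LlamaIndex's API:

```python
import os

def load_if_exists(storage_dir, loader):
    # No persisted index yet: signal the caller with None.
    if not os.path.exists(storage_dir):
        return None
    # Otherwise delegate to the backend-specific loader.
    return loader(storage_dir)
```

Returning `None` (instead of raising) lets the caller treat "no index yet" as a normal state, e.g. to fall back to building a fresh index.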
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (3)
- templates/components/vectordbs/python/none/index.py (2 hunks)
- templates/types/streaming/fastapi/app/engine/index.py (1 hunks)
- templates/types/streaming/fastapi/pyproject.toml (1 hunks)
Files skipped from review due to trivial changes (1)
- templates/types/streaming/fastapi/pyproject.toml
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- templates/types/streaming/fastapi/pyproject.toml (1 hunks)
Files skipped from review due to trivial changes (1)
- templates/types/streaming/fastapi/pyproject.toml
Summary by CodeRabbit

New Features
- Added support for LlamaCloud indexes in the FastAPI application.

Improvements
- Bumped the `llama-index` and `llama-index-core` dependencies to versions `0.10.52` and `0.10.52.post1`, respectively, improving compatibility and performance.