fix: use generic LLMAgent instead of OpenAIAgent (adds support for Gemini and Anthropic for Agentic RAG) #410
Conversation
🦋 Changeset detected

Latest commit: 0476b1d

The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Walkthrough

The changes introduce a patch for "create-llama" that replaces the OpenAIAgent with the generic LLMAgent, adding support for Gemini and Anthropic models in the Agentic RAG template.
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (1)

.changeset/shy-bulldogs-wait.md (1)

Line 5: Consider enhancing the changeset description. While the current description is clear, it could be more informative by mentioning the benefits and impact. Consider expanding it to:

```diff
-Fix: use generic LLMAgent instead of OpenAIAgent (adds support for Gemini and Anthropic for Agentic RAG)
+Fix: use generic LLMAgent instead of OpenAIAgent (adds support for Gemini and Anthropic for Agentic RAG), enabling more flexible model choice and vendor independence
```
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

📒 Files selected for processing (2)

- .changeset/shy-bulldogs-wait.md (1 hunks)
- templates/components/engines/typescript/agent/chat.ts (2 hunks)
🧰 Additional context used

📓 Path-based instructions (1)

templates/components/engines/typescript/agent/chat.ts (1)

Pattern `templates/**`: For files under the templates folder, do not report 'Missing Dependencies Detected' errors.
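For context, path-based instructions like the one above are declared in the repository's `.coderabbit.yaml`. A minimal sketch, assuming the standard `reviews.path_instructions` schema (the exact file contents for this repo are not shown in the review):

```yaml
# Hypothetical .coderabbit.yaml fragment matching the pattern above
reviews:
  path_instructions:
    - path: "templates/**"
      instructions: >
        For files under the templates folder, do not report
        'Missing Dependencies Detected' errors.
```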
🔇 Additional comments (3)

.changeset/shy-bulldogs-wait.md (1)

Lines 1-3: LGTM! Appropriate version bump choice. The "patch" bump is correct for this backward-compatible enhancement that adds support for additional LLM providers.
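For reference, a changeset file is a small markdown file with YAML frontmatter naming the package and bump type. Based on the changeset template linked in this PR, the file presumably looks like:

```md
---
"create-llama": patch
---

Fix: use generic LLMAgent instead of OpenAIAgent (adds support for Gemini and Anthropic for Agentic RAG)
```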
templates/components/engines/typescript/agent/chat.ts (2)

Line 4: LGTM: Import change aligns with generic LLM support. The import change from OpenAIAgent to LLMAgent is appropriate for enabling support for multiple LLM providers.
Line range hint 45-49: Consider improving type safety and environment variable handling.

The current implementation has a few areas that could be enhanced:

- The type casting through `unknown` suggests a type compatibility issue between `LLMAgent` and `BaseChatEngine`.
- The `systemPrompt` environment variable is used without validation.

Consider applying these improvements:

```diff
+interface AgentConfig {
+  tools: BaseToolWithCall[];
+  systemPrompt?: string;
+}

-const agent = new LLMAgent({
-  tools,
-  systemPrompt: process.env.SYSTEM_PROMPT,
-}) as unknown as BaseChatEngine;
+const systemPrompt = process.env.SYSTEM_PROMPT || 'Default system prompt here';
+const config: AgentConfig = {
+  tools,
+  systemPrompt,
+};
+const agent = new LLMAgent(config) as BaseChatEngine;
```

Let's verify the type compatibility between LLMAgent and BaseChatEngine:
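The validation part of the suggestion above can be sketched as a small self-contained snippet. This is only a model of the idea, not the template's actual code: `AgentConfig` and the default prompt string are made-up placeholders, and `LLMAgent`/`BaseToolWithCall` from llamaindex are stood in for by plain shapes.

```typescript
// Sketch of the reviewer's suggestion: validate the SYSTEM_PROMPT
// environment variable instead of passing it through unchecked.

interface AgentConfig {
  tools: unknown[]; // stands in for BaseToolWithCall[] in the real template
  systemPrompt: string; // required once a validated default is guaranteed
}

// Return the env value when it is set and non-blank, else a fallback.
// (The fallback text here is a hypothetical placeholder.)
function resolveSystemPrompt(env: Record<string, string | undefined>): string {
  const prompt = env.SYSTEM_PROMPT;
  return prompt && prompt.trim() !== "" ? prompt : "You are a helpful assistant.";
}

// Example: building a validated config from an environment record
// (in the real template this would be process.env).
const config: AgentConfig = {
  tools: [],
  systemPrompt: resolveSystemPrompt({ SYSTEM_PROMPT: "Answer concisely." }),
};
```

With a guaranteed non-empty `systemPrompt`, the double cast through `unknown` becomes the only remaining type question, which the review's verification step addresses separately.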
Summary by CodeRabbit

New Features

- Added support for multiple LLM providers via the generic `LLMAgent`.

Bug Fixes

- Replaced `OpenAIAgent` with `LLMAgent` while maintaining existing functionality.