From 5211db3052e46c6381296485dec11bc7b7662c89 Mon Sep 17 00:00:00 2001
From: Danny <88821366+ddaltn@users.noreply.github.com>
Date: Fri, 11 Apr 2025 08:31:04 +0100
Subject: [PATCH] docs: clarify LLM API configuration in README

---
 examples/clients/simple-chatbot/README.MD | 1 +
 1 file changed, 1 insertion(+)

diff --git a/examples/clients/simple-chatbot/README.MD b/examples/clients/simple-chatbot/README.MD
index 683e4f3f..22996d96 100644
--- a/examples/clients/simple-chatbot/README.MD
+++ b/examples/clients/simple-chatbot/README.MD
@@ -25,6 +25,7 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
 
    ```plaintext
    LLM_API_KEY=your_api_key_here
    ```
+   **Note:** The current implementation is configured to use the Groq API endpoint (`https://api.groq.com/openai/v1/chat/completions`) with the `llama-3.2-90b-vision-preview` model. If you plan to use a different LLM provider, you'll need to modify the `LLMClient` class in `main.py` to use the appropriate endpoint URL and model parameters.
 
 3. **Configure servers:**
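
For readers applying the note above, here is a minimal sketch of what swapping the endpoint and model in `LLMClient` might look like. It assumes an OpenAI-compatible chat-completions API and a `requests`-based client; the constructor parameters and the `get_response` method name are illustrative and not taken from the actual `main.py`. Only the Groq URL and model name come from the README note itself.

```python
import os

import requests


class LLMClient:
    """Minimal chat-completion client with a configurable endpoint and model.

    Hypothetical sketch: the real LLMClient in main.py may be structured
    differently; only the default URL and model name come from the README.
    """

    def __init__(
        self,
        api_key: str,
        # Swap these two defaults to target a different OpenAI-compatible provider.
        endpoint: str = "https://api.groq.com/openai/v1/chat/completions",
        model: str = "llama-3.2-90b-vision-preview",
    ) -> None:
        self.api_key = api_key
        self.endpoint = endpoint
        self.model = model

    def get_response(self, messages: list[dict[str, str]]) -> str:
        """Send the conversation to the endpoint and return the reply text."""
        response = requests.post(
            self.endpoint,
            headers={
                "Authorization": f"Bearer {self.api_key}",
                "Content-Type": "application/json",
            },
            json={"model": self.model, "messages": messages},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Reads the key from the .env-sourced environment variable named in the README.
    client = LLMClient(api_key=os.environ["LLM_API_KEY"])
    print(client.get_response([{"role": "user", "content": "Hello!"}]))
```

Because most hosted providers expose an OpenAI-compatible `/chat/completions` route, changing providers usually reduces to changing the `endpoint` and `model` arguments, plus whatever authentication header the new provider requires.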