| mcp_server_name | runtime | capabilities | dependencies |
| --- | --- | --- | --- |
| ragdocs | nodejs | | |
This guide provides step-by-step instructions for AI agents to install and configure the mcp-ragdocs server.
Prerequisites (a combined check is sketched after this list):

- Node.js: `node -v`
  - Expected Output: Version 16 or higher (e.g., `v16.0.0`)
  - Validation: Output version number must be >= 16.0.0
- Docker: `docker --version`
  - Expected Output: Docker version information (e.g., `Docker version 24.0.0`)
  - Validation: Command should return a version without error
- Ollama: `ollama --version`
  - Expected Output: Ollama version information
  - Validation: Command should return a version without error
- npm: `npm -v`
  - Expected Output: npm version information
  - Validation: Command should return a version without error
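The four prerequisite checks above can also be run in one pass. The sketch below is not part of the package; it assumes a POSIX-style shell (for example Git Bash on Windows) and that all four tools are already on PATH.

```bash
#!/usr/bin/env bash
# Combined prerequisite check (assumption: POSIX-style shell, tools on PATH).
set -e

# Node.js: strip the leading "v" and compare the major version against 16.
node_major=$(node -v | sed 's/^v//' | cut -d. -f1)
if [ "$node_major" -ge 16 ]; then
  echo "Node.js OK ($(node -v))"
else
  echo "Node.js $(node -v) is too old; version 16 or higher is required" >&2
  exit 1
fi

# The remaining tools only need to answer a version query without error.
docker --version
ollama --version
npm -v
```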
Installation Steps (a post-install smoke test is sketched after this list):

- Install Package: `npm install -g @qpd-v/mcp-server-ragdocs`
  - Expected Output: Success message indicating the package was added
  - Validation: No error messages; package added to global modules
- Verify Global Installation: `npm list -g @qpd-v/mcp-server-ragdocs`
  - Expected Output: Shows `@qpd-v/mcp-server-ragdocs@` followed by the installed version
  - Validation: Package is listed in global modules
- Start Qdrant: `docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant`
  - Expected Output: Qdrant startup messages, including "Access web UI at http://localhost:6333/dashboard"
  - Validation: Server starts without errors; ports 6333 and 6334 are bound
- Verify Qdrant: `curl http://localhost:6333/collections`
  - Expected Output: HTTP 200 status code with JSON response `{"result":{"collections":[]}}`
  - Validation: Response is valid JSON and includes a "collections" key
- Install Ollama Model: `ollama pull nomic-embed-text`
  - Expected Output: Download progress followed by a completion message
  - Validation: Model is downloaded without errors
- Verify Ollama Model: `ollama list | grep nomic-embed-text`
  - Expected Output: Line containing "nomic-embed-text" with size and modification date
  - Validation: Model is listed in available models
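Once the steps above have been run, a short smoke test can confirm the install in one go. This is a sketch rather than part of the official package; it assumes the Qdrant container and the Ollama service are already running locally on their default ports.

```bash
#!/usr/bin/env bash
# Post-install smoke test: global package present, Qdrant reachable, model pulled.
set -e

# The package should be listed among global modules.
npm list -g @qpd-v/mcp-server-ragdocs

# Qdrant should answer on port 6333 with a JSON body containing a "collections" key.
curl -sf http://localhost:6333/collections | grep -q '"collections"' && echo "Qdrant OK"

# The embedding model must be visible to Ollama.
ollama list | grep -q nomic-embed-text && echo "nomic-embed-text OK"
```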
Select the appropriate configuration file path based on your environment (a quick existence check is sketched after the list):

- Cline: `%AppData%\Roaming\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json`
  - Validation: File should exist and be writable
- Roo-Code: `%AppData%\Roaming\Code\User\globalStorage\rooveterinaryinc.roo-cline\settings\cline_mcp_settings.json`
  - Validation: File should exist and be writable
- Claude Desktop: `%AppData%\Claude\claude_desktop_config.json`
  - Validation: File should exist and be writable
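To confirm the chosen file exists and is writable before editing it, a one-off test such as the following can be used. It is a sketch only: `$APPDATA` is assumed to resolve in a Git Bash session, and the Claude Desktop path is shown as an example; substitute the Cline or Roo-Code path as needed.

```bash
# Existence/writability check for the selected config file (example: Claude Desktop).
CONFIG_FILE="$APPDATA/Claude/claude_desktop_config.json"
if [ -w "$CONFIG_FILE" ]; then
  echo "Config file exists and is writable: $CONFIG_FILE"
else
  echo "Config file missing or not writable: $CONFIG_FILE" >&2
fi
```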
Add this configuration to the selected file(s):

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
Required Actions:

- Replace `YOUR_USERNAME` with your actual Windows username
- Ensure `disabled` is set to `false`
- Ensure `autoApprove` is set to `[]`

Validation: JSON should be valid and parseable
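One way to satisfy that validation is to parse the edited file with Node's JSON parser, which fails loudly on any syntax error. The snippet below is a sketch; the path is the Claude Desktop example and should be swapped for whichever file was edited.

```bash
# Validate the edited configuration: it must parse as JSON and mention "ragdocs".
CONFIG_FILE="$APPDATA/Claude/claude_desktop_config.json"
node -e 'JSON.parse(require("fs").readFileSync(process.argv[1], "utf8"))' "$CONFIG_FILE" \
  && grep -q '"ragdocs"' "$CONFIG_FILE" \
  && echo "Configuration parses and contains a ragdocs entry"
```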
Verification Steps (an optional embedding check is sketched after this list):

- Verify Qdrant API: `curl http://localhost:6333/collections`
  - Expected Output: HTTP 200 status code with JSON response `{"result":{"collections":[]}}`
  - Validation: Response is valid JSON and the service is responsive
- Verify Ollama Service: `ollama list`
  - Expected Output: List including a "nomic-embed-text" entry
  - Validation: Model is available and the service is running
- Test Documentation Import: prompt the MCP client with `Add this documentation: https://docs.qdrant.tech/`
  - Expected Output: Success message indicating the documentation was added
  - Validation: Documentation appears in the sources list
- Test Search Functionality: prompt the MCP client with `Search the documentation for "what is Qdrant?"`
  - Expected Output: Relevant search results from the added documentation
  - Validation: Results contain information about Qdrant
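As an optional extra check of the embedding path the server relies on, Ollama can be asked directly for an embedding with the pulled model. This assumes Ollama's standard embeddings endpoint on its default port 11434; a JSON response containing an "embedding" array indicates the provider is working.

```bash
# Request a test embedding from Ollama (assumes the default port and the
# /api/embeddings endpoint; prints only the start of the response).
curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello world"}' | head -c 200
echo
```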
Troubleshooting (a one-shot diagnostic script is sketched after this list):

- Verify Node.js Environment: `node -v && npm -v`
  - Expected Output: Two version numbers (Node.js >= 16.0.0)
  - Validation: Both commands return version numbers without errors
  - Resolution if Failed: Reinstall Node.js from nodejs.org
- Check Docker Container: `docker ps | grep qdrant`
  - Expected Output: Line containing "qdrant/qdrant" with port mappings
  - Validation: Container is running and ports are mapped correctly
  - Resolution if Failed: Run `docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant`
- Verify Ollama Service: `ollama list`
  - Expected Output: List of models including nomic-embed-text
  - Validation: Service is running and the model is available
  - Resolution if Failed: Run `ollama pull nomic-embed-text`
- Check Configuration Files: `cat "<CONFIG_FILE_PATH>"` (repeat for each config file path)
  - Expected Output: Valid JSON with the ragdocs configuration
  - Validation: JSON is valid and contains the required fields
  - Resolution if Failed: Copy the configuration template from above
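The diagnostics above can be wrapped into a single script that reports pass/fail for each check instead of stopping at the first failure. This is a sketch under the same assumptions as earlier (POSIX-style shell, example config path).

```bash
#!/usr/bin/env bash
# One-shot diagnostic: report pass/fail for each check without aborting early.
CONFIG_FILE="$APPDATA/Claude/claude_desktop_config.json"  # example path

check() {
  local label="$1"; shift
  if "$@" > /dev/null 2>&1; then
    echo "PASS  $label"
  else
    echo "FAIL  $label"
  fi
}

check "Node.js and npm respond"      bash -c 'node -v && npm -v'
check "Qdrant container is running"  bash -c 'docker ps | grep -q qdrant/qdrant'
check "Ollama model is available"    bash -c 'ollama list | grep -q nomic-embed-text'
check "Config file is valid JSON"    node -e 'JSON.parse(require("fs").readFileSync(process.argv[1],"utf8"))' "$CONFIG_FILE"
```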
Common Issues:

- Qdrant Connection Error: `docker restart $(docker ps -q --filter ancestor=qdrant/qdrant)`
  - Expected Output: Container ID
  - Validation: Qdrant service becomes available at http://localhost:6333
- Ollama Model Missing: `ollama pull nomic-embed-text`
  - Expected Output: Download progress and completion message
  - Validation: Model appears in `ollama list` output
- npm Global Install Error: `npm cache clean --force && npm install -g @qpd-v/mcp-server-ragdocs`
  - Expected Output: Success message from npm install
  - Validation: Package appears in `npm list -g`
- Path Resolution Issues (see the sketch after this list):
  - Windows Username: Replace `YOUR_USERNAME` with the output of `echo %USERNAME%`
  - Path Separators: Use `/` for npm paths in the configuration
  - Verify Paths Exist: Check that each directory in the path exists
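For the path resolution issues, the sketch below resolves the Windows username and checks whether the globally installed entry point exists at the forward-slash path used in the configuration template. The path mirrors the template above and is an assumption about your npm layout; if the file is missing, `npm prefix -g` reveals where npm actually installs global packages.

```bash
# Resolve the username and verify the server entry point used in "args" exists.
ENTRY="C:/Users/$USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"
echo "Windows username: $USERNAME"
if [ -f "$ENTRY" ]; then
  echo "Found server entry point: $ENTRY"
else
  echo "Not found: $ENTRY"
  echo "Global npm prefix: $(npm prefix -g)"
fi
```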