* Update server and packages
* Add client over stdio with chat capability
* Add stdio chat client
* Update README.md
* Update README.md to add information about SSE transport
* Add SSE client
* Refactor SSE client
* Add SSE client
* Fix Claude example in README.md
To test the server locally, you can use `example_client`. Alternatively, you can send a message to the server directly by making a POST request with the `sessionId` and your query:

```bash
curl -X POST "http://localhost:3001/message?session_id=181c7a3d-01a9-498e-8e16-5d5878832cd7" -H "Content-Type: application/json" -d '{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "arguments": { "query": "recent news about LLMs" },
    "name": "search"
  }
}'
```
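The same request can be issued from Node. The sketch below is illustrative only: the session id is a placeholder (use the one assigned to your SSE connection), and the URL assumes the local server from this README. It requires Node 18+ for the built-in `fetch`.

```javascript
// Placeholder session id; substitute the one your SSE connection was assigned.
const sessionId = "181c7a3d-01a9-498e-8e16-5d5878832cd7";

// The same JSON-RPC 2.0 "tools/call" request as the curl example above.
const payload = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search",
    arguments: { query: "recent news about LLMs" },
  },
};

// POST the request. Note the HTTP response body is just "Accepted";
// the actual JSON-RPC result arrives later on the SSE stream.
async function sendMessage() {
  const res = await fetch(
    `http://localhost:3001/message?session_id=${sessionId}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(payload),
    }
  );
  return res.text();
}
```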
#### Step 4: Receive the Response
For the POST request, the server will respond with:
```text
Accepted
```
The server will then invoke the `search` tool using the provided query and stream the response back to the client via SSE:
```text
event: message
data: {"result":{"content":[{"type":"text","text":"[{\"searchResult\":{\"title\":\"Language models recent news\",\"description\":\"Amazon Launches New Generation of LLM Foundation Model...\"}}
```
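On the client side, these `event:`/`data:` frames can be consumed with a small line parser. The sketch below is a simplified illustration (a real client would read chunks from the open SSE connection and buffer partial frames); a shortened literal frame stands in for the stream.

```javascript
// Minimal parser for a single SSE frame. Simplified: no multi-line data
// buffering, and no handling of "id:" or "retry:" fields.
function parseSseFrame(frame) {
  const out = { event: "message", data: "" };
  for (const line of frame.split("\n")) {
    if (line.startsWith("event:")) out.event = line.slice("event:".length).trim();
    else if (line.startsWith("data:")) out.data += line.slice("data:".length).trim();
  }
  return out;
}

// Shortened stand-in for the frame shown above.
const frame =
  'event: message\n' +
  'data: {"result":{"content":[{"type":"text","text":"..."}]}}';

const { event, data } = parseSseFrame(frame);
const response = JSON.parse(data); // the JSON-RPC response carried by the frame
```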