README.md
On the first run for a specific crate version *and feature set*, the server will:
1. Download the crate documentation using `cargo doc` (with specified features).
2. Parse the HTML documentation.
3. Generate embeddings for the documentation content using the OpenAI API (this may take some time and incur costs, though typically only fractions of a US penny for most crates; even a large crate like `async-stripe` with over 5000 documentation pages cost only $0.18 USD for embedding generation during testing).
4. Cache the documentation content and embeddings so that the cost isn't incurred again.
5. Start the MCP server.
Subsequent runs for the same crate version *and feature set* will load the data from the cache, making startup much faster.
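The caching behavior described above could be keyed on the crate name, exact version, and sorted feature set, so that changing any of them triggers a fresh `cargo doc` and embedding run. The sketch below is only an illustration of that idea — `cache_key` and its signature are hypothetical, not the server's actual API:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Hypothetical cache key: crate name + version + sorted feature set.
/// Sorting makes the key independent of the order features are listed in.
fn cache_key(krate: &str, version: &str, features: &[&str]) -> u64 {
    let mut sorted: Vec<&str> = features.to_vec();
    sorted.sort_unstable(); // feature order must not change the key
    let mut h = DefaultHasher::new();
    krate.hash(&mut h);
    version.hash(&mut h);
    sorted.hash(&mut h);
    h.finish()
}

fn main() {
    let a = cache_key("serde", "1.0.200", &["derive", "rc"]);
    let b = cache_key("serde", "1.0.200", &["rc", "derive"]);
    let c = cache_key("serde", "1.0.200", &[]);
    assert_eq!(a, b); // same crate/version/features -> cache hit
    assert_ne!(a, c); // different feature set -> separate cache entry
    println!("cache keys computed");
}
```

Under this scheme, asking for the same crate version with a different feature set produces a distinct cache entry, which matches the "version *and feature set*" wording above.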