
Commit f88e027

colin-sentry, antonpirker, and vivianyentran authored
Add OpenAI docs (#9376)
* Add OpenAI docs
* Change semantics of "send or do not send prompts"
* Added OpenAI integration to the list of integrations
* openai is auto enabled
* Update docs/platforms/python/integrations/openai/index.mdx
  Co-authored-by: vivianyentran <[email protected]>
* Update docs/platforms/python/integrations/openai/index.mdx
  Co-authored-by: vivianyentran <[email protected]>
* Update index.mdx
---------
Co-authored-by: Anton Pirker <[email protected]>
Co-authored-by: vivianyentran <[email protected]>
1 parent 68cc263 commit f88e027

File tree

2 files changed: +105 -0 lines changed

docs/platforms/python/integrations/index.mdx

+6
@@ -33,6 +33,12 @@ The Sentry SDK uses integrations to hook into the functionality of popular libra
| <LinkWithPlatformIcon platform="python.redis" label="Redis" url="/platforms/python/integrations/redis" /> ||
| <LinkWithPlatformIcon platform="python.sqlalchemy" label="SQLAlchemy" url="/platforms/python/integrations/sqlalchemy" /> ||

## AI

| | **Auto enabled** |
| ------------------------------------------------------------------------------------------------------------------ | :--------------: |
| <LinkWithPlatformIcon platform="openai" label="OpenAI" url="/platforms/python/integrations/openai" /> | ✓ |

## Data Processing

| | **Auto enabled** |
docs/platforms/python/integrations/openai/index.mdx

+99
@@ -0,0 +1,99 @@
---
title: OpenAI
description: "Learn about using Sentry for OpenAI."
---

This integration connects Sentry with the [OpenAI Python SDK](https://github.com/openai/openai-python).
The integration has been confirmed to work with OpenAI 1.13.3.

## Install

Install `sentry-sdk` from PyPI with the `openai` extra:

```bash
pip install --upgrade 'sentry-sdk[openai]'
```

## Configure

If you have the `openai` package in your dependencies, the OpenAI integration will be enabled automatically when you initialize the Sentry SDK.

An additional dependency, `tiktoken`, is required if you want to calculate token usage for streaming chat responses (see the streaming sketch after the configuration example below).

<SignInNote />

```python
from openai import OpenAI

import sentry_sdk

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    enable_tracing=True,
    traces_sample_rate=1.0,
)

client = OpenAI()
```
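
If you have installed `tiktoken` for streaming token usage, the following is a minimal sketch (not part of the original page) of a streaming chat completion that reuses the `client` from the example above; the model name and prompt are placeholder values.

```python
# Minimal streaming sketch: assumes `tiktoken` is installed so token usage
# can be calculated for the streamed response; "gpt-3.5-turbo" and the
# prompt text are placeholders.
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello"}],
    stream=True,
)

# Consume the stream chunk by chunk; each chunk carries a partial content delta.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="")
```
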
## Verify

Verify that the integration works by inducing an error. The error and performance transaction should appear in your Sentry project.

```python
from openai import OpenAI
import sentry_sdk

sentry_sdk.init(...)  # same as above

client = OpenAI(api_key="a bad API key")
with sentry_sdk.start_transaction(op="ai-inference", name="The result of the AI inference"):
    response = (
        client.chat.completions.create(
            model="some-model", messages=[{"role": "system", "content": "hello"}]
        )
        .choices[0]
        .message.content
    )
    print(response)
```

After running this script, a transaction will be created in the Performance section of [sentry.io](https://sentry.io). Additionally, an error event (about the bad API key) will be sent to [sentry.io](https://sentry.io) and will be connected to the transaction.

It takes a couple of moments for the data to appear in [sentry.io](https://sentry.io).

## Behavior

- The OpenAI integration will connect Sentry with all supported OpenAI methods automatically.

- All exceptions that lead to an `OpenAIException` are reported.

- The supported methods are currently `chat.completions.create` and `embeddings.create` (see the embeddings sketch after this list).

- By default, the SDK does not send LLM and tokenizer inputs/outputs to Sentry, since they may contain personally identifiable information (PII). To include them, set `send_default_pii=True` in the `sentry_sdk.init()` call. To exclude prompts even when `send_default_pii=True` is set, configure the integration with `include_prompts=False` as shown in the Options section below.
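
As a complement to the Verify example above, here is a minimal sketch (not part of the original page) of an `embeddings.create` call wrapped in a transaction so that the embedding request is monitored as well; the model name and input text are placeholders.

```python
# Minimal embeddings sketch: "text-embedding-ada-002" and the input string are
# placeholders; the transaction pattern mirrors the Verify example above.
with sentry_sdk.start_transaction(op="ai-inference", name="Create an embedding"):
    embedding = client.embeddings.create(
        model="text-embedding-ada-002",
        input="The quick brown fox jumps over the lazy dog",
    )
    # Each item in `embedding.data` holds one embedding vector.
    print(len(embedding.data[0].embedding))
```
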
## Options

By explicitly adding `OpenAIIntegration` to your `sentry_sdk.init()` call, you can set options to change its behavior:

```python
import sentry_sdk
from sentry_sdk.integrations.openai import OpenAIIntegration

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    enable_tracing=True,
    send_default_pii=True,
    traces_sample_rate=1.0,
    integrations=[
        OpenAIIntegration(
            include_prompts=False,  # LLM/tokenizer inputs/outputs will not be sent to Sentry, despite send_default_pii=True
        ),
    ],
)
```

## Supported Versions

- OpenAI: 1.0+
- tiktoken: 0.6.0+
- Python: 3.9+
