Commit 142b86c
lmolkova, lzchen, and xrmx authored

OpenAI instrumentation docs fixes (#2988)

* Add openai docs config and improve readme
* up
* Add manual sample, add no-content tests
* update headers
* lint
* use grpc endpoint in openai samples, add extra env vars to readme
* move distro fix to another PR
* nits
* Ignore examples for pylint
* Update .pylintrc
* ignroe lint for example
* Fix README docs
* Update openai.rst
* Update conf.py
* Update docs-requirements.txt
* docs

---------

Co-authored-by: Leighton Chen <[email protected]>
Co-authored-by: Riccardo Magliocchetti <[email protected]>
1 parent 8656a06 commit 142b86c

21 files changed, +555 −14 lines

docs-requirements.txt (+2, -1)

@@ -33,9 +33,11 @@ elasticsearch>=6.0,<9.0
 flask~=2.0
 falcon~=2.0
 grpcio~=1.27
+httpx>=0.18.0
 kafka-python>=2.0,<3.0
 mysql-connector-python~=8.0
 mysqlclient~=2.1.1
+openai >= 1.26.0
 psutil>=5
 psycopg~=3.1.17
 pika>=0.12.0
@@ -47,7 +49,6 @@ remoulade>=0.50
 sqlalchemy>=1.0
 tornado>=5.1.1
 tortoise-orm>=0.17.0
-httpx>=0.18.0

 # indirect dependency pins
 markupsafe==2.0.1

docs/conf.py (+15, -1)

@@ -40,6 +40,13 @@
     if isdir(join(instr, f))
 ]

+instr_genai = "../instrumentation-genai"
+instr_genai_dirs = [
+    os.path.abspath("/".join(["../instrumentation-genai", f, "src"]))
+    for f in listdir(instr_genai)
+    if isdir(join(instr_genai, f))
+]
+
 prop = "../propagator"
 prop_dirs = [
     os.path.abspath("/".join([prop, f, "src"]))
@@ -60,7 +67,14 @@
     for f in listdir(resource)
     if isdir(join(resource, f))
 ]
-sys.path[:0] = exp_dirs + instr_dirs + sdk_ext_dirs + prop_dirs + resource_dirs
+sys.path[:0] = (
+    exp_dirs
+    + instr_dirs
+    + instr_genai_dirs
+    + sdk_ext_dirs
+    + prop_dirs
+    + resource_dirs
+)

 # -- Project information -----------------------------------------------------

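The conf.py change repeats a small pattern the file already uses for every other package family: scan a directory for packages and prepend each package's `src/` to `sys.path` so Sphinx autodoc can import the code. A minimal standalone sketch of that pattern, assuming a throwaway directory tree (the `src_dirs` helper name is ours, not from the commit; the sketch sorts entries for determinism, which conf.py does not):

```python
import os
import tempfile
from os import listdir
from os.path import isdir, join


def src_dirs(root):
    # Mirror the docs/conf.py pattern: collect <root>/<pkg>/src as an
    # absolute path for every package directory, skipping plain files.
    return [
        os.path.abspath(join(root, f, "src"))
        for f in sorted(listdir(root))
        if isdir(join(root, f))
    ]


# Demo against a temporary tree shaped like instrumentation-genai/
root = tempfile.mkdtemp()
os.makedirs(join(root, "opentelemetry-instrumentation-openai-v2", "src"))
open(join(root, "README.md"), "w").close()  # non-directory entries are ignored

paths = src_dirs(root)
print(paths)  # one src/ path per package directory
```

The same helper, pointed at `../instrumentation-genai`, would yield exactly the `instr_genai_dirs` list the diff adds.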
docs/index.rst (+10, -1)

@@ -24,7 +24,7 @@ installed separately via pip:
     pip install opentelemetry-instrumentation-{instrumentation}
     pip install opentelemetry-sdk-extension-{sdk-extension}

-A complete list of packages can be found at the 
+A complete list of packages can be found at the
 `Contrib repo instrumentation <https://github.com/open-telemetry/opentelemetry-python-contrib/tree/main/instrumentation>`_
 and `Contrib repo exporter <https://github.com/open-telemetry/opentelemetry-python-contrib/tree/main/exporter>`_ directories.

@@ -50,6 +50,7 @@ install <https://pip.pypa.io/en/stable/reference/pip_install/#editable-installs>
     cd opentelemetry-python-contrib
     pip install -e ./instrumentation/opentelemetry-instrumentation-flask
     pip install -e ./instrumentation/opentelemetry-instrumentation-botocore
+    pip install -e ./instrumentation-genai/opentelemetry-instrumentation-openai-v2
     pip install -e ./sdk-extension/opentelemetry-sdk-extension-aws
     pip install -e ./resource/opentelemetry-resource-detector-container

@@ -62,6 +63,14 @@ install <https://pip.pypa.io/en/stable/reference/pip_install/#editable-installs>

     instrumentation/**

+.. toctree::
+    :maxdepth: 2
+    :caption: OpenTelemetry Generative AI Instrumentations
+    :name: Generative AI Instrumentations
+    :glob:
+
+    instrumentation-genai/**
+
 .. toctree::
     :maxdepth: 2
     :caption: OpenTelemetry Propagators
docs/instrumentation-genai/openai.rst (+7, new file)

@@ -0,0 +1,7 @@
+OpenTelemetry Python - OpenAI Instrumentation
+=============================================
+
+.. automodule:: opentelemetry.instrumentation.openai_v2
+    :members:
+    :undoc-members:
+    :show-inheritance:

docs/nitpick-exceptions.ini (+1)

@@ -24,6 +24,7 @@ py-class=
 httpx.Client
 httpx.AsyncClient
 httpx.BaseTransport
+openai.BaseTransport
 httpx.AsyncBaseTransport
 httpx.SyncByteStream
 httpx.AsyncByteStream

instrumentation-genai/opentelemetry-instrumentation-openai-v2/README.rst (+54, -2)

@@ -19,8 +19,60 @@ package to your requirements.

     pip install opentelemetry-instrumentation-openai-v2

-If you don't have an OpenAI application, yet, try our `example <example>`_
-which only needs a valid OpenAI API key.
+If you don't have an OpenAI application, yet, try our `examples <examples>`_
+which only need a valid OpenAI API key.
+
+Check out `zero-code example <examples/zero-code>`_ for a quick start.
+
+Usage
+-----
+
+This section describes how to set up OpenAI instrumentation if you're setting OpenTelemetry up manually.
+Check out the `manual example <examples/manual>`_ for more details.
+
+Instrumenting all clients
+*************************
+
+When using the instrumentor, all clients will automatically trace OpenAI chat completion operations.
+You can also optionally capture prompts and completions as log events.
+
+Make sure to configure OpenTelemetry tracing, logging, and events to capture all telemetry emitted by the instrumentation.
+
+.. code-block:: python
+
+    from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
+
+    OpenAIInstrumentor().instrument()
+
+    client = OpenAI()
+    response = client.chat.completions.create(
+        model="gpt-4o-mini",
+        messages=[
+            {"role": "user", "content": "Write a short poem on open telemetry."},
+        ],
+    )
+
+Enabling message content
+*************************
+
+Message content such as the contents of the prompt, completion, function arguments and return values
+are not captured by default. To capture message content as log events, set the environment variable
+`OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` to `true`.
+
+Uninstrument
+************
+
+To uninstrument clients, call the uninstrument method:
+
+.. code-block:: python
+
+    from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
+
+    OpenAIInstrumentor().instrument()
+    # ...
+
+    # Uninstrument all clients
+    OpenAIInstrumentor().uninstrument()

 References
 ----------
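The README above gates prompt/completion capture on the `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` variable. As an illustration of how such an explicit opt-in is typically read, here is a hypothetical sketch (the `capture_content_enabled` helper is ours, not the instrumentation's actual internal code): content stays off unless the variable is exactly a case-insensitive "true".

```python
import os


def capture_content_enabled() -> bool:
    # Content capture is off by default; anything other than a
    # case-insensitive "true" keeps it disabled.
    value = os.environ.get(
        "OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT", ""
    )
    return value.strip().lower() == "true"


os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"
print(capture_content_enabled())  # True

os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "0"
print(capture_content_enabled())  # False
```

This default-off behavior is why the example `.env` files in this commit set the variable to `true` explicitly.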
instrumentation-genai/opentelemetry-instrumentation-openai-v2/examples/manual/.env (+16, new file)

@@ -0,0 +1,16 @@
+# Update this with your real OpenAI API key
+OPENAI_API_KEY=sk-YOUR_API_KEY
+
+# Uncomment to use Ollama instead of OpenAI
+# OPENAI_BASE_URL=http://localhost:11434/v1
+# OPENAI_API_KEY=unused
+# CHAT_MODEL=qwen2.5:0.5b
+
+# Uncomment and change to your OTLP endpoint
+# OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
+# OTEL_EXPORTER_OTLP_PROTOCOL=grpc
+
+OTEL_SERVICE_NAME=opentelemetry-python-openai
+
+# Change to 'false' to hide prompt and completion content
+OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
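The `.env` file above uses the plain `KEY=VALUE` format that python-dotenv reads (the examples run via `dotenv run --`). For illustration only, a tiny stdlib-only parser for the subset this file uses — comments, blank lines, and `KEY=VALUE` pairs — as a sketch, not a replacement for python-dotenv:

```python
def parse_dotenv(text: str) -> dict:
    # Handles the subset used in the .env above: blank lines,
    # '#' comments, and KEY=VALUE pairs (split on the first '=').
    env = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip comments and blanks
        key, sep, value = line.partition("=")
        if sep:
            env[key.strip()] = value.strip()
    return env


sample = """
# Update this with your real OpenAI API key
OPENAI_API_KEY=sk-YOUR_API_KEY
OTEL_SERVICE_NAME=opentelemetry-python-openai
OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
"""
print(parse_dotenv(sample)["OTEL_SERVICE_NAME"])  # opentelemetry-python-openai
```

Splitting on the first `=` is what keeps values like `OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317` intact.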

instrumentation-genai/opentelemetry-instrumentation-openai-v2/example/README.rst renamed to instrumentation-genai/opentelemetry-instrumentation-openai-v2/examples/manual/README.rst (+9, -4)

@@ -1,21 +1,26 @@
 OpenTelemetry OpenAI Instrumentation Example
 ============================================

-This is an example of how to instrument OpenAI calls with zero code changes,
-using `opentelemetry-instrument`.
+This is an example of how to instrument OpenAI calls when configuring OpenTelemetry SDK and Instrumentations manually.

 When `main.py <main.py>`_ is run, it exports traces and logs to an OTLP
 compatible endpoint. Traces include details such as the model used and the
 duration of the chat request. Logs capture the chat request and the generated
 response, providing a comprehensive view of the performance and behavior of
 your OpenAI requests.

+Note: `.env <.env>`_ file configures additional environment variables:
+
+- `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true` configures
+  OpenAI instrumentation to capture prompt and completion contents on
+  events.
+
 Setup
 -----

 Minimally, update the `.env <.env>`_ file with your "OPENAI_API_KEY". An
 OTLP compatible endpoint should be listening for traces and logs on
-http://localhost:4318. If not, update "OTEL_EXPORTER_OTLP_ENDPOINT" as well.
+http://localhost:4317. If not, update "OTEL_EXPORTER_OTLP_ENDPOINT" as well.

 Next, set up a virtual environment like this:

@@ -33,7 +38,7 @@ Run the example like this:

 ::

-    dotenv run -- opentelemetry-instrument python main.py
+    dotenv run -- python main.py

 You should see a poem generated by OpenAI while traces and logs export to your
 configured observability tool.
instrumentation-genai/opentelemetry-instrumentation-openai-v2/examples/manual/main.py (+53, new file)

@@ -0,0 +1,53 @@
+# pylint: skip-file
+import os
+
+from openai import OpenAI
+
+# NOTE: OpenTelemetry Python Logs and Events APIs are in beta
+from opentelemetry import _events, _logs, trace
+from opentelemetry.exporter.otlp.proto.grpc._log_exporter import (
+    OTLPLogExporter,
+)
+from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
+    OTLPSpanExporter,
+)
+from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
+from opentelemetry.sdk._events import EventLoggerProvider
+from opentelemetry.sdk._logs import LoggerProvider
+from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+
+# configure tracing
+trace.set_tracer_provider(TracerProvider())
+trace.get_tracer_provider().add_span_processor(
+    BatchSpanProcessor(OTLPSpanExporter())
+)
+
+# configure logging and events
+_logs.set_logger_provider(LoggerProvider())
+_logs.get_logger_provider().add_log_record_processor(
+    BatchLogRecordProcessor(OTLPLogExporter())
+)
+_events.set_event_logger_provider(EventLoggerProvider())
+
+# instrument OpenAI
+OpenAIInstrumentor().instrument()
+
+
+def main():
+    client = OpenAI()
+    chat_completion = client.chat.completions.create(
+        model=os.getenv("CHAT_MODEL", "gpt-4o-mini"),
+        messages=[
+            {
+                "role": "user",
+                "content": "Write a short poem on OpenTelemetry.",
+            },
+        ],
+    )
+    print(chat_completion.choices[0].message.content)
+
+
+if __name__ == "__main__":
+    main()
instrumentation-genai/opentelemetry-instrumentation-openai-v2/examples/manual/requirements.txt (+5, new file)

@@ -0,0 +1,5 @@
+openai~=1.54.4
+
+opentelemetry-sdk~=1.28.2
+opentelemetry-exporter-otlp-proto-grpc~=1.28.2
+opentelemetry-instrumentation-openai-v2~=2.0b0

instrumentation-genai/opentelemetry-instrumentation-openai-v2/example/.env renamed to instrumentation-genai/opentelemetry-instrumentation-openai-v2/examples/zero-code/.env (+6, -3)

@@ -6,13 +6,16 @@ OPENAI_API_KEY=sk-YOUR_API_KEY
 # OPENAI_API_KEY=unused
 # CHAT_MODEL=qwen2.5:0.5b

-OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
-OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf
+# Uncomment and change to your OTLP endpoint
+# OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
+# OTEL_EXPORTER_OTLP_PROTOCOL=grpc
+
 OTEL_SERVICE_NAME=opentelemetry-python-openai

 # Change to 'false' to disable logging
 OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true
 # Change to 'console' if your OTLP endpoint doesn't support logs
-OTEL_LOGS_EXPORTER=otlp_proto_http
+# TODO: this should not be necessary once https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3042 is released
+OTEL_LOGS_EXPORTER=otlp
instrumentation-genai/opentelemetry-instrumentation-openai-v2/examples/zero-code/README.rst (+48, new file)

@@ -0,0 +1,48 @@
+OpenTelemetry OpenAI Zero-Code Instrumentation Example
+======================================================
+
+This is an example of how to instrument OpenAI calls with zero code changes,
+using `opentelemetry-instrument`.
+
+When `main.py <main.py>`_ is run, it exports traces and logs to an OTLP
+compatible endpoint. Traces include details such as the model used and the
+duration of the chat request. Logs capture the chat request and the generated
+response, providing a comprehensive view of the performance and behavior of
+your OpenAI requests.
+
+Note: `.env <.env>`_ file configures additional environment variables:
+
+- `OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true` configures
+  OpenTelemetry SDK to export logs and events.
+- `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true` configures
+  OpenAI instrumentation to capture prompt and completion contents on
+  events.
+- `OTEL_LOGS_EXPORTER=otlp` to specify exporter type.
+
+Setup
+-----
+
+Minimally, update the `.env <.env>`_ file with your "OPENAI_API_KEY". An
+OTLP compatible endpoint should be listening for traces and logs on
+http://localhost:4317. If not, update "OTEL_EXPORTER_OTLP_ENDPOINT" as well.
+
+Next, set up a virtual environment like this:
+
+::
+
+    python3 -m venv .venv
+    source .venv/bin/activate
+    pip install "python-dotenv[cli]"
+    pip install -r requirements.txt
+
+Run
+---
+
+Run the example like this:
+
+::
+
+    dotenv run -- opentelemetry-instrument python main.py
+
+You should see a poem generated by OpenAI while traces and logs export to your
+configured observability tool.
instrumentation-genai/opentelemetry-instrumentation-openai-v2/examples/zero-code/requirements.txt (+1, -1)

@@ -1,6 +1,6 @@
 openai~=1.54.4

 opentelemetry-sdk~=1.28.2
-opentelemetry-exporter-otlp-proto-http~=1.28.2
+opentelemetry-exporter-otlp-proto-grpc~=1.28.2
 opentelemetry-distro~=0.49b2
 opentelemetry-instrumentation-openai-v2~=2.0b0

instrumentation-genai/opentelemetry-instrumentation-openai-v2/pyproject.toml (+1, -1)

@@ -39,7 +39,7 @@ instruments = [
 openai = "opentelemetry.instrumentation.openai_v2:OpenAIInstrumentor"

 [project.urls]
-Homepage = "https://github.com/open-telemetry/opentelemetry-python-contrib/tree/main/instrumentation/opentelemetry-instrumentation-openai-v2"
+Homepage = "https://github.com/open-telemetry/opentelemetry-python-contrib/tree/main/instrumentation-genai/opentelemetry-instrumentation-openai-v2"

 [tool.hatch.version]
 path = "src/opentelemetry/instrumentation/openai_v2/version.py"
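The `openai = "opentelemetry.instrumentation.openai_v2:OpenAIInstrumentor"` line in the pyproject diff registers the instrumentor under a packaging entry point, using the standard `module:attr` object-reference syntax. A sketch of how tooling resolves such a reference (the `load_object_ref` helper name is ours, and the demo uses a stdlib target since the instrumentation package itself may not be installed):

```python
import importlib


def load_object_ref(ref: str):
    # Entry-point object references look like "package.module:attr.subattr":
    # import the module, then walk the dotted attribute path.
    module_name, _, attr_path = ref.partition(":")
    obj = importlib.import_module(module_name)
    for part in attr_path.split("."):
        if part:
            obj = getattr(obj, part)
    return obj


# Resolved the same way tooling would resolve
# "opentelemetry.instrumentation.openai_v2:OpenAIInstrumentor".
fn = load_object_ref("os.path:join")
print(fn("a", "b"))  # "a/b" on POSIX, "a\\b" on Windows
```

This is why moving the package to `instrumentation-genai/` only needed the Homepage URL updated: the entry-point reference targets the importable module path, which did not change.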

instrumentation-genai/opentelemetry-instrumentation-openai-v2/test-requirements-0.txt (+1)

@@ -8,6 +8,7 @@ pytest==7.4.4
 pytest-vcr==1.0.2
 pytest-asyncio==0.21.0
 wrapt==1.16.0
+opentelemetry-exporter-otlp-proto-http~=1.28
 opentelemetry-api==1.28 # when updating, also update in pyproject.toml
 opentelemetry-sdk==1.28 # when updating, also update in pyproject.toml
 opentelemetry-semantic-conventions==0.49b0 # when updating, also update in pyproject.toml
