
Commit 4aeebcb

Merge branch 'main' into fix-aiokafka

2 parents 16c3d0f + 6d5a514

File tree

37 files changed, +3850 -1416 lines


.github/workflows/ossf-scorecard.yml

+47
@@ -0,0 +1,47 @@
+name: OSSF Scorecard
+
+on:
+  push:
+    branches:
+      - main
+  schedule:
+    - cron: "10 6 * * 1" # once a week
+  workflow_dispatch:
+
+permissions: read-all
+
+jobs:
+  analysis:
+    runs-on: ubuntu-latest
+    permissions:
+      # Needed for Code scanning upload
+      security-events: write
+      # Needed for GitHub OIDC token if publish_results is true
+      id-token: write
+    steps:
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+        with:
+          persist-credentials: false
+
+      - uses: ossf/scorecard-action@f49aabe0b5af0936a0987cfb85d86b75731b0186 # v2.4.1
+        with:
+          results_file: results.sarif
+          results_format: sarif
+          publish_results: true
+
+      # Upload the results as artifacts (optional). Commenting out will disable
+      # uploads of run results in SARIF format to the repository Actions tab.
+      # https://docs.github.com/en/actions/advanced-guides/storing-workflow-data-as-artifacts
+      - name: "Upload artifact"
+        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
+        with:
+          name: SARIF file
+          path: results.sarif
+          retention-days: 5
+
+      # Upload the results to GitHub's code scanning dashboard (optional).
+      # Commenting out will disable upload of results to your repo's Code Scanning dashboard
+      - name: "Upload to code-scanning"
+        uses: github/codeql-action/upload-sarif@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
+        with:
+          sarif_file: results.sarif

CHANGELOG.md

+8
@@ -11,8 +11,14 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

 ## Unreleased

+### Added
+
 - `opentelemetry-instrumentation-asyncclick`: new instrumentation to trace asyncclick commands
   ([#3319](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3319))
+- `opentelemetry-instrumentation-botocore` Add support for GenAI tool events using Amazon Nova models and `InvokeModel*` APIs
+  ([#3385](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3385))
+- `opentelemetry-instrumentation` Make auto instrumentation use the same dependency resolver as manual instrumentation does
+  ([#3202](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3202))

 ### Fixed

@@ -21,6 +27,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - `opentelemetry-instrumentation-dbapi`, `opentelemetry-instrumentation-django`,
   `opentelemetry-instrumentation-sqlalchemy`: Fix sqlcomment for non string query and composable object.
   ([#3113](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3113))
+- `opentelemetry-instrumentation-grpc` Fix error when using grpc versions <= 1.50.0 with unix sockets.
+  ([#3393](https://github.com/open-telemetry/opentelemetry-python-contrib/issues/3393))

 ## Version 1.31.0/0.52b0 (2025-03-12)

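For the new asyncclick instrumentation listed above, a minimal usage sketch (not taken from this commit; the AsyncClickInstrumentor name and the plain instrument() call are assumptions based on the naming conventions of the other instrumentations in this repo, e.g. ClickInstrumentor for sync click):

import asyncclick as click

from opentelemetry.instrumentation.asyncclick import AsyncClickInstrumentor

# Assumed entry point, mirroring the other instrumentors in this repo.
AsyncClickInstrumentor().instrument()


@click.command()
async def hello():
    # Runs inside the span created for the command invocation.
    click.echo("hello")


if __name__ == "__main__":
    # asyncclick drives the coroutine itself (anyio, asyncio backend by default).
    hello()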

instrumentation-genai/README.md

+6
@@ -0,0 +1,6 @@
+
+| Instrumentation | Supported Packages | Metrics support | Semconv status |
+| --------------- | ------------------ | --------------- | -------------- |
+| [opentelemetry-instrumentation-google-genai](./opentelemetry-instrumentation-google-genai) | google-genai >= 1.0.0 | No | development |
+| [opentelemetry-instrumentation-openai-v2](./opentelemetry-instrumentation-openai-v2) | openai >= 1.26.0 | Yes | development |
+| [opentelemetry-instrumentation-vertexai](./opentelemetry-instrumentation-vertexai) | google-cloud-aiplatform >= 1.64 | No | development |
@@ -0,0 +1,15 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+_instruments = ("google-genai >= 1.0.0",)

instrumentation-genai/opentelemetry-instrumentation-openai-v2/src/opentelemetry/instrumentation/openai_v2/package.py

+2
@@ -14,3 +14,5 @@


 _instruments = ("openai >= 1.26.0",)
+
+_supports_metrics = True

instrumentation/opentelemetry-instrumentation-asyncpg/src/opentelemetry/instrumentation/asyncpg/__init__.py

+18-3
@@ -19,16 +19,31 @@
 Usage
 -----

+Start PostgreSQL:
+
+::
+
+    docker run -e POSTGRES_USER=user -e POSTGRES_PASSWORD=password -e POSTGRES_DATABASE=database -p 5432:5432 postgres
+
+Run instrumented code:
+
 .. code-block:: python

+    import asyncio
     import asyncpg
     from opentelemetry.instrumentation.asyncpg import AsyncPGInstrumentor

     # You can optionally pass a custom TracerProvider to AsyncPGInstrumentor.instrument()
     AsyncPGInstrumentor().instrument()
-    conn = await asyncpg.connect(user='user', password='password',
-                                 database='database', host='127.0.0.1')
-    values = await conn.fetch('''SELECT 42;''')
+
+    async def main():
+        conn = await asyncpg.connect(user='user', password='password')
+
+        await conn.fetch('''SELECT 42;''')
+
+        await conn.close()
+
+    asyncio.run(main())

 API
 ---

instrumentation/opentelemetry-instrumentation-botocore/README.rst

+38
@@ -8,6 +8,44 @@ OpenTelemetry Botocore Tracing

 This library allows tracing requests made by the Botocore library.

+Extensions
+----------
+
+The instrumentation supports creating extensions for AWS services for enriching what is collected. We have extensions
+for the following AWS services:
+
+- Bedrock Runtime
+- DynamoDB
+- Lambda
+- SNS
+- SQS
+
+Bedrock Runtime
+***************
+
+This extension implements the GenAI semantic conventions for the following API calls:
+
+- Converse
+- ConverseStream
+- InvokeModel
+- InvokeModelWithResponseStream
+
+For the Converse and ConverseStream APIs, tracing, events, and metrics are implemented.
+
+For the InvokeModel and InvokeModelWithResponseStream APIs, tracing, events, and metrics are implemented only for a
+subset of the available models, namely:
+
+- Amazon Titan models
+- Amazon Nova models
+- Anthropic Claude
+
+Tool calls with InvokeModel and InvokeModelWithResponseStream APIs are supported with:
+
+- Amazon Nova models
+- Anthropic Claude 3+
+
+If you don't have an application using Bedrock APIs yet, try our `zero-code examples <examples/bedrock-runtime/zero-code>`_.
+
 Installation
 ------------

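To see what the Bedrock Runtime extension described above records, here is a minimal sketch (not part of this commit): instrument botocore, then issue a single Converse call. The region and model id are placeholders, and AWS credentials plus access to a Nova or Claude model are assumed.

import boto3

from opentelemetry.instrumentation.botocore import BotocoreInstrumentor

# Patch botocore clients so the Bedrock Runtime extension can add GenAI
# spans, events, and metrics to the call below.
BotocoreInstrumentor().instrument()

client = boto3.client("bedrock-runtime", region_name="us-east-1")  # placeholder region
response = client.converse(
    modelId="amazon.nova-lite-v1:0",  # placeholder model id
    messages=[{"role": "user", "content": [{"text": "Say hello in one word."}]}],
)
print(response["output"]["message"]["content"][0]["text"])

A configured TracerProvider/MeterProvider (or the zero-code examples linked above) is still needed to export what the extension records.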

instrumentation/opentelemetry-instrumentation-botocore/src/opentelemetry/instrumentation/botocore/__init__.py

-32
@@ -78,38 +78,6 @@ def response_hook(span, service_name, operation_name, result):
     )
     ec2 = session.create_client("ec2", region_name="us-west-2")
     ec2.describe_instances()
-
-Extensions
-----------
-
-The instrumentation supports creating extensions for AWS services for enriching what is collected. We have extensions
-for the following AWS services:
-
-- Bedrock Runtime
-- DynamoDB
-- Lambda
-- SNS
-- SQS
-
-Bedrock Runtime
-***************
-
-This extension implements the GenAI semantic conventions for the following API calls:
-
-- Converse
-- ConverseStream
-- InvokeModel
-- InvokeModelWithResponseStream
-
-For the Converse and ConverseStream APIs tracing, events and metrics are implemented.
-
-For the InvokeModel and InvokeModelWithResponseStream APIs tracing, events and metrics implemented only for a subset of
-the available models, namely:
-- Amazon Titan models
-- Amazon Nova models
-- Anthropic Claude
-
-There is no support for tool calls with Amazon Models for the InvokeModel and InvokeModelWithResponseStream APIs.
 """

 import logging

instrumentation/opentelemetry-instrumentation-botocore/src/opentelemetry/instrumentation/botocore/extensions/bedrock_utils.py

+23-1
@@ -216,25 +216,47 @@ def _process_amazon_titan_chunk(self, chunk):

    def _process_amazon_nova_chunk(self, chunk):
        # pylint: disable=too-many-branches
-        # TODO: handle tool calls!
        if "messageStart" in chunk:
            # {'messageStart': {'role': 'assistant'}}
            if chunk["messageStart"].get("role") == "assistant":
                self._record_message = True
                self._message = {"role": "assistant", "content": []}
            return

+        if "contentBlockStart" in chunk:
+            # {'contentBlockStart': {'start': {'toolUse': {'toolUseId': 'id', 'name': 'name'}}, 'contentBlockIndex': 31}}
+            if self._record_message:
+                self._message["content"].append(self._content_block)
+
+                start = chunk["contentBlockStart"].get("start", {})
+                if "toolUse" in start:
+                    self._content_block = start
+                else:
+                    self._content_block = {}
+            return
+
        if "contentBlockDelta" in chunk:
            # {'contentBlockDelta': {'delta': {'text': "Hello"}, 'contentBlockIndex': 0}}
+            # {'contentBlockDelta': {'delta': {'toolUse': {'input': '{"location":"San Francisco"}'}}, 'contentBlockIndex': 31}}
            if self._record_message:
                delta = chunk["contentBlockDelta"].get("delta", {})
                if "text" in delta:
                    self._content_block.setdefault("text", "")
                    self._content_block["text"] += delta["text"]
+                elif "toolUse" in delta:
+                    self._content_block.setdefault("toolUse", {})
+                    self._content_block["toolUse"]["input"] = json.loads(
+                        delta["toolUse"]["input"]
+                    )
            return

        if "contentBlockStop" in chunk:
            # {'contentBlockStop': {'contentBlockIndex': 0}}
+            if self._record_message:
+                # create a new content block only for tools
+                if "toolUse" in self._content_block:
+                    self._message["content"].append(self._content_block)
+                    self._content_block = {}
            return

        if "messageStop" in chunk:
0 commit comments

Comments
 (0)