sdk/ai/azure-ai-inference/README.md
@@ -224,7 +224,7 @@ The `EmbeddingsClient` has a method named `embedding`. The method makes a REST A
See simple text embedding example below. More can be found in the [samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-inference/samples) folder.
<!--
### Image Embeddings
TODO: Add overview and link to explain image embeddings.
@@ -242,7 +242,7 @@ In the following sections you will find simple examples of:
The following types of messages are supported: `SystemMessage`, `UserMessage`, `AssistantMessage`, `ToolMessage`. See also samples:
* [sample_chat_completions_with_tools.py](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tools.py) for usage of `ToolMessage`.
* [sample_chat_completions_with_image_url.py](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_image_url.py) for usage of `UserMessage` that includes sending an image URL.
* [sample_chat_completions_with_image_data.py](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_image_data.py) for usage of `UserMessage` that
@@ -535,15 +535,44 @@ For more information, see [Configure logging in the Azure libraries for Python](
To report issues with the client library, or to request additional features, please open a GitHub issue [here](https://github.com/Azure/azure-sdk-for-python/issues).
## Observability With OpenTelemetry
The Azure AI Inference client library provides experimental support for tracing with OpenTelemetry.
You can capture prompt and completion contents by setting the `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` environment variable to `true` (case insensitive).
By default, prompts, completions, function names, parameters, and outputs are not recorded.
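As a sketch, the variable can also be set from Python, before the instrumented client is created:

```python
import os

# Enable recording of prompt and completion contents in traces.
# Must be set before the instrumented calls are made.
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
```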
### Setup with Azure Monitor
When using the Azure AI Inference library with [Azure Monitor OpenTelemetry Distro](https://learn.microsoft.com/azure/azure-monitor/app/opentelemetry-enable?tabs=python), distributed tracing for Azure AI Inference calls is enabled by default when using the latest version of the distro.
### Setup with OpenTelemetry
Check out your observability vendor documentation on how to configure OpenTelemetry or refer to the [official OpenTelemetry documentation](https://opentelemetry.io/docs/languages/python/).
#### Installation
Make sure to install OpenTelemetry and the Azure SDK tracing plugin via
```bash
pip install opentelemetry
pip install azure-core-tracing-opentelemetry
```
You will also need an exporter to send telemetry to your observability backend. You can print traces to the console or use a local viewer such as [Aspire Dashboard](https://learn.microsoft.com/dotnet/aspire/fundamentals/dashboard/standalone?tabs=bash).
To connect to Aspire Dashboard or another OpenTelemetry compatible backend, install the OTLP exporter:
```bash
pip install opentelemetry-exporter-otlp
```
#### Configuration
To enable Azure SDK tracing, set the `AZURE_SDK_TRACING_IMPLEMENTATION` environment variable to `opentelemetry`.
Or configure it in the code with the following snippet:
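A minimal sketch of that in-code configuration, assuming `azure-core-tracing-opentelemetry` is installed:

```python
# Route Azure SDK tracing through OpenTelemetry
from azure.core.settings import settings

settings.tracing_implementation = "opentelemetry"
```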
Please refer to [azure-core-tracing-documentation](https://learn.microsoft.com/python/api/overview/azure/core-tracing-opentelemetry-readme) for more information.
The final step is to enable Azure AI Inference instrumentation with the following code snippet:
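A sketch of that step, assuming the experimental `azure.ai.inference.tracing` module from this package:

```python
# Enable emission of traces from Azure AI Inference calls
from azure.ai.inference.tracing import AIInferenceInstrumentor

AIInferenceInstrumentor().instrument()
```

The instrumentor also exposes `uninstrument()` to turn tracing back off.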
The `@tracer.start_as_current_span` decorator can be used to trace your own functions. This will trace the function parameters and their values. You can also add further attributes to the span in the function implementation as demonstrated below. Note that you will have to set up the tracer in your code before using the decorator. More information is available [here](https://opentelemetry.io/docs/languages/python/).
"content": "The main construction of the International Space Station (ISS) was completed between 1998 and 2011. During this period, more than 30 flights by US space shuttles and 40 by Russian rockets were conducted to transport components and modules to the station.",
},
{"role": "user", "content": "And what was the estimated cost to build it?"},