Metadata-Version: 2.4
Name: opentelemetry-instrumentation-mistralai
Version: 0.50.1
Summary: OpenTelemetry Mistral AI instrumentation
License: Apache-2.0
Author: Gal Kleinman
Author-email: gal@traceloop.com
Requires-Python: >=3.9,<4
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Provides-Extra: instruments
Requires-Dist: opentelemetry-api (>=1.38.0,<2.0.0)
Requires-Dist: opentelemetry-instrumentation (>=0.59b0)
Requires-Dist: opentelemetry-semantic-conventions (>=0.59b0)
Requires-Dist: opentelemetry-semantic-conventions-ai (>=0.4.13,<0.5.0)
Project-URL: Repository, https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-mistralai
Description-Content-Type: text/markdown

# OpenTelemetry Mistral AI Instrumentation

<a href="https://pypi.org/project/opentelemetry-instrumentation-mistralai/">
    <img src="https://badge.fury.io/py/opentelemetry-instrumentation-mistralai.svg">
</a>

This library allows tracing calls to any of Mistral AI's endpoints sent with the official [Mistral AI Python library](https://github.com/mistralai/client-python).

## Installation

```bash
pip install opentelemetry-instrumentation-mistralai
```

## Example usage

```python
from opentelemetry.instrumentation.mistralai import MistralAiInstrumentor

MistralAiInstrumentor().instrument()
```

## Privacy

**By default, this instrumentation logs prompts, completions, and embeddings to span attributes.** This gives you clear visibility into how your LLM application is working, and makes it easy to debug and evaluate the quality of its outputs.

However, you may want to disable this logging for privacy reasons, as prompts and completions may contain highly sensitive data from your users. You may also simply want to reduce the size of your traces.

To disable logging, set the `TRACELOOP_TRACE_CONTENT` environment variable to `false`:

```bash
export TRACELOOP_TRACE_CONTENT=false
```
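If the variable cannot be set in the shell, it can also be set from Python before the instrumentation initializes, since it is read from the process environment at runtime. A small sketch; `content_tracing_enabled` is a hypothetical helper added here only to illustrate the documented convention (content is traced unless the variable is `false`), not a function exported by this package:

```python
import os

# Disable prompt/completion logging before the instrumentation starts;
# the variable is read from the process environment at runtime.
os.environ["TRACELOOP_TRACE_CONTENT"] = "false"

def content_tracing_enabled() -> bool:
    # Hypothetical helper mirroring the documented convention:
    # content is traced unless the variable is set to "false".
    return os.environ.get("TRACELOOP_TRACE_CONTENT", "true").lower() != "false"

print(content_tracing_enabled())  # False
```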