Metadata-Version: 2.4
Name: opentelemetry-instrumentation-google-generativeai
Version: 0.50.1
Summary: OpenTelemetry Google Generative AI instrumentation
License: Apache-2.0
Author: Gal Kleinman
Author-email: gal@traceloop.com
Requires-Python: >=3.9,<4
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Provides-Extra: instruments
Requires-Dist: opentelemetry-api (>=1.38.0,<2.0.0)
Requires-Dist: opentelemetry-instrumentation (>=0.59b0)
Requires-Dist: opentelemetry-semantic-conventions (>=0.59b0)
Requires-Dist: opentelemetry-semantic-conventions-ai (>=0.4.13,<0.5.0)
Project-URL: Repository, https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-google-generativeai
Description-Content-Type: text/markdown
# OpenTelemetry Google Generative AI Instrumentation

<a href="https://pypi.org/project/opentelemetry-instrumentation-google-generativeai/">
    <img src="https://badge.fury.io/py/opentelemetry-instrumentation-google-generativeai.svg">
</a>

This library allows tracing Google Gemini prompts and completions sent with the official [Google Generative AI library](https://github.com/google-gemini/generative-ai-python).
## Installation

```bash
pip install opentelemetry-instrumentation-google-generativeai
```
## Example usage

```python
from opentelemetry.instrumentation.google_generativeai import GoogleGenerativeAiInstrumentor

GoogleGenerativeAiInstrumentor().instrument()
```
## Privacy

**By default, this instrumentation logs prompts, completions, and embeddings to span attributes.** This gives you clear visibility into how your LLM application is working, and makes it easy to debug and evaluate the quality of its outputs.

However, you may want to disable this logging for privacy reasons, as prompts and completions may contain highly sensitive data from your users. You may also simply want to reduce the size of your traces.

To disable logging, set the `TRACELOOP_TRACE_CONTENT` environment variable to `false`:
```bash
TRACELOOP_TRACE_CONTENT=false
```