Metadata-Version: 2.4
Name: opentelemetry-instrumentation-llamaindex
Version: 0.50.1
Summary: OpenTelemetry LlamaIndex instrumentation
License: Apache-2.0
Author: Gal Kleinman
Author-email: gal@traceloop.com
Requires-Python: >=3.9,<4
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Provides-Extra: instruments
Provides-Extra: llamaparse
Requires-Dist: inflection (>=0.5.1,<0.6.0)
Requires-Dist: opentelemetry-api (>=1.38.0,<2.0.0)
Requires-Dist: opentelemetry-instrumentation (>=0.59b0)
Requires-Dist: opentelemetry-semantic-conventions (>=0.59b0)
Requires-Dist: opentelemetry-semantic-conventions-ai (>=0.4.13,<0.5.0)
Project-URL: Repository, https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-llamaindex
Description-Content-Type: text/markdown

# OpenTelemetry LlamaIndex Instrumentation

<a href="https://pypi.org/project/opentelemetry-instrumentation-llamaindex/">
    <img src="https://badge.fury.io/py/opentelemetry-instrumentation-llamaindex.svg">
</a>

This library allows tracing complete LLM applications built with [LlamaIndex](https://github.com/run-llama/llama_index).

## Installation

```bash
pip install opentelemetry-instrumentation-llamaindex
```
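
The package metadata above also declares two optional extras, `instruments` and `llamaparse`. If you want pip to pull in the corresponding optional dependencies as well, the standard extras syntax can be used (shown here for `instruments` as an example):

```bash
pip install "opentelemetry-instrumentation-llamaindex[instruments]"
```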
## Example usage

```python
from opentelemetry.instrumentation.llamaindex import LlamaIndexInstrumentor

LlamaIndexInstrumentor().instrument()
```
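
The call above only activates the instrumentation; spans still need a configured tracer provider and exporter to go anywhere. A minimal local setup, assuming the separate `opentelemetry-sdk` package is also installed (it is not a dependency of this package), might look like this sketch:

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

from opentelemetry.instrumentation.llamaindex import LlamaIndexInstrumentor

# Print finished spans to stdout so the instrumentation output is easy to inspect locally.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

# Instrument LlamaIndex once, before running any queries.
LlamaIndexInstrumentor().instrument()
```

In a real deployment you would typically replace the console exporter with an exporter pointed at your tracing backend or collector.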
## Privacy

**By default, this instrumentation logs prompts, completions, and embeddings to span attributes**. This gives you clear visibility into how your LLM application is working and makes it easy to debug and evaluate the quality of the outputs.

However, you may want to disable this logging for privacy reasons, as these payloads can contain highly sensitive data from your users. You may also simply want to reduce the size of your traces.

To disable logging, set the `TRACELOOP_TRACE_CONTENT` environment variable to `false`.

```bash
TRACELOOP_TRACE_CONTENT=false
```
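
If exporting the variable from the shell is inconvenient, the same effect can be achieved from Python by setting it before the instrumentation is activated (a sketch; the variable name comes from the section above):

```python
import os

# Disable logging of prompts, completions, and embeddings to span attributes.
# Set this before instrumenting so the setting is in place when spans are recorded.
os.environ["TRACELOOP_TRACE_CONTENT"] = "false"

from opentelemetry.instrumentation.llamaindex import LlamaIndexInstrumentor

LlamaIndexInstrumentor().instrument()
```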