Comments (12)
Not yet, hopefully soon :)
from openllmetry.
here's the setup i used, and everything "just works" 😄
config.py

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter

# Set the global tracer provider
trace.set_tracer_provider(TracerProvider())

# Configure the tracer provider to export traces to Azure Application Insights
exporter = AzureMonitorTraceExporter(
    connection_string="$INSTRUMENTATION_KEY_HERE"
)
span_processor = SimpleSpanProcessor(exporter)
trace.get_tracer_provider().add_span_processor(span_processor)
```
app.py

```python
import config  # Import the configuration file
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow
from opentelemetry import trace
import openai  # Ensure you have the openai library installed

Traceloop.init(app_name="your_app_name")
tracer = trace.get_tracer(__name__)

@workflow(name="llm_execution")
def execute_llm():
    with tracer.start_as_current_span("llm_span"):
        # Replace with your actual OpenAI API call
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "checking if this is logged"}],
            max_tokens=60
        )
        print(response['choices'][0]['message']['content'])
        return response['choices'][0]['message']['content']

if __name__ == "__main__":
    execute_llm()
```
Resulting in:
One thing - it seems that all telemetry goes through Traceloop (the SaaS) before landing in the final destination. Is there a way to turn that off? While I personally think it's very cool that you can set up demo dashboards for people to get an idea of how great the platform is, it's a concern for any real production scenario - you definitely want to keep your LLM logs and traces isolated.
from openllmetry.
Sounds good @aavetis! Indeed, it makes sense.
Let me know once you've verified that it works, and update the docs in this repo (under /docs) and the README. We'll then announce the support with proper credit 😄
from openllmetry.
Can you try passing your exporter to the SDK's init function? It's just that we have an exporter we define internally in the SDK, so that's what you're seeing - traces are sent to both destinations in parallel.
from openllmetry.
Confirmed - when I explicitly pass in the exporter, telemetry ONLY shows up in Azure and not in the Traceloop dashboard. However, it appears the open dashboard is created anyway; it just stays empty if the exporter is set.

```python
# this appears to ensure traces are only sent to your desired exporter
exporter = AzureMonitorTraceExporter(connection_string="InstrumentationKey=.......")
Traceloop.init(app_name="your_app_name", exporter=exporter)
```

As you can see, the dashboard is still created (just a note).
from openllmetry.
Huh! That's a bug. Fixed in #96.
@aavetis can you add the details of how you set this up in the docs?
from openllmetry.
Will work on getting this into the docs.
On another note, are Metrics / MeterProvider expected to work? I've noticed traces, spans, and dependencies are all tracked seamlessly, but I haven't been able to see custom metrics (counters) getting passed through - not sure if I'm making a mistake or if the SDK isn't expected to support them.
https://opentelemetry.io/docs/specs/otel/metrics/api/#meter-operations
from openllmetry.
It may be better to document use of the APPLICATIONINSIGHTS_CONNECTION_STRING environment variable as a good practice. I believe you can then even omit the connection string from the exporter constructor to simplify the code:

```python
Traceloop.init(app_name="your_app_name", exporter=AzureMonitorTraceExporter())
```
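To make the env-var approach concrete, here is a sketch. The key value below is a placeholder for illustration only; in practice you would export the variable in your shell or deployment config rather than setting it in code:

```python
import os

# In practice, set this outside the process, e.g. in your shell:
#   export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=<your-key>"
# The value below is a placeholder for illustration only.
os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"] = (
    "InstrumentationKey=00000000-0000-0000-0000-000000000000"
)

# With the variable set, AzureMonitorTraceExporter() can then be constructed
# with no arguments and will read the connection string from the environment:
#   exporter = AzureMonitorTraceExporter()
#   Traceloop.init(app_name="your_app_name", exporter=exporter)
conn = os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"]
```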
from openllmetry.
FYI @aavetis docs have been moved to a separate repo - https://github.com/traceloop/docs
(To support the fact that we now have a JS SDK as well)
from openllmetry.
@aavetis any update on this?
from openllmetry.
Closing as traceloop/docs#2 is merged and deployed
from openllmetry.