feat(examples): migrate AI examples to OpenTelemetry instrumentation #482
Open
richardsolomou wants to merge 9 commits into master from
Conversation
Switch from PostHog direct SDK wrappers to OpenTelemetry auto-instrumentation for all AI provider examples where OTel instrumentations are available. Uses opentelemetry-instrumentation-openai-v2 for OpenAI-compatible providers, opentelemetry-instrumentation-anthropic for Anthropic, opentelemetry-instrumentation-google-generativeai for Gemini, opentelemetry-instrumentation-langchain for LangChain/LangGraph, opentelemetry-instrumentation-llamaindex for LlamaIndex, and opentelemetry-instrumentation-crewai for CrewAI.
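For reference, the instrumentation packages named above could be installed along these lines (a sketch assuming a plain pip workflow; versions are intentionally unpinned here). Note that a later commit in this PR keeps CrewAI on LiteLLM callbacks instead, so its instrumentation package is omitted:

```shell
# Install the OTel auto-instrumentation packages this PR migrates to
# (pin versions as appropriate for your project).
pip install \
  opentelemetry-instrumentation-openai-v2 \
  opentelemetry-instrumentation-anthropic \
  opentelemetry-instrumentation-google-generativeai \
  opentelemetry-instrumentation-langchain \
  opentelemetry-instrumentation-llamaindex
```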
Contributor
posthog-python Compliance Report. Date: 2026-04-08 14:49:10 UTC. ✅ All Tests Passed! 0/0 tests passed.
Switch from BatchSpanProcessor to SimpleSpanProcessor so spans are exported immediately. This removes the need for provider.shutdown() which could throw export errors on exit.
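The timing difference can be illustrated with a toy model (this is not the real OpenTelemetry API, just a sketch of the export semantics): a simple processor exports each span the moment it ends, while a batch processor queues spans and loses whatever is still queued unless something like `provider.shutdown()` flushes it:

```python
# Toy illustration only — NOT the real OpenTelemetry classes.
class SimpleProcessor:
    """Exports each span immediately when it ends."""

    def __init__(self, export):
        self.export = export

    def on_end(self, span):
        self.export([span])  # exported right away; nothing pending at exit


class BatchProcessor:
    """Queues spans; anything still queued is lost without a final flush."""

    def __init__(self, export):
        self.export = export
        self.queue = []

    def on_end(self, span):
        self.queue.append(span)  # deferred until shutdown/flush

    def shutdown(self):
        # The explicit step this PR removes the need for.
        self.export(self.queue)
        self.queue = []


exported = []

simple = SimpleProcessor(exported.extend)
simple.on_end("span-1")   # "span-1" is exported immediately

batched = BatchProcessor(exported.extend)
batched.on_end("span-2")  # "span-2" sits in the queue
# Without batched.shutdown(), "span-2" is never exported —
# which is why short-lived example scripts prefer the simple processor.
```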
CrewAI manages its own TracerProvider internally, which conflicts with setting one externally. LiteLLM callbacks remain the correct integration approach for CrewAI.
…el instrumentation
Contributor
andrewm4894 reviewed Apr 8, 2026
andrewm4894 (Member) left a comment
i think the otel examples should be as full featured as possible and show stuff like setting distinct id and any custom foo-bar properties - am i missing that or something?
richardsolomou added a commit that referenced this pull request on Apr 9, 2026
Aligns with the endpoint used by the OTel examples in PR #482. Generated-By: PostHog Code Task-Id: 1ba1f07a-1453-4162-90a8-665958c5fe46
Problem
Our AI examples use PostHog's direct SDK wrappers (posthog.ai.openai, posthog.ai.anthropic, etc.) for tracking LLM calls. We want to silently deprecate these in favor of standard OpenTelemetry auto-instrumentation, which is more portable and follows industry conventions.

Changes
Migrates Python AI examples from PostHog wrappers to OpenTelemetry auto-instrumentation:
- opentelemetry-instrumentation-openai-v2 (OpenAI and OpenAI-compatible providers)
- opentelemetry-instrumentation-anthropic
- opentelemetry-instrumentation-langchain
- opentelemetry-instrumentation-llamaindex
- opentelemetry-instrumentation-google-generativeai

All OTel-based examples set resource attributes to demonstrate the full feature set.
These map to distinct_id and custom event properties via PostHog's OTLP ingestion endpoint.

Kept as-is:
- CrewAI (uses LiteLLM callbacks; internally manages its own TracerProvider)
- LiteLLM/DSPy (use LiteLLM's built-in PostHog callback)
- OpenAI Agents (uses dedicated posthog.ai.openai_agents.instrument())
- Pydantic AI (already OTel via Agent.instrument_all())
- AWS Bedrock (already OTel via opentelemetry-instrumentation-botocore)

Key implementation details:
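As a rough sketch of the attribute-to-property mapping (the keys posthog.distinct_id, foo, and conversation_id come from this PR; service.name is a standard OTel resource attribute; the exact split PostHog's OTLP endpoint performs is assumed here, not documented):

```python
# Hypothetical sketch: resource attributes an OTel-instrumented example
# might set, and how they could map onto a PostHog event.
resource_attributes = {
    "service.name": "llm-analytics-example",  # standard OTel resource attribute
    "posthog.distinct_id": "user_123",        # intended to become distinct_id
    "foo": "bar",                             # arbitrary custom event property
    "conversation_id": "conv_456",            # custom event property
}

# Assumed mapping: the PostHog-specific key carries identity, everything
# else (minus standard OTel keys) flows through as custom properties.
distinct_id = resource_attributes["posthog.distinct_id"]
custom_properties = {
    key: value
    for key, value in resource_attributes.items()
    if key not in ("service.name", "posthog.distinct_id")
}
```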
- SimpleSpanProcessor instead of BatchSpanProcessor so spans export immediately without needing provider.shutdown()
- # noqa: E402 on intentional late imports after Instrumentor().instrument() calls
- gpt-4o model name instead of a deployment-specific one

How did you test this code?
Manually ran each example against real provider API keys via llm-analytics-apps/run-examples.sh to verify:
- $ai_generation events are captured
- resource attributes (posthog.distinct_id, foo, conversation_id) flow through as event properties
- distinct_id is correctly set on each event

All examples pass ruff format and ruff check.

This is an agent-authored PR: I haven't manually tested each provider end-to-end beyond spot checks, though all examples follow the same pattern and the migration was verified on several providers.
Publish to changelog?
No
Docs update
The onboarding docs will be updated separately in PostHog/posthog#53668 and PostHog/posthog.com#16236.
🤖 LLM context
Co-authored with Claude Code. Related PRs: