Troubleshooting & FAQ
If you cannot find your issue below, try Ask AI, open a GitHub issue, or contact support.
Authentication issues
- Ensure `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, and `LANGFUSE_BASE_URL` are set as environment variables or passed to `Langfuse()` as constructor arguments.
- Use `langfuse.auth_check()` during setup (not in production) to confirm connectivity.
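For reference, a minimal environment setup might look like this (the key values below are placeholders; use the credentials from your Langfuse project settings):

```shell
export LANGFUSE_PUBLIC_KEY="pk-lf-..."                  # project public key
export LANGFUSE_SECRET_KEY="sk-lf-..."                  # project secret key
export LANGFUSE_BASE_URL="https://cloud.langfuse.com"   # or your self-hosted URL
```

With these set, `langfuse.auth_check()` should return `True` during setup.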
No traces appearing
- See Missing traces for common reasons and solutions.
- Confirm `tracing_enabled` is `True` and `sample_rate` is not `0.0`.
- Call `langfuse.shutdown()` (or `langfuse.flush()` in short-lived jobs) so queued data is exported.
- Enable debug logging (`debug=True` or `LANGFUSE_DEBUG="True"`) to inspect exporter output.
Incorrect nesting or missing spans
- Self-hosted users need Langfuse platform >= 3.63.0 for the OTel-based SDKs.
- Prefer context managers (`with langfuse.start_as_current_observation(...)`) to maintain OTel context.
- If using manual spans (`langfuse.start_span()`), always call `.end()`.
- In async code, rely on Langfuse helpers to avoid losing context across `await` boundaries.
LangChain/OpenAI integration issues
- Ensure Langfuse wrappers (`from langfuse.openai import openai` or `LangfuseCallbackHandler`) are instantiated before API calls.
- Check version compatibility between Langfuse, LangChain, and the model SDKs.
Media not appearing
- Use `LangfuseMedia` objects for audio/image payloads and inspect debug logs to surface upload errors (uploads run on background threads).
Missing traces with @vercel/otel
- Use the manual OpenTelemetry setup via `NodeSDK` and register the `LangfuseSpanProcessor`. The `@vercel/otel` helper does not yet support the OpenTelemetry JS SDK v2 that Langfuse depends on. See the TypeScript instrumentation docs for a full example.