Add Langfuse AutoGen Tracing to example#104
Conversation
Pull request overview
This PR adds Langfuse tracing support for AutoGen agent conversations using OpenTelemetry. It enables tracing of the AutoGen-based response generation pipeline by introducing a setup node that configures OpenTelemetry with a Langfuse span processor, and wrapping agent execution in custom spans to capture input/output attributes.
Changes:
- Added `setup_autogen_tracing` function to configure OpenTelemetry tracing with Langfuse
- Modified `generate_response` to create tracing spans around agent execution with relevant attributes
- Updated requirements.txt to include the `langfuse-langfusetracedataset-autogen` extra for kedro-datasets
- Added `autogen_tracer_langfuse` dataset configuration in genai-config.yml
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| kedro-agentic-workflows/src/kedro_agentic_workflows/pipelines/response_generation_autogen/pipeline.py | Added setup_tracing_node to pipeline and tracer_provider input to generate_response_node |
| kedro-agentic-workflows/src/kedro_agentic_workflows/pipelines/response_generation_autogen/nodes.py | Implemented setup_autogen_tracing function and added OpenTelemetry span instrumentation to generate_response |
| kedro-agentic-workflows/requirements.txt | Added langfuse-langfusetracedataset-autogen extra to kedro-datasets dependency |
| kedro-agentic-workflows/conf/base/genai-config.yml | Added autogen_tracer_langfuse dataset configuration and updated comments to include autogen mode |
ElenaKhaustova
left a comment
It doesn't seem to be working for me 😕
- I've installed `pip install ".[langfuse-langfusetracedataset-autogen]"` from the feat/add-autogen-support-to-langfusetrace branch
- Installed project requirements
- Double-checked that I have everything in place:

langfuse==3.12.1
opentelemetry-api==1.39.1
opentelemetry-exporter-otlp-proto-common==1.39.1
opentelemetry-exporter-otlp-proto-http==1.39.1
opentelemetry-proto==1.39.1
opentelemetry-sdk==1.39.1
opentelemetry-semantic-conventions==0.60b1

And I still get an error:

DatasetError: AutoGen mode requires langfuse with OpenTelemetry support. Install with: pip install 'langfuse[opentelemetry]'

@SajidAlamQB, can you please provide step-by-step installation instructions so I can try once again?
Hi @ElenaKhaustova, thanks for the review. I've fixed the installation issue.

Installation instructions:
- In kedro-academy:
- In kedro-datasets:
- Then in kedro-academy run:

I've verified traces appear correctly in the Langfuse dashboard:
ElenaKhaustova
left a comment
Thank you, @SajidAlamQB, this works for me now.
When testing, I've noticed the following warning. Do we understand why it is raised?
Hi @SajidAlamQB, I tried running the example in a Python 3.12 env and there seem to be some issues with pydantic. Did you try this example on Python 3.12? Is there a specific pydantic version we need to install?

DatasetError: An exception occurred when parsing config for dataset 'llm':
The `__modify_schema__` method is not supported in Pydantic v2. Use `__get_pydantic_json_schema__` instead
in class `SecretStr`.
For further information visit https://errors.pydantic.dev/2.12/u/custom-json-schema

Thank you
That warning comes from OpenTelemetry, because we always create a new provider. We should check whether an existing provider is already set, and only create a new one if needed.
Thanks @ravi-kumar-pilla, I have only tried with 3.11. This doesn't really sound specific to our dataset; I think there are some known Pydantic v2 compatibility issues in the LangChain/AutoGen stack. Maybe we pin pydantic below 2.10, but let me play around with it.
Hey @ravi-kumar-pilla, I tested on Python 3.12 and wasn't able to recreate your issue:
What version of `langchain-core` do you have?
Name: langchain-core
Name: pydantic
Hi Sajid, I tried with langchain-core 0.3.83 as well and still have issues. I will try to test with a new environment and let you know how it goes. Thank you
Hi @SajidAlamQB, the pydantic issue is resolved. I am getting a new error now. We usually set the credentials like below, I believe. Is there a change in how this is configured? Should I pass:

openai:
  openai_api_base: ""
  openai_api_key: ""
langfuse_credentials:
  public_key: ""
  secret_key: ""
  host: ""
  endpoint: ""
  openai:
    openai_api_base: ""
    openai_api_key: ""
This is resolved now; I've added credentials mapping.
My runs have always had the same visualisation as the image at the top of your post. The trace data is identical, with the same spans and hierarchy etc., but the graph layout difference might just be Langfuse rendering; I don't think this is related to any dataset changes.
@ravi-kumar-pilla, @SajidAlamQB After we moved `openai`:

openai:
  base_url: <openai-api-base>  # Optional, defaults to OpenAI default
  api_key: <openai-api-key>    # Optional if OPENAI_API_KEY is set

I would suggest keeping it, since it's a demo project and we don't want to overload the logic with key swapping.
Did they look the same with the
Hi @ElenaKhaustova, thanks for letting me know. I looked at our setup. Since we now follow what OpenAI recommends, we need to update the usage across LangfuseTraceDataset for the other modes as well, maybe in this PR or a separate PR. @SajidAlamQB Thank you
Hi @SajidAlamQB, it works well, thank you. I also see the graph issue Elena mentioned, but the tracing is working fine.
Yes, it's a good point! I'll do it in a separate PR.
This reverts commit 79d1449.
@ElenaKhaustova Just tried with OpenLit; it didn't have that render issue:
Should we switch Langfuse autogen back to OpenLit for the better trace visualisation?
Yes, I'm definitely for it; that way we sacrifice alignment, not user experience.
Switched from raw OTLP tracing to OpenLit for AutoGen after discussing with @ElenaKhaustova. Also split configs and requirements so the Langfuse and Opik tutorials can be installed independently (they have conflicting dependencies).

Installation:
- Install from Academy:
- If testing against local kedro-datasets changes:
- Run:

Tested on both Python 3.11 and 3.12. @ElenaKhaustova @ravi-kumar-pilla ready for re-review when you get a chance.
Hi @SajidAlamQB, it works fine, but the execution graph still looks like below, and I followed the steps you mentioned here.
Very strange; the rendering just seems intermittent.
I followed your steps from the above, and running this gives me the following error now. It looks like it's coming from this line (https://github.com/kedro-org/kedro-plugins/pull/1288/changes):

openlit.init(tracer=langfuse._otel_tracer, disable_batch=True, disable_metrics=True, disabled_instrumentors=["httpx"])

I tested with:

autogen-agentchat==0.7.5
autogen-core==0.7.5
autogen-ext==0.7.5
langfuse==3.12.1
openai==1.109.1
openlit==1.36.8

Based on where we are now, I see the following next steps to proceed:
Yes, it's the new openlit version.
Yeah, pinning the version works, but the rendering looks the same as @ravi-kumar-pilla posted above 😕
In their tutorial (https://langfuse.com/integrations/frameworks/autogen), they don't wrap the runs as we do with `with tracer.start_as_current_span("response_generation") as span:`. Does it look the same if following their example? Can we please look through the issues in Langfuse's repo to see if someone else is experiencing the same rendering issue and whether there's a workaround?
I've looked through Langfuse issues and found several related ones:
- #9427, "The langfuse graph is not plotting correctly": the exact same issue. Their response was: "This is a known limitation with Langfuse's agent graph visualization... The graph view is still in beta and doesn't always reflect the true execution flow for complex or nested agent setups."
- #10721, "LangGraph CallbackHandler Observations Do Not Nest Under Active Parent Span": similar, a related nesting issue.

There were a few others as well that were related. Our trace structure is correct; the JSON has the right structure. The issue is Langfuse's graph rendering: it's in beta and has known bugs. No workaround exists on our side, unfortunately.
I tested this already. Without the manual
@SajidAlamQB, thanks for looking into this! Let's then return to the original OTLP solution and add a note that there may be rendering issues. |








Related PR: kedro-org/kedro-plugins#1288

This PR adds use of `LangfuseTraceDataset` with `mode: autogen` to trace AutoGen agent conversations.

For the credentials:

Dependencies:
Install the following manually, as the change on the dataset side is not released yet:

uv pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http

Run example:

kedro run --pipeline autogen --params user_id=3
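For reference, the catalog entry and credentials for the autogen mode might look roughly like this. This is a hypothetical sketch: the `type` path is assumed, and the key names are taken from what appears earlier in this thread rather than from released documentation.

```yaml
# conf/base/genai-config.yml (sketch; type path assumed)
autogen_tracer_langfuse:
  type: langfuse.LangfuseTraceDataset  # assumed module path
  mode: autogen
  credentials: langfuse_credentials

# conf/local/credentials.yml (keys as posted earlier in this thread)
langfuse_credentials:
  public_key: ""
  secret_key: ""
  host: ""
```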