Now that you have the chatbot app running, you can also use Elastic Observability to monitor the application. Elastic is 100% OpenTelemetry native: it ingests and retains OTel data without requiring any data translation.
We've already configured the chatbot app with OpenTelemetry (using EDOT, the Elastic Distribution of OpenTelemetry).

How do I get traces?
Set OTEL_SDK_DISABLED=false
in your .env file; the app will then send logs, metrics, and traces to the local Elastic deployment.
See the OpenTelemetry section of the chatbot app README for details.
Tracing works for all supported LLM providers: OpenAI, Azure OpenAI, AWS Bedrock, and Google Vertex AI.
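As a minimal sketch, the .env change might look like the fragment below. Only OTEL_SDK_DISABLED comes from this guide; the endpoint and service name lines are illustrative placeholders that use standard OpenTelemetry SDK environment variable names, so adjust them to match your deployment.

```shell
# Enable telemetry export from the app's OpenTelemetry SDK
OTEL_SDK_DISABLED=false

# Illustrative placeholders (standard OTel SDK variables; not from this guide)
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
OTEL_SERVICE_NAME=chatbot-app
```

With these set, the SDK exports logs, metrics, and traces over OTLP to the configured endpoint without any code changes.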
LLM Dashboard
You can also see how well the LLM is performing through the LLM Observability integrations for OpenAI, Azure OpenAI, Google Vertex AI, and AWS Bedrock.
These are easily enabled by adding the corresponding integration in Elastic.
Read more about OTel and LLM observability on Elastic Observability Labs.