Now that you have the chatbot app running, you can also use Elastic Observability to monitor the application. Elastic is 100% OpenTelemetry native: it ingests and retains OTel data without requiring any data translation.

We've already configured the chatbot app with OpenTelemetry, using the Elastic Distribution of OpenTelemetry (EDOT).

How do I get traces?

Set OTEL_SDK_DISABLED=false in your .env file, and the app will send logs, metrics, and traces to the local Elastic deployment.
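For reference, here is a minimal sketch of what the relevant .env entries might look like. OTEL_SDK_DISABLED is the setting mentioned above; the other variables are standard OpenTelemetry SDK environment variables, and the service name and endpoint shown are illustrative assumptions for a local setup rather than values taken from the app.

```
# Enable telemetry export (the SDK is disabled when this is true)
OTEL_SDK_DISABLED=false

# Standard OpenTelemetry SDK settings -- illustrative values only;
# use the endpoint and service name for your own local Elastic deployment.
OTEL_SERVICE_NAME=chatbot-rag-app
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
```

Check the chatbot app README for the authoritative list of variables and values expected by the app.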

See the OpenTelemetry section of the chatbot app README for details.

Tracing works for all of the supported LLM providers: OpenAI, Azure OpenAI, AWS Bedrock, and Google Vertex AI.

LLM Dashboard

You can also see how well the LLM is performing through the LLM Observability integrations for OpenAI, Azure OpenAI, Google Vertex AI, and AWS Bedrock.

These are easily turned on by adding the corresponding integration in Elastic.

Read more about OpenTelemetry and LLM observability on Elastic Observability Labs.

Ready to build state of the art search experiences?

Sufficiently advanced search isn’t achieved through the efforts of one. Elasticsearch is powered by data scientists, ML ops, engineers, and many more who are just as passionate about search as you are. Let’s connect and work together to build the magical search experience that will get you the results you want.

Try it yourself