What are the Best Practices for Providing Instrumentation for Spring AI? #12878
cc @asaikali
Thanks a lot! ;) @trask @asaikali
Thanks for reporting this issue. There might be some confusion/conflict among different strategies. There are three main ways to handle observability in a Spring application, and they can all be used to export OpenTelemetry data. I would recommend choosing only one of these strategies to avoid conflicts/incompatibilities/errors.
If you're using the agent, then neither option 1 nor option 2 should be added, to avoid unpredictable results. That said, I would actually recommend going with either option 1 or option 2. As for Spring AI, you can find a full example here, which exports OpenTelemetry logs, metrics, and traces: https://github.com/ThomasVitale/llm-apps-java-spring-ai/tree/main/observability/models/observability-models-openai
Sure, this solution makes sense. However, our applications have already been enhanced with `opentelemetry-java-instrumentation`. As you mentioned, the application can go with either option 1 or option 2. I believe we could also find a way to make it work well with either option 1 or option 3. This might require some adjustments in the …
Thanks for your perfect work! ;)
Background
Hi, we are currently working on providing automatic instrumentation capabilities for applications built using the Spring AI framework. Our goal is to enable users to obtain various observability data (mainly traces) without needing to modify their code after installing `opentelemetry-java-instrumentation` in their Spring AI application.

This sounds like a requirement for plugin support. However, we have found that Spring AI already supports a rich set of observability features, and its trace attributes adhere as closely as possible to the OTel semantic conventions. Therefore, we believe that simply exporting the observability data through `opentelemetry-java-instrumentation` is a more elegant solution. We have made some modifications to the demo application and successfully exported this data to Jaeger.

Making some necessary adaptations in the demo application can indeed achieve this effect, but we think there might be better ways to do it. We have also come across a memory leak issue, and below are the issues we are particularly concerned about.
List of Issues
Initialization of the OpenTelemetry SDK
As in other Spring applications, the underlying tracing capability of Spring AI is based on the Micrometer framework, which requires adding these dependencies:
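For reference, a typical set of dependencies for bridging Micrometer tracing to OpenTelemetry looks like this (a sketch; your build may differ, and versions are assumed to be managed by the Spring Boot dependency-management BOM):

```gradle
dependencies {
    // Spring Boot Actuator wires up the ObservationRegistry auto-configuration
    implementation 'org.springframework.boot:spring-boot-starter-actuator'
    // Bridges the Micrometer Tracing API to the OpenTelemetry API
    implementation 'io.micrometer:micrometer-tracing-bridge-otel'
    // Exports the resulting spans over OTLP
    implementation 'io.opentelemetry:opentelemetry-exporter-otlp'
}
```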
However, the implementation of `spring-boot-actuator-autoconfigure` does not detect the presence of `opentelemetry-java-instrumentation`; instead, it checks whether a bean of type `OpenTelemetry` exists in the application context. If none exists, it creates a new one, which causes the `OpenTelemetrySdk` provided by the Java agent not to be used by Micrometer.

Of course, we could add some auto-configuration strategies in Spring AI (or its distros, such as Spring AI Alibaba) to handle this logic for the user, but we believe it would be better if this behavior were managed by `opentelemetry-java-instrumentation` itself. In the meantime, one way to solve this issue is to explicitly declare a bean of type `OpenTelemetry` in a configuration class of the application:
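A minimal sketch of such a configuration class, assuming the application runs under the OpenTelemetry Java agent (so `GlobalOpenTelemetry` is initialized by the agent; the class name is illustrative):

```java
import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.api.OpenTelemetry;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class OpenTelemetryConfig {

    // Expose the agent-provided OpenTelemetry instance as a Spring bean so that
    // spring-boot-actuator-autoconfigure picks it up instead of creating its own SDK.
    @Bean
    public OpenTelemetry openTelemetry() {
        return GlobalOpenTelemetry.get();
    }
}
```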
The framework should fully implement its observability logic based on the OpenTelemetry API and should not need to notice the presence of a Java agent.

Potential Memory Leak
For most applications that depend on `spring-actuator`, Micrometer generates many metrics by default, which often face high-cardinality issues (e.g., the URI of a `RestTemplate` call is recorded as a tag by default). In `opentelemetry-java-instrumentation`, there is an auto-instrumentation for `spring-actuator` which adds a registry to Micrometer. This registry seems to bypass Micrometer's high-cardinality control, leading to dimension explosion and memory leaks.

This might not seem like a problem, because the risk of high-cardinality tags should be borne by the user. However, the current behavior is this: without `opentelemetry-java-instrumentation`, Micrometer's memory consumption is normal, bounded by the default configuration `maximumAllowableTags=100`.
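For reference, this limit corresponds to Micrometer's `MeterFilter.maximumAllowableTags` guard, which can also be configured explicitly on a registry; a sketch (the meter name prefix and limit are illustrative):

```java
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.config.MeterFilter;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

public class CardinalityGuard {
    public static void main(String[] args) {
        MeterRegistry registry = new SimpleMeterRegistry();
        // Allow at most 100 distinct values of the "uri" tag on http client metrics;
        // once the limit is reached, further time series are denied rather than accumulated.
        registry.config().meterFilter(
                MeterFilter.maximumAllowableTags("http.client.requests", "uri", 100, MeterFilter.deny()));
    }
}
```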
With `opentelemetry-java-instrumentation` attached, however, memory leaks occur, which may mislead users into thinking `opentelemetry-java-instrumentation`
itself is causing the memory leak. We are still investigating the details of this issue and would like to know whether the community has encountered similar problems. (I apologize for not finding a similar issue in this project.)

About the Support Plan for Spring AI
Currently, in the OpenTelemetry Java projects (`opentelemetry-java-instrumentation`, `opentelemetry-java`, and `opentelemetry-java-contrib`), I have seen no discussion about the Spring AI framework. In the long term, does the community plan to support this framework? Will a new instrumentation be provided, or will it rely on the library instrumentation within Spring, as we have done, despite it being based on `micrometer-tracing`?

Demo
To give you a little more context, I have created a repository that contains a simple Spring AI application demo.
Note that before running the demo, you need to obtain an API key from an LLM provider, such as OpenAI or DashScope.
Additional Context
Spring AI and its Observability: https://docs.spring.io/spring-ai/reference/observability/index.html