Xray exporter ignoring producer and consumer spanKinds #1773
Comments
Hi @dngferreira - thanks for raising this! Can you please post your full raw trace JSON for an example trace that shows this rendering issue? We can definitely take a look at modifying our translation logic to better represent producer/consumer models.
This is a basic trace from my app:

{
  "Id": "1-63d02a8d-32421dbb4a115db5d750a240",
  "Duration": 0.099,
  "LimitExceeded": false,
  "Segments": [
    {
      "Id": "29b790ee27a899dc",
      "Document": {
        "id": "29b790ee27a899dc",
        "name": "service1",
        "start_time": 1674586765.6290216,
        "trace_id": "1-63d02a8d-32421dbb4a115db5d750a240",
        "end_time": 1674586765.7281144,
        "fault": false,
        "error": false,
        "throttle": false,
        "http": {
          "request": {
            "url": "http://service1/9acde03382304ba68a6b8c1a68ecd930?path=request/9acde03382304ba68a6b8c1a68ecd930",
            "method": "GET",
            "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/104.0.0.0 Safari/537.36",
            "client_ip": "10.0.0.1"
          },
          "response": {
            "status": 200,
            "content_length": 0
          }
        },
        "aws": {
          "xray": {
            "auto_instrumentation": false,
            "sdk_version": "1.15.0",
            "sdk": "opentelemetry for python"
          }
        },
        "annotations": {
          "http_route": "/api/mgmt/v2.0/request/<id>",
          "http_flavor": "1.1",
          "otel_resource_telemetry_sdk_language": "python",
          "otel_resource_cloud_platform": "aws_eks",
          "otel_resource_telemetry_sdk_version": "1.15.0",
          "otel_resource_service_name": "idac-api"
        },
        "subsegments": [
          {
            "id": "ffd0b735dac8de1c",
            "name": "rabbitmq.us-east-1.amazonaws.com",
            "start_time": 1674586765.70156,
            "end_time": 1674586765.7026145,
            "fault": false,
            "error": false,
            "throttle": false,
            "aws": {
              "xray": {
                "auto_instrumentation": false,
                "sdk_version": "1.15.0",
                "sdk": "opentelemetry for python"
              }
            },
            "annotations": {
              "messaging_system": "rabbitmq",
              "messaging_temp_destination": true,
              "messaging_conversation_id": "conv1",
              "messaging_destination": "(temporary)"
            },
            "subsegments": [
              {
                "id": "d0ad60424f3e327f",
                "name": "rabbitmq.us-east-1.amazonaws.com",
                "start_time": 1674586765.7040555,
                "end_time": 1674586765.7110972,
                "fault": false,
                "error": false,
                "throttle": false,
                "aws": {
                  "xray": {
                    "auto_instrumentation": false,
                    "sdk_version": "1.15.0",
                    "sdk": "opentelemetry for python"
                  }
                },
                "annotations": {
                  "messaging_system": "rabbitmq",
                  "messaging_conversation_id": "conv1",
                  "messaging_destination": "ctag1.e52cbf68d2ea4aab9c608788b5f3e84d",
                  "messaging_operation": "receive"
                },
                "subsegments": [
                  {
                    "id": "f20227c1d32b1e16",
                    "name": "rabbitmq.us-east-1.amazonaws.com",
                    "start_time": 1674586765.7096183,
                    "end_time": 1674586765.7098505,
                    "fault": false,
                    "error": false,
                    "throttle": false,
                    "aws": {
                      "xray": {
                        "auto_instrumentation": false,
                        "sdk_version": "1.15.0",
                        "sdk": "opentelemetry for python"
                      }
                    },
                    "annotations": {
                      "messaging_system": "rabbitmq",
                      "messaging_temp_destination": true,
                      "messaging_conversation_id": "conv1",
                      "messaging_destination": "(temporary)"
                    },
                    "subsegments": [
                      {
                        "id": "e3920bd017647c5b",
                        "name": "rabbitmq.us-east-1.amazonaws.com",
                        "start_time": 1674586765.7122142,
                        "end_time": 1674586765.712929,
                        "fault": false,
                        "error": false,
                        "throttle": false,
                        "aws": {
                          "xray": {
                            "auto_instrumentation": false,
                            "sdk_version": "1.15.0",
                            "sdk": "opentelemetry for python"
                          }
                        },
                        "annotations": {
                          "messaging_system": "rabbitmq",
                          "messaging_conversation_id": "conv1",
                          "messaging_destination": "ctag1.da9ec7628e344c07a520df79047be2b4",
                          "messaging_operation": "receive"
                        }
                      }
                    ]
                  },
                  {
                    "id": "1c4fcc57fd8d43ce",
                    "name": "callback",
                    "start_time": 1674586765.704895,
                    "end_time": 1674586765.7070744,
                    "fault": false,
                    "error": false,
                    "throttle": false,
                    "aws": {
                      "xray": {
                        "auto_instrumentation": false,
                        "sdk_version": "1.15.0",
                        "sdk": "opentelemetry for python"
                      }
                    },
                    "annotations": {
                      "action": "requestStatus"
                    },
                    "subsegments": [
                      {
                        "id": "fe89cde7a82a648e",
                        "name": "redis.cache.amazonaws.com",
                        "start_time": 1674586765.7053726,
                        "end_time": 1674586765.7065356,
                        "fault": false,
                        "error": false,
                        "throttle": false,
                        "aws": {
                          "xray": {
                            "auto_instrumentation": false,
                            "sdk_version": "1.15.0",
                            "sdk": "opentelemetry for python"
                          }
                        },
                        "annotations": {
                          "net_transport": "ip_tcp",
                          "db_redis_args_length": 2,
                          "db_statement": "GET 9acde03382304ba68a6b8c1a68ecd930",
                          "db_system": "redis",
                          "db_redis_database_index": 0
                        },
                        "namespace": "remote"
                      }
                    ]
                  }
                ]
              }
            ]
          }
        ]
      }
    },
    {
      "Id": "399a9b7a0f2d4fa0",
      "Document": {
        "id": "399a9b7a0f2d4fa0",
        "name": "redis.cache.amazonaws.com",
        "start_time": 1674586765.7053726,
        "trace_id": "1-63d02a8d-32421dbb4a115db5d750a240",
        "end_time": 1674586765.7065356,
        "parent_id": "fe89cde7a82a648e",
        "inferred": true
      }
    }
  ]
}
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 30 days.
@Aneurysm9 can we mark this as a never-stale item? Want to keep it open for tracking.
Any progress here? I, too, would expect a Segment to be created from a Consumer span.
Any progress here?
We're likewise running into this issue in the system I work on. The services that use server/client span kinds show up fine in our CloudWatch traces, but we also have a service that processes items in a job queue, and we pass the trace headers along as part of the job definition using producer/consumer spans. Since we wrote the instrumentation for the job queue library, we could potentially just use server/client span kinds instead until this issue is fixed (a sketch of that workaround is below), but it'd be good if the X-Ray exporter could use the span kinds that reflect the actual behaviour.
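A minimal sketch of that workaround, assuming a dict-shaped job payload; the `queue`, `enqueue_job`, and `process_job` names here are hypothetical stand-ins for the real job queue library. The spans are semantically producer/consumer, but are emitted as CLIENT/SERVER so the current exporter renders both sides of the queue:

```python
# Sketch only: emit CLIENT/SERVER spans for a job queue until the
# exporter handles PRODUCER/CONSUMER. `queue`, `enqueue_job`, and
# `process_job` are hypothetical stand-ins for the real library.
from opentelemetry import trace, propagate
from opentelemetry.trace import SpanKind

tracer = trace.get_tracer("job-queue-instrumentation")

def enqueue_job(queue, payload: dict) -> None:
    # Semantically a PRODUCER span; emitted as CLIENT so the X-Ray
    # exporter turns it into a subsegment instead of dropping it.
    with tracer.start_as_current_span("job send", kind=SpanKind.CLIENT):
        propagate.inject(payload)  # trace headers travel in the job definition
        queue.put(payload)

def process_job(payload: dict) -> None:
    # Semantically a CONSUMER span; emitted as SERVER so the exporter
    # creates a segment, and hence a node on the service map.
    ctx = propagate.extract(payload)
    with tracer.start_as_current_span("job process", kind=SpanKind.SERVER, context=ctx):
        ...  # actual job handling goes here
```

The trade-off is that the service map then shows a synchronous call shape rather than the real asynchronous hand-off, which is exactly why exporter-level support for PRODUCER/CONSUMER would be preferable.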
Describe the bug
The X-Ray exporter only creates services for spans of kind Server or Client, ignoring Producer and Consumer spans.
Steps to reproduce
I have a scenario where two applications written in Python, S1 and S2, communicate through RabbitMQ.
S1 sends a message to RabbitMQ; S2 reads it, stores it in Redis, and returns a message to S1 through RabbitMQ on another channel.
Enable OpenTelemetry instrumentation with the Pika, Redis, and Flask instrumentations, all with default settings, sending to an ADOT Collector with an X-Ray exporter. A minimal sketch of this setup follows.
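For context, a rough sketch of that setup, assuming a sidecar ADOT Collector listening on localhost:4317 (the endpoint is an assumption, not taken from the actual deployment):

```python
# Minimal sketch of the instrumentation setup described above; the
# collector endpoint is an assumption (a local ADOT Collector running
# the awsxray exporter).
from flask import Flask
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.extension.aws.trace import AwsXRayIdGenerator
from opentelemetry.instrumentation.flask import FlaskInstrumentor
from opentelemetry.instrumentation.pika import PikaInstrumentor
from opentelemetry.instrumentation.redis import RedisInstrumentor

# X-Ray-compatible trace IDs; spans shipped over OTLP to the collector.
provider = TracerProvider(id_generator=AwsXRayIdGenerator())
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True))
)
trace.set_tracer_provider(provider)

app = Flask(__name__)
FlaskInstrumentor().instrument_app(app)  # SERVER spans: rendered by X-Ray
RedisInstrumentor().instrument()         # CLIENT spans: rendered by X-Ray
PikaInstrumentor().instrument()          # PRODUCER/CONSUMER spans: missing from the map
```

With this setup, the Flask (SERVER) and Redis (CLIENT) spans show up as expected, while the Pika spans carry the PRODUCER/CONSUMER kinds that the exporter mishandles.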
What did you expect to see?
On the service map I was expecting to see four services:
S1
RabbitMQ
S2
Redis
What did you see instead?
On the service map I see only two services:
S1, with the RabbitMQ and S2 spans folded in as subsegments
Redis
Environment
Additional context