Description
I have a simple module that downloads Avro data files (~50-200 files, each between 1-50 MB) from S3 and indexes the data into Elasticsearch.
This module runs in a Docker container (within Kubernetes).
I was trying to use aiomultiprocess to speed up the process by running it in parallel with more resources (4 cores). I noticed that the module gets stuck too often (it keeps running but does nothing), and after a long investigation I found that it's a memory issue (although I didn't get an Out Of Memory kill event from Kubernetes).
Is there a way to raise an exception in such a case? I want to be alerted when my app gets stuck so I can tune the memory and rerun the tasks.
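One way to get that alert, as a minimal sketch: a watchdog that checks the process RSS against the container's memory limit and raises before the cgroup limit is reached, so the task fails loudly instead of silently thrashing. The 2 GiB threshold, the 90% margin, and the `MemoryLimitApproaching` exception name are all assumptions for illustration, not part of the original code:

```python
import resource

# Assumed values: the 2g container limit from the question, and a 90%
# safety margin at which we choose to fail fast. Tune both as needed.
MEMORY_LIMIT_BYTES = 2 * 1024**3
SAFETY_MARGIN = 0.9


class MemoryLimitApproaching(RuntimeError):
    """Raised when the process RSS approaches the container memory limit."""


def check_memory(limit_bytes: int = MEMORY_LIMIT_BYTES,
                 margin: float = SAFETY_MARGIN) -> int:
    """Return peak RSS in bytes, raising if it nears the limit.

    Call this periodically from the task loop (e.g. alongside the
    heartbeat log) so a stuck, memory-starved run surfaces as an error.
    """
    # ru_maxrss is reported in KiB on Linux (bytes on macOS); this sketch
    # assumes the Linux container environment from the question.
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss * 1024
    if rss > limit_bytes * margin:
        raise MemoryLimitApproaching(
            f"RSS {rss} exceeds {margin:.0%} of limit {limit_bytes}")
    return rss
```

This only covers the current process; each pool worker would need to run the same check itself, since `getrusage(RUSAGE_SELF)` does not see child-process memory.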
Below is my attempt to reproduce this behavior with a simple task (just a stupid loop that fills memory) instead of downloading files and indexing them into the database.
Running the code below (and also here) with a memory limit of 2g never finishes, while changing it to 3g finishes successfully.
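The shape of such a repro can be sketched as follows. The chunk sizes and counts here are illustrative, not the original values, and the stdlib `ProcessPoolExecutor` stands in for `aiomultiprocess.Pool` (which exposes a similar awaitable map-style API) so the sketch runs without extra dependencies:

```python
import asyncio
import multiprocessing
from concurrent.futures import ProcessPoolExecutor


def fill_memory(n_chunks: int, chunk_mb: int = 1) -> int:
    """The 'stupid loop': grow RSS by holding chunk_mb-sized byte strings."""
    chunks = []
    for _ in range(n_chunks):
        chunks.append(b"x" * (chunk_mb * 1024 * 1024))
    return len(chunks)


async def main() -> list:
    # aiomultiprocess.Pool offers `await pool.map(...)`; the stdlib
    # executor below is a stand-in so this sketch is self-contained.
    # "fork" is the Linux default and matches the container environment.
    ctx = multiprocessing.get_context("fork")
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor(max_workers=4, mp_context=ctx) as pool:
        tasks = [loop.run_in_executor(pool, fill_memory, 8)
                 for _ in range(4)]
        return await asyncio.gather(*tasks)
```

With large enough chunk counts, the combined RSS of the four workers crosses the container limit, which is where the stall described above appears.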
Output (at some point only heartbeat logs keep repeating indefinitely):
Code:

`Dockerfile`:

`__main__.py`:

`app.py`:
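The Dockerfile itself was not captured above; a minimal sketch, assuming a plain Python base image, the two modules named above, and a pip-installed aiomultiprocess (base tag and dependency list are assumptions):

```dockerfile
FROM python:3.9-slim
WORKDIR /app
# aiomultiprocess is the library under discussion; its version is assumed
RUN pip install --no-cache-dir aiomultiprocess
COPY __main__.py app.py ./
CMD ["python", "__main__.py"]
```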
UPDATE:

The same happens in a non-aio version with `multiprocessing` :( (I added a relevant question on SO.)
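A plain-`multiprocessing` version of the same repro can be sketched as below; the worker and sizes are again illustrative. It also shows `maxtasksperchild=1`, a common mitigation when workers accumulate memory across tasks, since each worker process is recycled after one task and its memory returns to the OS:

```python
import multiprocessing


def fill_memory(n_chunks: int, chunk_mb: int = 1) -> int:
    """Same illustrative memory-filling loop as the aio version."""
    chunks = [b"x" * (chunk_mb * 1024 * 1024) for _ in range(n_chunks)]
    return len(chunks)


def run_pool() -> list:
    # "fork" matches the Linux container default; maxtasksperchild=1
    # recycles each worker after a single task to cap per-worker RSS.
    ctx = multiprocessing.get_context("fork")
    with ctx.Pool(processes=4, maxtasksperchild=1) as pool:
        return pool.map(fill_memory, [4] * 8)
```

Note that recycling workers does not raise an error when memory runs short; pairing it with an explicit RSS check inside the worker is what turns a silent stall into an actionable exception.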