OpenEO OOM error

When downloading data from the SENTINEL2_L1C_SENTINELHUB collection, I get an out-of-memory error, preceded by two warnings about downloading tiles:
Attempt 1 failed in context: getTile sentinel-2-l1c 2019-03-07T00:00Z
Attempt 1 failed in context: getTile sentinel-2-l1c 2019-06-12T00:00Z

<long stacktrace>
py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182) at py4j.ClientServerConnection.run(ClientServerConnection.java:106) at java.base/java.lang.Thread.run(Thread.java:834) Your batch job failed because workers used too much Python memory. The same task was attempted multiple times. Consider increasing executor-memoryOverhead or contact the developers to investigate.

Job id (VITO prod backend): "addb9eb5-4b09-49cf-8d96-16634b0a2aac"

I’d like to add that I was later applying a UDF to filter the data via chunk_polygon; perhaps the data volume per chunk was too large?

Indeed, you can increase memory yourself to solve this. For instance:

job_options = {
    "executor-memory": "2G",
    "executor-memoryOverhead": "4G",
    "executor-cores": "2",
}
cube.execute_batch(
    out_format="GTiff",
    job_options=job_options,
)

It may take some tuning to get the sizes right. Be aware that increasing memory also increases the cost of your job, but for some UDFs there is basically no other way.


Solved!
I can calculate the number of pixels in my largest area and time range and estimate from that how much memory the xarray will use. Thanks.
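The estimate described above can be sketched as a back-of-the-envelope calculation; all concrete numbers below (tile size, band and timestep counts, dtype) are illustrative assumptions, not values taken from this job:

```python
# Rough memory estimate for the xarray a UDF chunk will hold:
# pixels * bands * timesteps * bytes per value.
# All numbers below are illustrative placeholders.

width_px = 10980          # e.g. one full Sentinel-2 tile at 10 m resolution
height_px = 10980
n_bands = 4
n_timesteps = 30
bytes_per_value = 4       # float32

total_bytes = width_px * height_px * n_bands * n_timesteps * bytes_per_value
print(f"~{total_bytes / 1024**3:.1f} GiB per chunk")
```

If the result comfortably exceeds executor-memory plus executor-memoryOverhead, either raise those settings or shrink the chunk (fewer bands, a shorter temporal extent, or smaller polygons passed to chunk_polygon).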