Chunk Polygons: Download per chunk

Hi all!

I am working with imagery over different waterbodies in a country. For now I take an area of interest and use chunk_polygon to map my operations over the reservoirs. This has some consequences:

  • This results in a lot of “empty” area: I am only interested in the waterbodies and a small region around them.
  • When downloading, I obtain one large NetCDF with all the reservoirs in it and a lot of NaN values.

Is there a way to really split these calculations and the resulting raster files when using chunk_polygon, or should I just send a job per reservoir?
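One way to sidestep the single large NetCDF is to split the area of interest into one geometry per reservoir up front and then run a separate job per geometry. A minimal sketch of the splitting step, using only the standard library (the feature names and the one-job-per-feature idea are illustrative assumptions, not an established openEO pattern):

```python
import json


def split_feature_collection(fc):
    """Split a GeoJSON FeatureCollection into one FeatureCollection per feature.

    Each returned collection could then be used as the geometry of its own
    batch job (e.g. via filter_spatial), so every reservoir produces its
    own, much smaller raster file instead of one sparse NetCDF.
    """
    return [
        {"type": "FeatureCollection", "features": [feature]}
        for feature in fc["features"]
    ]


# Example with two hypothetical reservoir polygons:
reservoirs = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature", "properties": {"name": "reservoir_a"},
         "geometry": {"type": "Polygon",
                      "coordinates": [[[4.0, 51.0], [4.1, 51.0],
                                       [4.1, 51.1], [4.0, 51.0]]]}},
        {"type": "Feature", "properties": {"name": "reservoir_b"},
         "geometry": {"type": "Polygon",
                      "coordinates": [[[4.2, 51.2], [4.3, 51.2],
                                       [4.3, 51.3], [4.2, 51.2]]]}},
    ],
}

per_reservoir = split_feature_collection(reservoirs)
for fc in per_reservoir:
    name = fc["features"][0]["properties"]["name"]
    # In a real workflow, this is where you would create and start one
    # batch job per reservoir, e.g. with the openEO Python client.
    print(name, len(json.dumps(fc)))
```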

Thanks in advance,

Hi Jaap,
I believe the solution to this is described here:

Can you give that a try and let me know if you have further questions?

best regards,

Thanks @jeroen.dries !

I have started to implement this, but I am getting:

OpenEoApiError: [500] Internal: Server error: NotImplementedError("filter_spatial only supports dict but got DelayedVector('')") (ref: r-9f57f006ea57436d9b1eb97862ca197e)

Any ideas what may be the cause?

Hi @jaaplangemeijer,

I just committed the fix (support delayed_vector in filter_spatial · Open-EO/openeo-python-driver@6ba2564 · GitHub). After the automatic tests finish, it should be available on the dev environment.

Great @jeroen.verstraelen ! Is the data available on the test endpoint identical to live?

Yes, the data available on should be identical to

Hi Jeroen,

The job is looking good, but I am not getting anything back (see for example job j-9cbbd522f2b84ef49a13c69c65b9c133).

Any ideas why?

@jeroen.verstraelen While debugging, I also found that I do not have credentials to view objects, so downloading fails with job not found: r-0521011d54314944aef7781e026eb378

Hi @jaaplangemeijer, sorry for the late reply. I checked the logs for the job id you provided and it seems to have run without issues. The logs show that it wrote 0 assets to the file system, which usually means there was no available data for the collection + date + geometry you provided. Could you run the job again with just a simple load_collection and filter_spatial, to verify the input data of your UDFs? It’s best to go over it step by step, to make sure none of your steps actually outputs empty tiles.
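For reference, the kind of stripped-down verification job suggested above can be expressed as a minimal openEO process graph containing only load_collection, filter_spatial and save_result. The collection id, dates and geometry below are placeholders, not the actual values from this thread:

```python
# Minimal debug process graph: load a collection, clip it to the vector
# geometries, and save the result -- nothing else, so an empty output
# points at the input data rather than at later processing steps (UDFs etc.).
debug_graph = {
    "load1": {
        "process_id": "load_collection",
        "arguments": {
            "id": "SENTINEL2_L2A",                        # placeholder collection id
            "temporal_extent": ["2021-06-01", "2021-06-30"],  # placeholder dates
            "spatial_extent": None,   # let filter_spatial do the clipping
            "bands": None,
        },
    },
    "filter1": {
        "process_id": "filter_spatial",
        "arguments": {
            "data": {"from_node": "load1"},
            # Inline placeholder geometry; in practice this could also be
            # a vector file, as discussed above.
            "geometries": {"type": "Polygon",
                           "coordinates": [[[4.0, 51.0], [4.1, 51.0],
                                            [4.1, 51.1], [4.0, 51.0]]]},
        },
    },
    "save1": {
        "process_id": "save_result",
        "arguments": {"data": {"from_node": "filter1"}, "format": "netCDF"},
        "result": True,
    },
}
```

If this graph already produces 0 assets, the collection/date/geometry combination has no data; if it produces output, the problem lies in a later step.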

For the request id you provided, I can see the following in the logs:

OpenEOApiException(status_code=404, code='NotFound', message='404 Not Found: The requested URL was not found on the server. If you entered the URL manually please check your spelling and try again.', id='r-0521011d54314944aef7781e026eb378')

So it looks like it was not a permission error but the file you requested was not present in the output directory.

Hope this helps! If you have any more questions feel free to let me know.

Hi @jeroen.verstraelen !

I fixed one issue: my GeoJSON used lat/lon by accident.
Now that that is fixed, I am running into a strange error. I checked, and loading the data goes fine, but at the filter_spatial step I get an error, job_id j-68a108a59d4e4231a08f43b9c18967d2 (vito-dev backend):

geotrellis.vector.ExtentRangeError: Invalid Extent: xmin must be less than xmax (xmin=592760.0, xmax=569740.0)

I do not see anything wrong with the GeoJSON I used.
This is the spatial_extent I am using in load_collection:

  'west': 569731.9886288806,
  'east': 592514.3105299465,
  'south': 5435408.218925263,
  'north': 5453306.660375843,
  'crs': 'EPSG:32633'
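The ExtentRangeError above means the backend ended up with a bounding box whose xmin is larger than its xmax. A quick local sanity check can catch this kind of problem before submitting a job; note that the lat/lon detection below is just a rough magnitude heuristic, not an openEO rule:

```python
def check_extent(west, south, east, north):
    """Return a list of likely problems with a bounding box."""
    problems = []
    if west >= east:
        problems.append(f"xmin must be less than xmax (xmin={west}, xmax={east})")
    if south >= north:
        problems.append(f"ymin must be less than ymax (ymin={south}, ymax={north})")
    # Heuristic: values within +/-180 and +/-90 look like lat/lon degrees,
    # while UTM eastings/northings are typically in the 1e5-1e7 range.
    if (all(abs(v) <= 180 for v in (west, east))
            and all(abs(v) <= 90 for v in (south, north))):
        problems.append("coordinates look like lat/lon degrees; check that the "
                        "declared CRS (e.g. EPSG:32633) matches the values")
    return problems


# The spatial_extent from the post above is itself valid:
print(check_extent(569731.9886288806, 5435408.218925263,
                   592514.3105299465, 5453306.660375843))   # -> []

# ...but a box like the one reported in the error is not:
print(check_extent(592760.0, 5435408.0, 569740.0, 5453306.0))
```

Since the load_collection extent passes this check, the inverted extent in the error most likely comes from the filter_spatial geometries, e.g. projected coordinates in a file the backend interprets as lat/lon.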

Can you help me solve the issue?