Datacube cannot be stored locally

This is a general question relating to openEO Platform:

  • Short description: I used the following code (adapted from the openEO API examples) to request a GTiff aggregate from the Sentinel-1 collection; it runs on a virtualized server. Before the result can be saved locally (I am not sure whether that step can be executed safely), the batch job fails with the error below; I include the log.
  • Specific request/idea/comment:

Please advise; authorization for the service has been granted correctly via EGI.
Chris Kiranoudis, Professor NTUA

Python Code:
import openeo

datacube = connection.load_collection(
    "SENTINEL1_GRD",
    spatial_extent={"west": 16.06, "south": 48.06, "east": 16.07, "north": 48.07},
    temporal_extent=["2017-03-01", "2017-03-10"],
    bands=["VV", "HH"]
)
datacube = datacube.min_time()
print(datacube.flat_graph()["loadcollection1"])
result = datacube.save_result("GTiff")
#job = datacube.send_job()
job = datacube.create_job()
job.start_and_wait()
job.get_results().download_files("result.tif")

Output:
{'process_id': 'load_collection', 'arguments': {'bands': ['VV', 'HH'], 'id': 'SENTINEL1_GRD', 'spatial_extent': {'west': 16.06, 'south': 48.06, 'east': 16.07, 'north': 48.07}, 'temporal_extent': ['2017-03-01', '2017-03-10']}}
0:00:00 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': send 'start'
0:03:10 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:03:18 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:03:26 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:03:36 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:03:47 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:04:01 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:04:20 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:04:40 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:05:05 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:05:36 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': queued (progress N/A)
0:06:14 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': running (progress N/A)
0:07:03 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': running (progress N/A)
0:08:02 Job 'vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af': error (progress N/A)

Your batch job ‘vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af’ failed.
Logs can be inspected in an openEO (web) editor or with connection.job('vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af').logs().
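For reference, the same logs can also be fetched programmatically; a minimal sketch, assuming the authenticated connection used above and the job id from the output (each log entry carries an id, a level and a message):

import openeo

# Assumes `connection` is the same authenticated openEO connection as above.
job = connection.job("vito-8a6227c8-cfcf-403f-8ef6-9c355365f8af")
for entry in job.logs():
    # Print the level and message of every batch-job log entry.
    print(entry["level"], entry["message"])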

Log in the openEO Web Editor:
error processing batch job Traceback (most recent call last): File “batch_job.py”, line 307, in main run_driver() File “batch_job.py”, line 280, in run_driver run_job( File “/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_2282/container_e5020_1650104936572_2282_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/utils.py”, line 41, in memory_logging_wrapper return function(*args, **kwargs) File “batch_job.py”, line 375, in run_job assets_metadata = result.write_assets(str(output_file)) File “/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_2282/container_e5020_1650104936572_2282_01_000001/venv/lib/python3.8/site-packages/openeo_driver/save_result.py”, line 110, in write_assets return self.cube.write_assets(filename=directory, format=self.format, format_options=self.options) File “/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_2282/container_e5020_1650104936572_2282_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/geopysparkdatacube.py”, line 1584, in write_assets self._get_jvm().org.openeo.geotrellis.geotiff.package.saveRDD(max_level.srdd.rdd(),band_count,str(filePath),zlevel,self._get_jvm().scala.Option.apply(crop_extent),gtiff_options) File “/opt/spark3_2_0/python/lib/py4j-0.10.9.2-src.zip/py4j/java_gateway.py”, line 1309, in call return_value = get_return_value( File “/opt/spark3_2_0/python/lib/py4j-0.10.9.2-src.zip/py4j/protocol.py”, line 326, in get_return_value raise Py4JJavaError( py4j.protocol.Py4JJavaError: An error occurred while calling z:org.openeo.geotrellis.geotiff.package.saveRDD. : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 5) (epod111.vgt.vito.be executor 4): org.openeo.geotrellissentinelhub.SentinelHubException: Sentinel Hub returned an error response: HTTP/1.1 400 Bad Request with body: {“error”:{“status”:400,“reason”:“Bad Request”,“message”:“Requested band ‘HH’ is not present in Sentinel 1 tile ‘S1A_IW_GRDH_1SDV_20170309T165033_20170309T165058_015618_019AE9_132D’ returned by criteria specified in dataFilter parameter.”,“code”:“RENDERER_S1_MISSING_POLARIZATION”}} request: POST https://services.sentinel-hub.com/api/v1/process with body: { “input”: { “bounds”: { “bbox”: [578960.0, 5322080.0, 581520.0, 5324640.0], “properties”: { “crs”: “http://www.opengis.net/def/crs/EPSG/0/32633” } }, “data”: [ { “type”: “sentinel-1-grd”, “dataFilter”: {“timeRange”:{“from”:“2017-03-09T00:00:00Z”,“to”:“2017-03-10T00:00:00Z”}}, “processing”: {“backCoeff”:“GAMMA0_TERRAIN”,“orthorectify”:true} } ] }, “output”: { “width”: 256, “height”: 256, “responses”: [ { “identifier”: “default”, “format”: { “type”: “image/tiff” } } ] }, “evalscript”: “//VERSION=3\nfunction setup() {\n return {\n input: [{\n “bands”: [“VV”, “HH”]\n }],\n output: {\n bands: 2,\n sampleType: “FLOAT32”,\n }\n };\n}\n\nfunction evaluatePixel(sample) {\n return [sample.VV, sample.HH];\n}” } at org.openeo.geotrellissentinelhub.SentinelHubException$.apply(SentinelHubException.scala:19) at org.openeo.geotrellissentinelhub.DefaultProcessApi.$anonfun$getTile$8(ProcessApi.scala:125) at org.openeo.geotrellissentinelhub.DefaultProcessApi.$anonfun$getTile$8$adapted(ProcessApi.scala:119) at scalaj.http.HttpRequest.$anonfun$toResponse$17(Http.scala:422) at scala.Option.getOrElse(Option.scala:189) at scalaj.http.HttpRequest.$anonfun$toResponse$14(Http.scala:414) at scala.Option.getOrElse(Option.scala:189) at 
scalaj.http.HttpRequest.toResponse(Http.scala:414) at scalaj.http.HttpRequest.doConnection(Http.scala:368) at scalaj.http.HttpRequest.exec(Http.scala:343) at org.openeo.geotrellissentinelhub.DefaultProcessApi.$anonfun$getTile$7(ProcessApi.scala:119) at org.openeo.geotrellissentinelhub.package$$anon$1.get(package.scala:60) at net.jodah.failsafe.Functions.lambda$get$0(Functions.java:46) at net.jodah.failsafe.RetryPolicyExecutor.lambda$supply$0(RetryPolicyExecutor.java:65) at net.jodah.failsafe.Execution.executeSync(Execution.java:128) at net.jodah.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:378) at net.jodah.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:68) at org.openeo.geotrellissentinelhub.package$.withRetries(package.scala:59) at org.openeo.geotrellissentinelhub.DefaultProcessApi.getTile(ProcessApi.scala:118) at org.openeo.geotrellissentinelhub.PyramidFactory.$anonfun$datacube_seq$1(PyramidFactory.scala:195) at org.openeo.geotrellissentinelhub.MemoizedRlGuardAdapterCachedAccessTokenWithAuthApiFallbackAuthorizer.authorized(Authorizer.scala:46) at org.openeo.geotrellissentinelhub.PyramidFactory.authorized(PyramidFactory.scala:57) at org.openeo.geotrellissentinelhub.PyramidFactory.org$openeo$geotrellissentinelhub$PyramidFactory$$getTile$1(PyramidFactory.scala:193) at org.openeo.geotrellissentinelhub.PyramidFactory.org$openeo$geotrellissentinelhub$PyramidFactory$$dataTile$1(PyramidFactory.scala:201) at org.openeo.geotrellissentinelhub.PyramidFactory.loadMasked$1(PyramidFactory.scala:226) at org.openeo.geotrellissentinelhub.PyramidFactory.$anonfun$datacube_seq$18(PyramidFactory.scala:294) at scala.collection.Iterator$$anon$10.next(Iterator.scala:459) at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:512) at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458) at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458) at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458) at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458) at scala.collection.Iterator.foreach(Iterator.scala:941) at scala.collection.Iterator.foreach$(Iterator.scala:941) at scala.collection.AbstractIterator.foreach(Iterator.scala:1429) at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:307) at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:670) at org.apache.spark.api.python.BasePythonRunner$WriterThread.$anonfun$run$1(PythonRunner.scala:424) at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2019) at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:259) Driver stacktrace: at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2403) at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2352) at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2351) at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62) at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55) at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49) at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2351) at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1109) at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1109) at scala.Option.foreach(Option.scala:407) at 
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1109) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2591) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2533) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2522) at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49) at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:898) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2214) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2235) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2254) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2279) at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) at org.apache.spark.rdd.RDD.withScope(RDD.scala:414) at org.apache.spark.rdd.RDD.collect(RDD.scala:1029) at org.apache.spark.rdd.PairRDDFunctions.$anonfun$collectAsMap$1(PairRDDFunctions.scala:737) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) at org.apache.spark.rdd.RDD.withScope(RDD.scala:414) at org.apache.spark.rdd.PairRDDFunctions.collectAsMap(PairRDDFunctions.scala:736) at org.openeo.geotrellis.geotiff.package$.getCompressedTiles(package.scala:279) at org.openeo.geotrellis.geotiff.package$.saveRDDGeneric(package.scala:216) at org.openeo.geotrellis.geotiff.package$.saveRDD(package.scala:156) at org.openeo.geotrellis.geotiff.package.saveRDD(package.scala) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) at py4j.Gateway.invoke(Gateway.java:282) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182) at py4j.ClientServerConnection.run(ClientServerConnection.java:106) at java.base/java.lang.Thread.run(Thread.java:834) Caused by: org.openeo.geotrellissentinelhub.SentinelHubException: Sentinel Hub returned an error response: HTTP/1.1 400 Bad Request with body: {“error”:{“status”:400,“reason”:“Bad Request”,“message”:“Requested band ‘HH’ is not present in Sentinel 1 tile ‘S1A_IW_GRDH_1SDV_20170309T165033_20170309T165058_015618_019AE9_132D’ returned by criteria specified in dataFilter parameter.”,“code”:“RENDERER_S1_MISSING_POLARIZATION”}} request: POST https://services.sentinel-hub.com/api/v1/process with body: { “input”: { “bounds”: { “bbox”: [578960.0, 5322080.0, 581520.0, 5324640.0], “properties”: { “crs”: “http://www.opengis.net/def/crs/EPSG/0/32633” } }, “data”: [ { “type”: “sentinel-1-grd”, “dataFilter”: {“timeRange”:{“from”:“2017-03-09T00:00:00Z”,“to”:“2017-03-10T00:00:00Z”}}, “processing”: {“backCoeff”:“GAMMA0_TERRAIN”,“orthorectify”:true} } ] }, “output”: { “width”: 256, “height”: 256, “responses”: [ { 
“identifier”: “default”, “format”: { “type”: “image/tiff” } } ] }, “evalscript”: “//VERSION=3\nfunction setup() {\n return {\n input: [{\n “bands”: [“VV”, “HH”]\n }],\n output: {\n bands: 2,\n sampleType: “FLOAT32”,\n }\n };\n}\n\nfunction evaluatePixel(sample) {\n return [sample.VV, sample.HH];\n}” } at org.openeo.geotrellissentinelhub.SentinelHubException$.apply(SentinelHubException.scala:19) at org.openeo.geotrellissentinelhub.DefaultProcessApi.$anonfun$getTile$8(ProcessApi.scala:125) at org.openeo.geotrellissentinelhub.DefaultProcessApi.$anonfun$getTile$8$adapted(ProcessApi.scala:119) at scalaj.http.HttpRequest.$anonfun$toResponse$17(Http.scala:422) at scala.Option.getOrElse(Option.scala:189) at scalaj.http.HttpRequest.$anonfun$toResponse$14(Http.scala:414) at scala.Option.getOrElse(Option.scala:189) at scalaj.http.HttpRequest.toResponse(Http.scala:414) at scalaj.http.HttpRequest.doConnection(Http.scala:368) at scalaj.http.HttpRequest.exec(Http.scala:343) at org.openeo.geotrellissentinelhub.DefaultProcessApi.$anonfun$getTile$7(ProcessApi.scala:119) at org.openeo.geotrellissentinelhub.package$$anon$1.get(package.scala:60) at net.jodah.failsafe.Functions.lambda$get$0(Functions.java:46) at net.jodah.failsafe.RetryPolicyExecutor.lambda$supply$0(RetryPolicyExecutor.java:65) at net.jodah.failsafe.Execution.executeSync(Execution.java:128) at net.jodah.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:378) at net.jodah.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:68) at org.openeo.geotrellissentinelhub.package$.withRetries(package.scala:59) at org.openeo.geotrellissentinelhub.DefaultProcessApi.getTile(ProcessApi.scala:118) at org.openeo.geotrellissentinelhub.PyramidFactory.$anonfun$datacube_seq$1(PyramidFactory.scala:195) at org.openeo.geotrellissentinelhub.MemoizedRlGuardAdapterCachedAccessTokenWithAuthApiFallbackAuthorizer.authorized(Authorizer.scala:46) at org.openeo.geotrellissentinelhub.PyramidFactory.authorized(PyramidFactory.scala:57) at org.openeo.geotrellissentinelhub.PyramidFactory.org$openeo$geotrellissentinelhub$PyramidFactory$$getTile$1(PyramidFactory.scala:193) at org.openeo.geotrellissentinelhub.PyramidFactory.org$openeo$geotrellissentinelhub$PyramidFactory$$dataTile$1(PyramidFactory.scala:201) at org.openeo.geotrellissentinelhub.PyramidFactory.loadMasked$1(PyramidFactory.scala:226) at org.openeo.geotrellissentinelhub.PyramidFactory.$anonfun$datacube_seq$18(PyramidFactory.scala:294) at scala.collection.Iterator$$anon$10.next(Iterator.scala:459) at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:512) at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458) at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458) at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458) at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458) at scala.collection.Iterator.foreach(Iterator.scala:941) at scala.collection.Iterator.foreach$(Iterator.scala:941) at scala.collection.AbstractIterator.foreach(Iterator.scala:1429) at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:307) at org.apache.spark.api.python.PythonRunner$$anon$2.writeIteratorToStream(PythonRunner.scala:670) at org.apache.spark.api.python.BasePythonRunner$WriterThread.$anonfun$run$1(PythonRunner.scala:424) at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2019) at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:259)
error
ID: error

Hi,
that collection is indeed slightly more complex: it contains all Sentinel-1 GRD products, so it needs some extra filtering to avoid ending up with incorrect combinations of products. In your case the matched product is a dual-polarisation VV+VH ("DV") acquisition, so it has no "HH" band, which is exactly what the Sentinel Hub error reports.
The example below adds an extra filter on polarization, and I also recommend adding the sar_backscatter process explicitly.

import openeo

datacube = connection.load_collection(
    "SENTINEL1_GRD",
    spatial_extent={"west": 16.06, "south": 48.06, "east": 16.07, "north": 48.07},
    temporal_extent=["2017-03-01", "2017-03-10"],
    bands=["VV", "VH"],
    # Only load dual-polarisation (VV+VH) products, so every tile has the requested bands.
    properties={"polarization": lambda p: p == "DV"}
).sar_backscatter()  # compute backscatter explicitly
datacube = datacube.min_time()
datacube.download("backscatter.tiff")
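
If you prefer to keep the original batch-job workflow instead of the synchronous download, the same cube can also be run as a job; a minimal sketch (the job title and output directory name are illustrative):

# Run the corrected graph as a batch job and download all result assets.
result = datacube.save_result(format="GTiff")
job = result.create_job(title="S1 backscatter, min over time")
job.start_and_wait()
# download_files() expects a target directory, not a single file path.
job.get_results().download_files("s1_backscatter_results")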