Error loading Sentinel-3 SLSTR

Hi everyone,

When trying to load Sentinel-3 SLSTR Land Surface Temperature data, we receive an error stating that Sentinel Hub returned an error for the request:

library(openeo)

collections = "SENTINEL3_SLSTR"
openeo::describe_collection("SENTINEL3_SLSTR")

bands = "S9"
ext = list(west = -119.4715, south = 46.85667, east = -119.3554, north = 46.91514)

# change to login with your credentials
# con = openeo::connect("https://openeo.cloud")
# login()

procs = processes()

period_long = c("2018-05-01", "2022-05-31") 
Sen_cube = procs$load_collection(id = "SENTINEL3_SLSTR"
                                 , spatial_extent = ext
                                 , temporal_extent = period_long
                                 , bands = c("S9")
                                 , properties = list(
                                   "eo:cloud_cover" = function(x) x <= 50
                                   , "orbitDirection" = function(x) x == "DESCENDING"
                                 )
)

job_Sen = openeo::create_job(
  Sen_cube
)

openeo::start_job(job_Sen)

# ID: "vito-8a342504-52be-4064-be0e-4e359abdfe23"
openeo::log_job("vito-8a342504-52be-4064-be0e-4e359abdfe23")

When changing the period of the search to only include more recent months, the error does not occur and the data can be downloaded without issues.

period_recent = c("2022-01-01", "2022-05-31") 

Sen_cube_recent = procs$load_collection(id = "SENTINEL3_SLSTR"
                                 , spatial_extent = ext
                                 , temporal_extent = period_recent
                                 , bands = c("S9")
                                 , properties = list(
                                   "eo:cloud_cover" = function(x) x <= 50
                                   , "orbitDirection" = function(x) x == "DESCENDING"
                                 )
)


job_Sen_recent = openeo::create_job(
  Sen_cube_recent
)

openeo::start_job(job_Sen_recent)


# ID: "vito-cb62b053-fda4-4188-a970-87a92daae4f5"

Created on 2022-06-13 by the reprex package (v2.0.1)

Session info
sessioninfo::session_info()
#> ─ Session info ───────────────────────────────────────────────────────────────
#>  setting  value
#>  version  R version 4.2.0 (2022-04-22)
#>  os       Ubuntu 20.04.4 LTS
#>  system   x86_64, linux-gnu
#>  ui       X11
#>  language (EN)
#>  collate  en_US.UTF-8
#>  ctype    en_US.UTF-8
#>  tz       Europe/Berlin
#>  date     2022-06-13
#>  pandoc   2.17.1.1 @ /usr/lib/rstudio/bin/quarto/bin/ (via rmarkdown)
#> 
#> ─ Packages ───────────────────────────────────────────────────────────────────
#>  package     * version date (UTC) lib source
#>  abind       * 1.4-5   2016-07-21 [2] CRAN (R 4.1.0)
#>  assertthat    0.2.1   2019-03-21 [2] CRAN (R 4.1.0)
#>  base64enc     0.1-3   2015-07-28 [2] CRAN (R 4.1.0)
#>  class         7.3-20  2022-01-13 [4] CRAN (R 4.1.2)
#>  classInt      0.4-3   2020-04-07 [2] CRAN (R 4.1.0)
#>  cli           3.3.0   2022-04-25 [1] CRAN (R 4.2.0)
#>  crayon        1.5.1   2022-03-26 [1] CRAN (R 4.2.0)
#>  DBI           1.1.2   2021-12-20 [1] CRAN (R 4.2.0)
#>  digest        0.6.29  2021-12-01 [1] CRAN (R 4.2.0)
#>  dplyr         1.0.9   2022-04-28 [1] CRAN (R 4.2.0)
#>  e1071         1.7-11  2022-06-07 [1] RSPM (R 4.2.0)
#>  ellipsis      0.3.2   2021-04-29 [2] CRAN (R 4.1.0)
#>  evaluate      0.15    2022-02-18 [1] CRAN (R 4.2.0)
#>  fansi         1.0.3   2022-03-24 [1] CRAN (R 4.2.0)
#>  fastmap       1.1.0   2021-01-25 [1] CRAN (R 4.2.0)
#>  fs            1.5.2   2021-12-08 [1] CRAN (R 4.2.0)
#>  generics      0.1.2   2022-01-31 [1] RSPM (R 4.2.0)
#>  glue          1.6.2   2022-02-24 [1] CRAN (R 4.2.0)
#>  highr         0.9     2021-04-16 [2] CRAN (R 4.1.0)
#>  htmltools     0.5.2   2021-08-25 [1] CRAN (R 4.2.0)
#>  httr2         0.2.1   2022-05-10 [1] RSPM (R 4.2.0)
#>  IRdisplay     1.1     2022-01-04 [1] CRAN (R 4.2.0)
#>  jsonlite      1.8.0   2022-02-22 [1] CRAN (R 4.2.0)
#>  KernSmooth    2.23-20 2021-05-03 [4] CRAN (R 4.0.5)
#>  knitr         1.39    2022-04-26 [1] CRAN (R 4.2.0)
#>  lifecycle     1.0.1   2021-09-24 [1] CRAN (R 4.2.0)
#>  lubridate     1.8.0   2021-10-07 [1] CRAN (R 4.2.0)
#>  lwgeom        0.2-8   2021-10-06 [1] CRAN (R 4.2.0)
#>  magrittr      2.0.3   2022-03-30 [1] CRAN (R 4.2.0)
#>  openeo      * 1.2.0   2022-05-09 [1] CRAN (R 4.2.0)
#>  pillar        1.7.0   2022-02-01 [1] CRAN (R 4.2.0)
#>  pkgconfig     2.0.3   2019-09-22 [2] CRAN (R 4.1.0)
#>  proxy         0.4-27  2022-06-09 [1] RSPM (R 4.2.0)
#>  purrr         0.3.4   2020-04-17 [2] CRAN (R 4.1.0)
#>  R6            2.5.1   2021-08-19 [2] CRAN (R 4.1.1)
#>  rappdirs      0.3.3   2021-01-31 [2] CRAN (R 4.1.0)
#>  Rcpp          1.0.8.3 2022-03-17 [1] CRAN (R 4.2.0)
#>  repr          1.1.4   2022-01-04 [1] CRAN (R 4.2.0)
#>  reprex        2.0.1   2021-08-05 [1] RSPM (R 4.2.0)
#>  rlang         1.0.2   2022-03-04 [1] CRAN (R 4.2.0)
#>  rmarkdown     2.14    2022-04-25 [1] CRAN (R 4.2.0)
#>  rstudioapi    0.13    2020-11-12 [2] CRAN (R 4.1.0)
#>  sessioninfo   1.2.2   2021-12-06 [1] RSPM (R 4.2.0)
#>  sf          * 1.0-7   2022-03-07 [1] CRAN (R 4.2.0)
#>  stars       * 0.5-5   2021-12-19 [1] CRAN (R 4.2.0)
#>  stringi       1.7.6   2021-11-29 [1] CRAN (R 4.2.0)
#>  stringr       1.4.0   2019-02-10 [2] CRAN (R 4.1.0)
#>  tibble        3.1.7   2022-05-03 [1] CRAN (R 4.2.0)
#>  tidyselect    1.1.2   2022-02-21 [1] RSPM (R 4.2.0)
#>  units         0.8-0   2022-02-05 [1] CRAN (R 4.2.0)
#>  utf8          1.2.2   2021-07-24 [1] CRAN (R 4.2.0)
#>  vctrs         0.4.1   2022-04-13 [1] CRAN (R 4.2.0)
#>  withr         2.5.0   2022-03-03 [1] RSPM (R 4.2.0)
#>  xfun          0.31    2022-05-10 [1] CRAN (R 4.2.0)
#>  yaml          2.3.5   2022-02-21 [1] CRAN (R 4.2.0)
#> 
#>  [1] /home/carola/R/x86_64-pc-linux-gnu-library/4.2
#>  [2] /usr/local/lib/R/site-library
#>  [3] /usr/lib/R/site-library
#>  [4] /usr/lib/R/library
#> 
#> ──────────────────────────────────────────────────────────────────────────────

Any ideas what could be the issue here? There should be data available for the request, as was seen when testing with the shorter period.

Thanks as always!

What is the exact error message?

Sorry, I thought you were able to access the logs if provided with the job ID.
The error is the following:

openeo::log_job("vito-8a342504-52be-4064-be0e-4e359abdfe23")

# [ERROR] error processing batch job
# Traceback (most recent call last):
#   File "batch_job.py", line 319, in main
# run_driver()
# File "batch_job.py", line 292, in run_driver
# run_job(
#   File "/data2/hadoop/yarn/local/usercache/hendrik.wagenseil/appcache/application_1654997540016_13264/container_e5040_1654997540016_13264_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/utils.py", line 43, in memory_logging_wrapper
#   return function(*args, **kwargs)
#     File "batch_job.py", line 388, in run_job
#   assets_metadata = result.write_assets(str(output_file))
#   File "/data2/hadoop/yarn/local/usercache/hendrik.wagenseil/appcache/application_1654997540016_13264/container_e5040_1654997540016_13264_01_000001/venv/lib/python3.8/site-packages/openeo_driver/save_result.py", line 110, in write_assets
#   return self.cube.write_assets(filename=directory, format=self.format, format_options=self.options)
#   File "/data2/hadoop/yarn/local/usercache/hendrik.wagenseil/appcache/application_1654997540016_13264/container_e5040_1654997540016_13264_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/geopysparkdatacube.py", line 1547, in write_assets
#   timestamped_paths = self._get_jvm().org.openeo.geotrellis.geotiff.package.saveRDDTemporal(
#     File "/opt/spark3_2_0/python/lib/py4j-0.10.9.2-src.zip/py4j/java_gateway.py", line 1309, in __call__
#     return_value = get_return_value(
#       File "/opt/spark3_2_0/python/lib/py4j-0.10.9.2-src.zip/py4j/protocol.py", line 326, in get_return_value
#       raise Py4JJavaError(
#         py4j.protocol.Py4JJavaError: An error occurred while calling z:org.openeo.geotrellis.geotiff.package.saveRDDTemporal.
#         : org.apache.spark.SparkException: Job aborted due to stage failure: Task 236 in stage 9.0 failed 4 times, most recent failure: Lost task 236.3 in stage 9.0 (TID 447) (epod071.vgt.vito.be executor 70): org.openeo.geotrellissentinelhub.SentinelHubException: Sentinel Hub returned an error
#         response: HTTP/1.1 500 Internal Server Error with body: {"error":{"status":500,"reason":"Internal Server Error","message":"Illegal request to creo://EODATA/Sentinel-3/SLSTR/SL_1_RBT/2020/12/02/S3A_SL_1_RBT____20201202T182300_20201202T182600_20201202T202032_0180_065_355_2160_LN2_O_NR_004.SEN3/S9_BT_in.nc. HTTP Status: '404' On CreoDIAS, you get 'Illegal request 403 error' also when a file is missing. So first make sure that the file is present.","code":"RENDERER_EXCEPTION"}}
#         request: POST https://creodias.sentinel-hub.com/api/v1/process with body: {
#           "input": {
#             "bounds": {
#               "bbox": [-119.471475, 44.375460460394756, -116.93179246039475, 46.915143],
#               "properties": {
#                 "crs": "http://www.opengis.net/def/crs/EPSG/0/4326"
#               }
#             },
#             "data": [
#               {
#                 "type": "sentinel-3-slstr",
#                 "dataFilter": {"timeRange":{"from":"2020-12-02T00:00:00Z","to":"2020-12-03T00:00:00Z"},"maxCloudCoverage":50,"orbitDirection":"DESCENDING"},
#                 "processing": {}
#               }
#             ]
#           },
#           "output": {
#             "width": 256,
#             "height": 256,
#             "responses": [
#               {
#                 "identifier": "default",
#                 "format": {
#                   "type": "image/tiff"
#                 }
#               }
#             ]
#           },
#           "evalscript": "//VERSION=3\nfunction setup() {\n  return {\n    input: [{\n      \"bands\": [\"S9\"]\n    }],\n    output: {\n      bands: 1,\n      sampleType: \"FLOAT32\",\n    }\n  };\n}\n\nfunction evaluatePixel(sample) {\n  return [sample.S9];\n}"
#         }
#         at org.openeo.geotrellissentinelhub.SentinelHubException$.apply(SentinelHubException.scala:19)
#         at org.openeo.geotrellissentinelhub.DefaultProcessApi.$anonfun$getTile$8(ProcessApi.scala:130)
#         at org.openeo.geotrellissentinelhub.DefaultProcessApi.$anonfun$getTile$8$adapted(ProcessApi.scala:120)
#         at scalaj.http.HttpRequest.$anonfun$toResponse$17(Http.scala:422)
#         at scala.Option.getOrElse(Option.scala:189)
#         at scalaj.http.HttpRequest.$anonfun$toResponse$14(Http.scala:414)
#         at scala.Option.getOrElse(Option.scala:189)
#         at scalaj.http.HttpRequest.toResponse(Http.scala:414)
#         at scalaj.http.HttpRequest.doConnection(Http.scala:368)
#         at scalaj.http.HttpRequest.exec(Http.scala:343)
#         at org.openeo.geotrellissentinelhub.DefaultProcessApi.$anonfun$getTile$7(ProcessApi.scala:120)
#         at org.openeo.geotrellissentinelhub.package$$anon$1.get(package.scala:60)
#         at net.jodah.failsafe.Functions.lambda$get$0(Functions.java:46)
#         at net.jodah.failsafe.RetryPolicyExecutor.lambda$supply$0(RetryPolicyExecutor.java:65)
#         at net.jodah.failsafe.Execution.executeSync(Execution.java:128)
#         at net.jodah.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:378)
#         at net.jodah.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:68)
#         at org.openeo.geotrellissentinelhub.package$.withRetries(package.scala:59)
#         at org.openeo.geotrellissentinelhub.DefaultProcessApi.getTile(ProcessApi.scala:119)
#         at org.openeo.geotrellissentinelhub.PyramidFactory.$anonfun$datacube_seq$1(PyramidFactory.scala:193)
#         at org.openeo.geotrellissentinelhub.MemoizedRlGuardAdapterCachedAccessTokenWithAuthApiFallbackAuthorizer.authorized(Authorizer.scala:46)
#         at org.openeo.geotrellissentinelhub.PyramidFactory.authorized(PyramidFactory.scala:56)
#         at org.openeo.geotrellissentinelhub.PyramidFactory.org$openeo$geotrellissentinelhub$PyramidFactory$$getTile$1(PyramidFactory.scala:191)
#         at org.openeo.geotrellissentinelhub.PyramidFactory.org$openeo$geotrellissentinelhub$PyramidFactory$$dataTile$1(PyramidFactory.scala:201)
#         at org.openeo.geotrellissentinelhub.PyramidFactory.loadMasked$1(PyramidFactory.scala:226)
#         at org.openeo.geotrellissentinelhub.PyramidFactory.$anonfun$datacube_seq$17(PyramidFactory.scala:286)
#         at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
#         at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:512)
#         at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
#         at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:511)
#         at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:489)
#         at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
#         at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
#         at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
#         at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.write(UnsafeShuffleWriter.java:179)
#         at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
#         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
#         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
#         at org.apache.spark.scheduler.Task.run(Task.scala:131)
#         at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
#         at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1462)
#         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
#         at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
#         at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
#         at java.base/java.lang.Thread.run(Thread.java:829)
#         
#         Driver stacktrace:
#           at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2403)
#         at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2352)
#         at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2351)
#         at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
#         at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
#         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
#         at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2351)
#         at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1109)
#         at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1109)
#         at scala.Option.foreach(Option.scala:407)
#         at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1109)
#         at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2591)
#         at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2533)
#         at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2522)
#         at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
#         at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:898)
#         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2214)
#         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2235)
#         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2254)
#         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2279)
#         at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
#         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
#         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
#         at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
#         at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
#         at org.openeo.geotrellis.geotiff.package$.saveRDDTemporal(package.scala:136)
#         at org.openeo.geotrellis.geotiff.package.saveRDDTemporal(package.scala)
#         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
#         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
#         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
#         at java.base/java.lang.reflect.Method.invoke(Method.java:566)
#         at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
#         at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
#         at py4j.Gateway.invoke(Gateway.java:282)
#         at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
#         at py4j.commands.CallCommand.execute(CallCommand.java:79)
#         at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
#         at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
#         at java.base/java.lang.Thread.run(Thread.java:829)
#         Caused by: org.openeo.geotrellissentinelhub.SentinelHubException: Sentinel Hub returned an error
#         response: HTTP/1.1 500 Internal Server Error with body: {"error":{"status":500,"reason":"Internal Server Error","message":"Illegal request to creo://EODATA/Sentinel-3/SLSTR/SL_1_RBT/2020/12/02/S3A_SL_1_RBT____20201202T182300_20201202T182600_20201202T202032_0180_065_355_2160_LN2_O_NR_004.SEN3/S9_BT_in.nc. HTTP Status: '404' On CreoDIAS, you get 'Illegal request 403 error' also when a file is missing. So first make sure that the file is present.","code":"RENDERER_EXCEPTION"}}
#         [request body and Scala/Java stack trace identical to the SentinelHubException shown above]
#         ... 1 more

This is probably the relevant part of the error:

> Illegal request to creo://EODATA/Sentinel-3/SLSTR/SL_1_RBT/2020/12/02/S3A_SL_1_RBT____20201202T182300_20201202T182600_20201202T202032_0180_065_355_2160_LN2_O_NR_004.SEN3/S9_BT_in.nc. HTTP Status: '404'

@daniel.thiex any quick insights?

@jeroen.dries suggested enabling the "soft errors" feature, so that a single failing file doesn't ruin the whole job.
In Python it can be enabled for a batch job with

data_cube.execute_batch(job_options={'soft-errors': 'true'})

(e.g. see Job option for soft errors in data loading Β· Issue #34 Β· Open-EO/openeo-geotrellis-extensions Β· GitHub)
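For completeness, here is a hedged end-to-end sketch of that workaround in Python (untested; it assumes the standard openeo Python client and that the back-end accepts the non-standard `soft-errors` job option, as in the issue linked above):

```python
import openeo

# Connect and authenticate (interactive OIDC login).
con = openeo.connect("https://openeo.cloud")
con.authenticate_oidc()

cube = con.load_collection(
    "SENTINEL3_SLSTR",
    spatial_extent={"west": -119.4715, "south": 46.85667,
                    "east": -119.3554, "north": 46.91514},
    temporal_extent=["2018-05-01", "2022-05-31"],
    bands=["S9"],
)

# Non-standard, back-end-specific option: tolerate individual failing
# files instead of aborting the whole batch job.
job = cube.execute_batch(job_options={"soft-errors": "true"})
job.get_results().download_files("slstr_s9/")
```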

But I don't know how that can be done with the R client.
Since job_options is not standardized yet, it might not be possible at all (yet) with the R client.

cc @florian.lahn @m.mohr

There seems to be a corrupt file on Creodias. We are looking into removing it, but in the meantime enabling the "soft errors" feature as suggested should solve the problem.

As far as I can see, the problematic file is from 2020-12-02, so excluding this day from the requested time span might also be a solution.
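If soft errors cannot be enabled from R, that date-exclusion idea could be sketched as follows (untested; it assumes that openEO temporal extents are half-open, i.e. the end date is excluded, and it omits the property filters from the original reprex for brevity):

```r
library(openeo)

con = connect("https://openeo.cloud")
login()
procs = processes()

ext = list(west = -119.4715, south = 46.85667, east = -119.3554, north = 46.91514)

# First part: up to (and excluding) the corrupted day 2020-12-02.
cube_a = procs$load_collection(
  id = "SENTINEL3_SLSTR",
  spatial_extent = ext,
  temporal_extent = c("2018-05-01", "2020-12-02"),
  bands = c("S9")
)

# Second part: from the day after onwards.
cube_b = procs$load_collection(
  id = "SENTINEL3_SLSTR",
  spatial_extent = ext,
  temporal_extent = c("2020-12-03", "2022-05-31"),
  bands = c("S9")
)

# merge_cubes joins the two non-overlapping cubes along the time dimension.
cube_full = procs$merge_cubes(cube1 = cube_a, cube2 = cube_b)
```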


@datascience We removed the corrupted files from Creodias, so you shouldn't run into that error anymore.

Thank you!

I'm afraid the R client and the Web Editor do not support adding additional parameters to the body of the request that is sent to create jobs or process data synchronously. Thus, job_options can only be used in Python and JavaScript.

Related issue in the API: Batch job options/configuration settings Β· Issue #276 Β· Open-EO/openeo-api Β· GitHub

I created new issues in the clients to support adding additional parameters: