OpenEoApiError: [500] and OpenEoApiError('[401]

When I try to apply the atmospheric corrections and download data, I get this error:
OpenEoApiError: [500] Internal: Failed to process synchronously on backend eodc: OpenEoApiError('[401] 401: User my_email@gmail.com does not exist and is not whitelisted.')
Do you know why I am getting this error, @stefaan.lippens?

That is an error coming from the EODC backend, so I’ll cc @sean.hoyal

Have you already completed your full registration for Free Trial or Early Adopter?

Yes, I have completed my registration and it was approved a week ago.

Hi sulova,

Completing the registration should give you access to the EODC backend. I will take a look into this and respond once I’ve located the issue.

Hi @sulova.andrea ,
please try your code again. It should work now.
Please quickly confirm here whether it does.
Thanks!

Hey both @benjamin.schumacher @sean.hoyal
Now I’m getting a slightly different error:

`OpenEoApiError: [500] Internal: Failed to process synchronously on backend eodc: OpenEoApiError('[500] 500: syntax error: line 1, column 49')`

Looking at the logs, it seems the job is too big to process synchronously on our side, which is still an area for us to improve. Using the following notebook as a guide, you should be able to send your request as a batch job, so that we run the ard_surface_reflectance asynchronously.
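Roughly, a minimal batch-job sketch with the openEO Python client (assuming a recent client version; the URL, collection, extents and ARD parameter values below are placeholders, adapt them to the request you already have):

import openeo

# Connect and authenticate (placeholder URL; use the endpoint you already connect to)
connection = openeo.connect("https://openeo.cloud").authenticate_oidc()

# Build the cube as before; collection, extents and ARD parameters are placeholders
cube = connection.load_collection(
    "SENTINEL2_L1C_SENTINELHUB",
    spatial_extent={"west": 11.0, "south": 46.0, "east": 11.1, "north": 46.1},
    temporal_extent=["2021-06-01", "2021-06-30"],
)
cube = cube.ard_surface_reflectance(
    atmospheric_correction_method="FORCE",  # placeholder; use a method your backend supports
    cloud_detection_method="Fmask",         # placeholder
)

# Submit as a batch job instead of a synchronous download
job = cube.create_job(title="ard_surface_reflectance test")
job.start_and_wait()                         # poll until the job finishes (raises on failure)
job.get_results().download_files("output/")  # download the result files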

Let us know if this doesn’t help!

Thanks for your feedback.

When I tried to run “send_job”:

job = sentinel2_rgb.send_job()
if job:
    print(job.job_id)
    print(job.run_synchronous("S2_RGB.tiff"))
else:
    print("Job ID is None")

then I got this error:

vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f
0:00:00 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': send 'start'
0:01:16 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': queued (progress N/A)
0:01:23 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': queued (progress N/A)
0:01:31 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': queued (progress N/A)
0:01:41 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': queued (progress N/A)
0:01:52 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': queued (progress N/A)
0:02:05 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': queued (progress N/A)
0:02:23 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': queued (progress N/A)
0:02:43 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': queued (progress N/A)
0:03:08 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': queued (progress N/A)
0:03:39 Job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f': error (progress N/A)

Your batch job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f' failed.
Logs can be inspected in an openEO (web) editor or with `connection.job('vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f').logs()`.

Printing logs:
[{'id': '0', 'level': 'error', 'message': 'error processing batch job\nTraceback (most recent call last):\n  File "batch_job.py", line 305, in main\n    run_driver()\n  File "batch_job.py", line 279, in run_driver\n    run_job(\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/utils.py", line 40, in memory_logging_wrapper\n    return function(*args, **kwargs)\n  File "batch_job.py", line 332, in run_job\n    result = ProcessGraphDeserializer.evaluate(process_graph, env=env, do_dry_run=tracer)\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 262, in evaluate\n    return convert_node(result_node, env=env)\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 268, in convert_node\n    return apply_process(\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 1113, in apply_process\n    args = {name: convert_node(expr, env=env) for (name, expr) in sorted(args.items())}\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 1113, in <dictcomp>\n    args = {name: convert_node(expr, env=env) for (name, expr) in sorted(args.items())}\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 273, in convert_node\n    return convert_node(processGraph[\'node\'], env=env)\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 268, in convert_node\n    return apply_process(\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 1225, in apply_process\n    return process_function(args=args, env=env)\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 433, in load_collection\n    return env.backend_implementation.catalog.load_collection(collection_id, load_params=load_params, env=env)\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeo/util.py", line 363, in wrapper\n    return f(*args, **kwargs)\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/layercatalog.py", line 446, in load_collection\n    pyramid = file_s2_pyramid()\n  
File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/layercatalog.py", line 227, in file_s2_pyramid\n    return file_pyramid(lambda opensearch_endpoint, opensearch_collection_id, opensearch_link_titles, root_path:\n  File "/data4/hadoop/yarn/local/usercache/openeo/appcache/application_1643116788003_6082/container_e5013_1643116788003_6082_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/layercatalog.py", line 272, in file_pyramid\n    return factory.datacube_seq(projected_polygons_native_crs, from_date, to_date, metadata_properties(),\n  File "/opt/spark3_2_0/python/lib/py4j-0.10.9.2-src.zip/py4j/java_gateway.py", line 1309, in __call__\n    return_value = get_return_value(\n  File "/opt/spark3_2_0/python/lib/py4j-0.10.9.2-src.zip/py4j/protocol.py", line 326, in get_return_value\n    raise Py4JJavaError(\npy4j.protocol.Py4JJavaError: An error occurred while calling o818.datacube_seq.\n: java.lang.IllegalArgumentException: Could not find data for your load_collection request with catalog ID urn:eop:VITO:TERRASCOPE_S2_TOC_V2. The catalog query had id f4d29e48-3ed1-48b2-8ced-5554f0813fdf and returned 0 results.\n\tat org.openeo.geotrellis.layers.FileLayerProvider.loadRasterSourceRDD(FileLayerProvider.scala:602)\n\tat org.openeo.geotrellis.layers.FileLayerProvider.readMultibandTileLayer(FileLayerProvider.scala:476)\n\tat org.openeo.geotrellis.file.Sentinel2PyramidFactory.datacube(Sentinel2PyramidFactory.scala:150)\n\tat org.openeo.geotrellis.file.Sentinel2PyramidFactory.datacube_seq(Sentinel2PyramidFactory.scala:129)\n\tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\n\tat java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.base/java.lang.reflect.Method.invoke(Method.java:566)\n\tat py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)\n\tat py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)\n\tat py4j.Gateway.invoke(Gateway.java:282)\n\tat py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)\n\tat py4j.commands.CallCommand.execute(CallCommand.java:79)\n\tat py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)\n\tat py4j.ClientServerConnection.run(ClientServerConnection.java:106)\n\tat java.base/java.lang.Thread.run(Thread.java:834)\n\n'}]
---------------------------------------------------------------------------
JobFailedException                        Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_13764\3765234094.py in <module>
      2 if job:
      3     print(job.job_id)
----> 4     print(job.run_synchronous("S2_RGB_unmasked.tiff"))
      5 else:
      6     print("Job ID is None")

~\Anaconda3\envs\openeo\lib\site-packages\openeo\rest\job.py in run_synchronous(self, outputfile, print, max_poll_interval, connection_retry_interval)
    132         """Start the job, wait for it to finish and download result"""
    133         self.start_and_wait(
--> 134             print=print, max_poll_interval=max_poll_interval, connection_retry_interval=connection_retry_interval
    135         )
    136         # TODO #135 support multi file result sets too?

~\Anaconda3\envs\openeo\lib\site-packages\openeo\rest\job.py in start_and_wait(self, print, max_poll_interval, connection_retry_interval, soft_error_max)
    216             raise JobFailedException("Batch job {i!r} didn't finish successfully. Status: {s} (after {t}).".format(
    217                 i=self.job_id, s=status, t=elapsed()
--> 218             ), job=self)
    219 
    220         return self

JobFailedException: Batch job 'vito-1edd0b78-8809-4bc1-bd08-40d1eef69c4f' didn't finish successfully. Status: error (after 0:03:39).

Hi!

This now seems to be something on the VITO backend, so I’ll cc @stefaan.lippens to make them aware!

Hi Andrea,

this is the crucial part of that log output:

`java.lang.IllegalArgumentException: Could not find data for your load_collection request with catalog ID urn:eop:VITO:TERRASCOPE_S2_TOC_V2. The catalog query had id f4d29e48-3ed1-48b2-8ced-5554f0813fdf and returned 0 results.`

What area/dates are you requesting? Did you check that data is indeed available on the Terrascope layer? You could try the same query with SENTINEL2_L2A_SENTINELHUB to see if that one does have the data. The Terrascope layer is faster, but more limited in coverage.
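For example, a quick sketch of that check (assuming `connection` is your authenticated connection; the extents, dates and bands are placeholders, use the same ones as in your failing request):

# Same query, but against the Sentinel Hub based L2A collection instead of the Terrascope layer
cube = connection.load_collection(
    "SENTINEL2_L2A_SENTINELHUB",
    spatial_extent={"west": 11.0, "south": 46.0, "east": 11.1, "north": 46.1},  # placeholder extent
    temporal_extent=["2021-06-01", "2021-06-30"],                               # placeholder dates
    bands=["B04", "B03", "B02"],                                                # placeholder bands
)
cube.download("S2_RGB_l2a.tiff")  # a small request like this can still be downloaded synchronously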

We’re also working on providing a unified SENTINEL2_L2A layer where this decision will be made automatically.
