CORINE_LAND_COVER legend

I am using the CORINE_LAND_COVER collection, and I would have expected to find the same nomenclature as the official Copernicus product (the "Corine Land Cover classes" page), as there is no information on the openEO Platform collection page.
However, I find a different set of values than expected, e.g. rivers have a value of 40 instead of 511.
Could you please clarify why this is the case, and how to convert the values of the legend?

Thanks in advance
Luca

Hi Luca,
you can find the correct mapping on this page:
https://collections.sentinel-hub.com/corine-land-cover/readme.html
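As the readme explains, the raster stores a sequential class index rather than the official three-digit CLC level-3 codes. A minimal Python sketch of converting between the two, assuming the standard ordering of the 44 level-3 classes (only a few illustrative entries are shown here; the complete table is on the page above):

```python
# Partial mapping from the sequential raster values used by this
# CORINE collection to the official CLC level-3 codes.
# NOTE: illustrative subset only, assuming the standard class ordering;
# consult the linked readme for the full 44-class table.
RASTER_TO_CLC = {
    12: 211,  # Non-irrigated arable land
    13: 212,  # Permanently irrigated land
    40: 511,  # Water courses (rivers)
}

def to_official_code(raster_value: int) -> int:
    """Translate a raster pixel value to the official CLC code."""
    return RASTER_TO_CLC[raster_value]

print(to_official_code(40))  # -> 511
```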

It's a Sentinel Hub collection. @daniel.thiex, any further info on this difference? Is there perhaps a STAC-compliant way to document the mapping?

exactly what I needed, thanks a lot!

hi @jeroen.dries, I have another question related to the same collection.

Basically, my goal is to select just some specific land cover classes and mask the CGLS_SSM_V1_GLOBAL collection based on these. So here is what I do:

s1ssm = connection.load_collection("CGLS_SSM_V1_GLOBAL",
    spatial_extent={"west": 16.06, "south": 48.10, "east": 16.65, "north": 48.31, "crs": "EPSG:4326"},
    temporal_extent=["2019-05-01", "2019-06-01"]).band("ssm")

clc = connection.load_collection("CORINE_LAND_COVER",
    spatial_extent={"west": 16.06, "south": 48.10, "east": 16.65, "north": 48.31, "crs": "EPSG:4326"},
    temporal_extent=["2017-10-01", "2018-10-01"]).band("CLC")
clc_mask = ((clc == 12) | (clc == 13))
clc_mask.resample_cube_spatial(s1ssm)
s1ssm_masked = s1ssm.mask(clc_mask)

However, I get an error saying (as far as I can understand) that the two datacubes have different CRSs, which makes sense: the downloaded clc_mask file is in EPSG:3035 while s1ssm is in EPSG:4326.

So I am looking for a function that reprojects clc_mask to EPSG:4326, but I couldn't find anything in the documentation. How should I address this?

Here is the full error message:
[{'id': 'error', 'level': 'error', 'message': 'error processing batch job
Traceback (most recent call last):
  File "batch_job.py", line 319, in main
    run_driver()
  File "batch_job.py", line 292, in run_driver
    run_job(
  File "/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_44517/container_e5020_1650104936572_44517_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/utils.py", line 41, in memory_logging_wrapper
    return function(*args, **kwargs)
  File "batch_job.py", line 351, in run_job
    result = ProcessGraphDeserializer.evaluate(process_graph, env=env, do_dry_run=tracer)
  File "/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_44517/container_e5020_1650104936572_44517_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 308, in evaluate
    return convert_node(result_node, env=env)
  File "/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_44517/container_e5020_1650104936572_44517_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 314, in convert_node
    return apply_process(
  File "/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_44517/container_e5020_1650104936572_44517_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 1336, in apply_process
    return process_function(args=args, env=env)
  File "/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_44517/container_e5020_1650104936572_44517_01_000001/venv/lib/python3.8/site-packages/openeo_driver/ProcessGraphDeserializer.py", line 988, in mask
    return cube.mask(mask=mask, replacement=replacement)
  File "/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_44517/container_e5020_1650104936572_44517_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/geopysparkdatacube.py", line 909, in mask
    mask_pyramid_levels = {
  File "/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_44517/container_e5020_1650104936572_44517_01_000001/venv/lib/python3.8/site-packages/openeogeotrellis/geopysparkdatacube.py", line 910, in <dictcomp>
    k: l.tile_to_layout(layout=self.pyramid.levels[k])
  File "/data2/hadoop/yarn/local/usercache/openeo/appcache/application_1650104936572_44517/container_e5020_1650104936572_44517_01_000001/venv/lib/python3.8/site-packages/geopyspark/geotrellis/layer.py", line 1819, in tile_to_layout
    raise ValueError("The layout needs to have the same crs as the TiledRasterLayer")
ValueError: The layout needs to have the same crs as the TiledRasterLayer
'}]

Hi Luca,
your attempt was almost correct, but most openEO client functions don't modify the cube in place; they return a new object instead. Assigning the result of resample_cube_spatial back to clc_mask should help:

clc_mask = clc_mask.resample_cube_spatial(s1ssm)
s1ssm_masked = s1ssm.mask(clc_mask)

dear @jeroen.dries,
thanks, that indeed solved the problem.

Nevertheless, the mask based on Corine LC does not really work. I would expect all CGLS_SSM_V1_GLOBAL pixels that do not belong to LC classes 12 or 13 to be masked, but this does not happen: what I get instead is the unmasked CGLS_SSM_V1_GLOBAL.

any idea why and how to address it?

Hi Luca,
the problem is that some datacubes, like land cover, still have a temporal dimension even if only one date is available.
You want to disregard that and apply the same mask to each timestamp.
Please try this:
s1ssm_masked = s1ssm.mask(clc_mask.max_time())

That at least seems to mask out some values if I try it myself.
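One caveat worth noting: in the openEO mask process, pixels are replaced (set to nodata) wherever the mask is true/non-zero. So a mask built as (clc == 12) | (clc == 13) would remove classes 12 and 13; if the goal is to keep only those classes, the mask may need to be inverted. A toy pure-Python sketch of that semantics, using made-up values rather than the real cubes:

```python
# Toy grids standing in for one timestep of each datacube:
# soil-moisture values and the corresponding CLC class values.
ssm = [[10.0, 20.0], [30.0, 40.0]]
clc = [[12, 5], [13, 40]]
NODATA = None

# openEO-style masking replaces pixels where the mask is true, so to
# KEEP classes 12 and 13 we mask everything that is NOT one of them.
keep = {12, 13}
masked = [[v if c in keep else NODATA for v, c in zip(vr, cr)]
          for vr, cr in zip(ssm, clc)]
print(masked)  # -> [[10.0, None], [30.0, None]]
```

In openEO terms, that inversion would be something like masking with `~((clc == 12) | (clc == 13))` before applying it to the soil-moisture cube.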

best regards,
Jeroen

that did the trick, thanks a lot!