Error in index calculation with process ndvi

Hi,

when trying to calculate multiple indices with the predefined ndvi process, we receive an error message concerning band referencing. This is a new error: the code posted below worked when we first tried it while following the instructions in the openEO cookbook.

# collection, band subset, time range and bounding box
collections = "SENTINEL2_L2A_SENTINELHUB"
bands = c("B02", "B03", "B04", "B08", "B11", "B12")
period = c("2021-04-01", "2021-06-30")
ext = list(11.14079, 49.75891, 11.1432, 49.76039)
names(ext) = c("west", "south", "east", "north")

# change to login with your credentials
# con = openeo::connect("https://openeo.cloud")
# login(
#     login_type = "oidc"
#     , provider = "egi"
#     , config = list(
#         client_id = "<client_id>"
#         , secret = "<secret>"
#     )
# )

# process graph builder
procs = openeo::processes()

# load the collection with the selected bands, extent and period
cube = procs$load_collection(
    id = collections
    , spatial_extent = ext
    , temporal_extent = period
    , bands = bands
)

# NDVI, appended to the cube as an additional band
cube_ndvi = procs$ndvi(
    data = cube
    , red = "B04"
    , nir = "B08"
    , target_band = "NDVI"
)

# NDWI, reusing the ndvi process with other band references
cube_ndwi = procs$ndvi(
    data = cube_ndvi
    , red = "B12"
    , nir = "B08"
    , target_band = "NDWI"
)

# NSMI, same approach
cube_ind = procs$ndvi(
    data = cube_ndwi
    , red = "B12"
    , nir = "B11"
    , target_band = "NSMI"
)


## create and start job
job = openeo::create_job(cube_ind)
openeo::start_job(job)

# look up the job and its log messages
id = as.character(job$id)
jobs = openeo::list_jobs()
jobs[[id]]
openeo::log_job(id)
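
For context, each of the three ndvi calls above appends one new band to the cube, computed as the normalized difference of the two referenced bands; the same process is therefore reused for NDWI and NSMI simply by pointing red/nir at other bands. As a plain-R sketch of the per-pixel formula (illustrative only, not the client API):

# per-pixel formula applied by the ndvi process; the band names are just labels
normalized_difference = function(nir, red) {
    (nir - red) / (nir + red)
}

# e.g. toy reflectance values standing in for B08 (nir) and B04 (red)
normalized_difference(nir = 0.45, red = 0.12)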

Maybe something has changed on the back-end side regarding the proper way to reference the specific bands? In case you can access the job logs, this is the ID of the job showing the error message: “vito-6d5864dd-f1be-41e5-b6e9-b774ab27d50a”

Thanks as always!

This is indeed a new error that I created.

The number of bands in the metadata 6/6 does not match the actual band count in the cubes (left/right): 6/1. You can fix this by explicitly specifying correct band labels.
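
For reference, the band labels that the metadata side of that comparison declares can also be inspected from the R client, roughly like this (sketch; it assumes an active connection, and the exact structure of the returned metadata can differ per back-end):

# list the band labels declared in the collection metadata
coll = openeo::describe_collection("SENTINEL2_L2A_SENTINELHUB")
coll$`cube:dimensions`$bands$values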

We are now more strict about band labels, but in this case, I’m wondering if we should have solved it on our side.
@florian.lahn does the R client do a ‘merge_cubes’ here in the background somewhere?

I was able to dig a bit deeper; the error is entirely on the back-end side. There is indeed a merge_cubes going on that I wasn’t aware of.

I made one commit already, but I may need to look a bit further into how to make this work. It can even be made more efficient.

No, the R client doesn’t do implicit things; you always need to call merge_cubes explicitly.
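
For reference, an explicit merge in the R client looks roughly like this (a sketch reusing the objects from the first post; two cubes with disjoint band subsets, so no overlap_resolver is needed):

# illustrative only: explicitly merge two cubes along the band dimension
cube_a = procs$load_collection(id = collections, spatial_extent = ext,
                               temporal_extent = period, bands = c("B04", "B08"))
cube_b = procs$load_collection(id = collections, spatial_extent = ext,
                               temporal_extent = period, bands = c("B11", "B12"))
cube_merged = procs$merge_cubes(cube1 = cube_a, cube2 = cube_b)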

Hi @datascience,
we deployed a fix on openeo-dev.vito.be; could you try running against that instance?
We’ll deploy it to production if it works.
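
Switching the connection URL in your script should be the only change needed, roughly like this (reuse your own login configuration; the available OIDC providers may differ slightly from openeo.cloud):

# point the client at the dev instance for testing
con = openeo::connect("https://openeo-dev.vito.be")
openeo::login(
    login_type = "oidc"
    , provider = "egi"
    , config = list(client_id = "<client_id>", secret = "<secret>")
)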

I fixed the original issue, and we now avoid the merge_cubes in the background altogether, which should also make your job more efficient. We also added a unit test to prevent future regressions in this process.

Hi,

we tried it as you suggested and everything works perfectly. Thank you.