Hey there,
I'm trying to get SENTINEL2-L2A data for the whole of Great Britain for the year 2019, as cloud free as possible. My plan is to partition Great Britain into smaller chunks and then reduce the temporal dimension of the result of load_collection to the single time point with the lowest cloud_cover. But sadly I don't know how to do that.
So the code for one chunk would look something like this:
library(openeo)
connection = connect(host = "https://openeo.dataspace.copernicus.eu")
login()
p = processes()
datacube = p$load_collection(
  id = "SENTINEL2_L2A",
  bands = c(paste0("B0", 1:9), "B8A", "B11", "B12"),
  spatial_extent = list(west = -3.570557, south = 52.435921, east = -2.268677, north = 53.130294), # just a smaller chunk
  temporal_extent = c("2019-03-01", "2019-04-01"), # I would prefer that every chunk comes from roughly the same time period of 2019, to avoid temporal artefacts in the combined map of GB
  properties = list("eo:cloud_cover" = function(x) x <= 10)) # if this cloud cover value can't be reached, I would enlarge the time period or shrink the chunk size
min_cloud_cover_reducer = function(data, context) {
  ??? # This is where I'm stuck
}
reduced = p$reduce_dimension(data = datacube, reducer = min_cloud_cover_reducer, dimension = "t")
formats = list_file_formats()
result = p$save_result(data = reduced, format = formats$output$netCDF)
job = create_job(graph = result, title = "test")
start_job(job = job)
download_results(job = job, folder = "data/tmp/")
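As a fallback I can sketch what I'd do if the single-scene selection turns out to be impossible: a plain per-pixel median over the already cloud-filtered scenes, using the standard openEO median process. This is only a sketch, and it is not the "pick the scene with the lowest cloud_cover" reduction I'm actually after:

```r
# Fallback sketch: per-pixel median over time instead of selecting one scene.
# Uses the standard openEO "median" process via the same p = processes() object.
median_reducer = function(data, context) {
  p$median(data = data)
}
reduced_median = p$reduce_dimension(data = datacube, reducer = median_reducer, dimension = "t")
```

The downside is that a median mixes pixels from different acquisition dates, which is exactly the kind of temporal artefact I'd like to avoid, so I'd still prefer the lowest-cloud_cover scene if that's expressible.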
I would be glad if anyone could help me.
Moreover, if someone has a smarter idea for getting these satellite images for Great Britain, I would absolutely be open to that!
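For context, this is roughly how I'd generate the chunks on the client side. The bounding box and grid size below are made up for illustration; any partition of Great Britain's extent would do:

```r
# Sketch: split an (illustrative) bounding box around Great Britain
# into an n x n grid of chunks in base R.
gb = list(west = -8.65, south = 49.85, east = 1.80, north = 60.90)
n = 8
lons = seq(gb$west, gb$east, length.out = n + 1)
lats = seq(gb$south, gb$north, length.out = n + 1)
chunks = list()
for (i in seq_len(n)) {
  for (j in seq_len(n)) {
    chunks[[length(chunks) + 1]] = list(
      west = lons[i], east = lons[i + 1],
      south = lats[j], north = lats[j + 1])
  }
}
# Each element of `chunks` would then be passed as spatial_extent
# to load_collection, one batch job per chunk.
```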
Thanks in advance,
Daniel