When loading a collection, for example SENTINEL2_L1C, I would like to have more insight than the Connection.describe_collection method gives me. For example, I would like to learn what parameters were submitted to the SentinelHub source, to find out whether negative reflectance values are clamped. Or, in another case, what kind of interpolation is being used.
In this case I am only getting dates back (no time - hours, minutes, seconds).
Where can I find more about how the collections are composed?
Hi Jaap,
at this level of detail, the only generic answer I can give is: "it's in the open source code". More specifically, for the Sentinelhub case, for instance:
Of course, I don't expect most users to actually do that, so we'll have to answer all your questions separately.
In a case like clamping of negative reflectance values, we would have to run a test ourselves to see what happens.
About the timestamps, could that be the same question as this one:
Thanks Jeroen, useful to know that we are working with day-composites.
Looking at this line, it seems that the processing options are supplied as an argument.
Probably a bit of a stretch to influence this process directly. It is best to figure out a new way of filtering out poor images (nightly images + cloudy images) that gives similar results to how we process in Earth Engine at the moment. I know that we can at least filter out clouds using:
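Something like this rough sketch with the openEO Python client (the back-end URL, the extents and the 20% threshold are just placeholders):

```python
import openeo

# Connect and authenticate (the back-end URL is just an example)
connection = openeo.connect("https://openeo.cloud").authenticate_oidc()

# Load SENTINEL2_L1C, keeping only products whose metadata reports
# at most 20% cloud cover
cube = connection.load_collection(
    "SENTINEL2_L1C",
    spatial_extent={"west": 5.0, "south": 51.0, "east": 5.1, "north": 51.1},
    temporal_extent=["2023-06-01", "2023-07-01"],
    bands=["B03"],
    properties={"eo:cloud_cover": lambda cc: cc <= 20},
)
```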
Hi Jaap,
nightly images are a bit new to me; is this something that can happen, for instance, at higher latitudes in winter?
If you have an example of how you filter them out in GEE, we might indeed be able to do an openEO equivalent. Or is this done by some kind of filter on the hour of the day?
But perhaps the OpenEO processing already filters out these images.
Another thing that I am looking to filter out is images like the following, with missing or very high reflectance data (in this case data is missing and there is a constant, highly reflective result; I am plotting the green band):
We do have solar angles available as a band, so some filtering based on that value might just work.
I'm only guessing that it can also occur that multiple products are captured on the same day, so the filtering needs to go before the compositing? That would be something new that I have to figure out. Or, if we can have a property at the product level, it could be solved in catalog filtering, which is easier. @daniel.thiex Is this something that might work for sentinelhub?
For the high reflectance values, I would say a filter on the pixel level might be easiest?
It seems that the particular product you are looking at with high reflectance is simply clouds, and the product should be filtered out if you set a cloud cover threshold of 90% or lower.
It's not that data is missing; you just happen to be at the edge of the orbit. The right side of the image is within the orbit, and the satellite acquired data there on that day. The left side is outside of the orbit, and on the day you are requesting data that area was not covered by the satellite.
There is currently no such property on the product level that could be used for filtering before loading the data.
For Sentinel 2 this is not the case, so there should be only one image per location per day. For other collections or longer timespans this might happen, however, so it is something to keep in mind.
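If you do run into that, one way to handle it is to composite to one observation per day before further processing. A sketch with the Python client, continuing from a cube loaded as in the snippet above (assuming the back-end supports aggregate_temporal_period):

```python
# Sketch: reduce to one value per pixel per day (here a mean composite)
# before doing any further filtering or analysis.
daily_cube = cube.aggregate_temporal_period(period="day", reducer="mean")
```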
As Jeroen mentioned, there is the sunZenithAngles band available, which could be used on the pixel level. If you are on that level already, the SCL band, which is a scene classification band, might also be useful.
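A rough sketch of what that pixel-level filtering could look like with the Python client, reusing the connection from the earlier snippet; the collection id, the thresholds and the SCL class values are assumptions on my side, so check them against the collection you actually use:

```python
# Sketch: load the green band together with the angle and classification bands
cube = connection.load_collection(
    "SENTINEL2_L2A",  # SCL is an L2A product; adjust to the collection you use
    spatial_extent={"west": 5.0, "south": 51.0, "east": 5.1, "north": 51.1},
    temporal_extent=["2023-06-01", "2023-07-01"],
    bands=["B03", "SCL", "sunZenithAngles"],
)

# Mask pixels acquired with a very low sun (high sun zenith angle)
sza_mask = cube.band("sunZenithAngles") > 80  # threshold is just an example

# Mask cloud-related SCL classes: 3 = cloud shadow, 8/9 = clouds, 10 = thin cirrus
scl = cube.band("SCL")
cloud_mask = (scl == 3) | (scl == 8) | (scl == 9) | (scl == 10)

green = cube.band("B03").mask(sza_mask | cloud_mask)
```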