Dear Terrascope forum,
I have a few questions regarding the processing of LAI, NDVI, and FAPAR timeseries. I successfully generated timeseries for my own polygon using the notebooks.
However, I was wondering how clouds and shadows are handled in these timeseries. Are pixels with clouds or shadows automatically excluded from the calculations, or do they count as invalid pixels (for which you can set a maximum percentage)? Additionally, does the timeseries for a polygon average over all pixels for which data is available, or how does this work?
Furthermore, I would like to know if it is possible to generate a 3D dataset for my polygon, where the NDVI values over time are displayed per pixel. Is this something that can be implemented in the notebooks?
Thank you in advance for your help!
Best regards
Emma
Comments
Hi Emma, although we apply a cloud mask as provided by the scene classification, some pixels may slip through. In general terms: whenever you see a pixel in your polygon in the Terrascope viewer, it will be used for the calculation of the average over the polygon for that date. If you want something more elaborate, I'm sure it can be done using openEO.
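To make the "average over the valid pixels" behaviour concrete, here is a minimal numpy sketch with made-up values (the real processing runs server-side; the SCL class codes 3 = cloud shadow, 8/9 = cloud medium/high probability, 4 = vegetation are from the Sentinel-2 scene classification):

```python
import numpy as np

# Hypothetical 3x3 patch inside a polygon: red/NIR reflectances and SCL classes
red = np.array([[0.05, 0.06, 0.30],
                [0.04, 0.05, 0.28],
                [0.06, 0.05, 0.27]])
nir = np.array([[0.40, 0.42, 0.32],
                [0.45, 0.41, 0.30],
                [0.44, 0.43, 0.31]])
scl = np.array([[4, 4, 8],    # 8/9: cloud, 3: cloud shadow, 4: vegetation
                [4, 4, 9],
                [4, 4, 3]])

ndvi = (nir - red) / (nir + red)
valid = np.isin(scl, [4, 5, 6])    # keep vegetation, bare soil, water
polygon_mean = ndvi[valid].mean()  # average over valid pixels only
```

Cloudy and shadowed pixels simply drop out of the mean for that date; if a pixel slips through the scene classification, it is still counted as valid.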
As for the stack of NDVI maps over time: yes it can be implemented. It would be a better version than the timelapse feature in the viewer. I assume you can do this yourself. If not, reach out and we'll see what we can do for you. Good luck!
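As a sketch of what such a per-pixel stack looks like once downloaded, here is a numpy illustration with synthetic data (in practice you would read the batch-job netCDF, e.g. with xarray's `open_dataset`; the array shape and values here are assumptions):

```python
import numpy as np

# Synthetic NDVI stack with axes (time, y, x): 12 dates over a 5x5 pixel polygon
rng = np.random.default_rng(42)
ndvi_stack = rng.uniform(0.2, 0.9, size=(12, 5, 5))

pixel_series = ndvi_stack[:, 2, 1]            # NDVI over time for pixel (y=2, x=1)
per_date_mean = ndvi_stack.mean(axis=(1, 2))  # polygon-style average per date
```

Slicing along the first axis gives a map per date; slicing along the last two gives the timeseries per pixel.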
Thank you for the quick response. I have now tried to solve my question via the notebooks and a connection to eoplaza. However, I can't get my batch job to process. Could someone tell me what might be going wrong? When I run my code, the batch job does appear in the openEO web editor, but its progress stays at 0%. My custom code from notebook 08 is attached below. Thanks in advance!
Adjusted notebook 08:
import os
import openeo
import rasterio
import matplotlib.pyplot as plt

dir_downloads = "./tmp/08_openEO"
os.makedirs(dir_downloads, exist_ok=True)

eoconn = openeo.connect("openeo.vito.be").authenticate_oidc()

sorted(eoconn.list_collection_ids())
sentinel2_cube = eoconn.load_collection(
    "SENTINEL2_L2A",
    spatial_extent={"west": 4.634111, "south": 50.678972, "east": 4.640083, "north": 50.682472},
    temporal_extent=["2016-01-01", "2024-01-01"],  # right-open interval, so from 2016-01-01 to 2023-12-31
    bands=["B04", "B08", "SCL"],
)

# Select the "SCL" band from the data cube
scl_band = sentinel2_cube.band("SCL")
# Build mask to mask out everything but class 4 (vegetation)
mask = (scl_band != 4)

s2_masked = sentinel2_cube.mask(mask).filter_bands(["B04", "B08"])

red = s2_masked.band("B04") * 1e-4
nir = s2_masked.band("B08") * 1e-4

ndvi_cube = (nir - red) / (nir + red)

job_title = "NDVI_timeseriesTest"
p_ndvi = os.path.join(dir_downloads, "{}.nc".format(job_title))

ndvi_job = ndvi_cube.create_job(title=job_title, out_format="netCDF")
ndvi_job.start_and_wait()
ndvi_job.download_result(p_ndvi)
The 'start_and_wait' call should indeed have started your job.
In the web editor, you should be able to retrieve the job id ('j-xxx') of your job; with that id, we can inspect what went wrong.
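You can also inspect the job from the notebook side. A minimal sketch, assuming an authenticated connection as in your script ('j-xxx' is a placeholder for the actual job id):

```python
import openeo

eoconn = openeo.connect("openeo.vito.be").authenticate_oidc()

job = eoconn.job("j-xxx")  # replace with the job id shown in the web editor
print(job.status())        # e.g. "queued", "running", "error", "finished"
for entry in job.logs():   # server-side log entries, useful for debugging
    print(entry.level, entry.message)
```

A job stuck at "queued"/0% often just means the backend queue is busy, while "error" entries in the logs point at a problem in the process graph itself.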