Error downloading S2 images

Hi, 

Since March, I've been trying to download mosaics of the following images, but I keep getting errors.

s2-weekly_2026-02-23_2026-03-01_B02-B03-B04-B08-B11-B12_mean.tif

s2-weekly_2026-03-02_2026-03-08_B02-B03-B04-B08-B11-B12_mean.tif

s2-weekly_2026-03-09_2026-03-15_B02-B03-B04-B08-B11-B12_mean.tif

s2-weekly_2026-03-16_2026-03-22_B02-B03-B04-B08-B11-B12_mean.tif

 

NOTE: later images do work again, so maybe it's a temporary issue?

s2-weekly_2026-03-23_2026-03-29_B02-B03-B04-B08-B11-B12_mean

s2-weekly_2026-03-30_2026-04-05_B02-B03-B04-B08-B11-B12_mean

 

Another strange thing: since these images, I've had to add the code below, because the prefix isn't being sent anymore and the field names have changed. Is that to be expected?

 

collection_info = conn.describe_collection(collection)
summaries = collection_info["summaries"]
if "eo:bands" in summaries:
    band_key = "eo:bands"
    data_type = "type"
elif "raster:bands" in summaries:
    band_key = "raster:bands"
    data_type = "type"
elif "bands" in summaries:
    band_key = "bands"
    data_type = "data_type"
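For reference, this is how I've wrapped that fallback so far; the made-up `summaries` dict below just illustrates the newer plain-"bands" style (the exact metadata shape is my assumption):

```python
def detect_band_schema(summaries):
    """Return (band_key, data_type_key) for whichever band summary style is present."""
    if "eo:bands" in summaries:
        return "eo:bands", "type"
    if "raster:bands" in summaries:
        return "raster:bands", "type"
    if "bands" in summaries:
        return "bands", "data_type"
    raise KeyError("no band summary found in collection metadata")

# Made-up metadata in the newer plain-"bands" style:
summaries = {"bands": [{"name": "B02", "data_type": "uint16"},
                       {"name": "B03", "data_type": "uint16"}]}
band_key, dtype_key = detect_band_schema(summaries)
names = [b["name"] for b in summaries[band_key]]
dtypes = [b[dtype_key] for b in summaries[band_key]]
print(names, dtypes)  # ['B02', 'B03'] ['uint16', 'uint16']
```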

 

I also saw there was a deprecation warning for the cloud masking we use: mask_scl_dilation

cube = cube.process(
    "mask_scl_dilation",
    data=cube,
    scl_band_name=cloud_filter_band,
)

If we want to switch to the newer to_scl_dilation_mask, I see in the doc (https://docs.openeo.cloud/processes/#to_scl_dilation_mask) that you can pass mask1_values & mask2_values. Are these used for filtering? And if so, which values did the old process use, so we can match the old functionality?
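To make the question concrete: my understanding from the doc is that mask1_values and mask2_values each list SCL class codes that feed one of two masks (each with its own dilation kernel), which are then combined. Here's a pure-Python sketch of just the class-membership part, with dilation omitted; the class codes follow the Sen2Cor SCL legend (3 = cloud shadow, 8/9 = cloud medium/high probability, 10 = thin cirrus, 11 = snow), but the value sets are illustrative guesses, not the backend's defaults:

```python
def scl_class_mask(scl_values, mask_values):
    """1 where the SCL class is in mask_values, else 0.
    In the real process each mask is also dilated with its own kernel."""
    wanted = set(mask_values)
    return [1 if v in wanted else 0 for v in scl_values]

scl_row = [4, 4, 9, 3, 5, 10]               # hypothetical SCL pixel row
m1 = scl_class_mask(scl_row, [3, 8, 9])     # e.g. shadows + clouds (guess)
m2 = scl_class_mask(scl_row, [10, 11])      # e.g. cirrus + snow (guess)
combined = [a | b for a, b in zip(m1, m2)]  # the two masks are combined
print(combined)  # [0, 0, 1, 1, 0, 1]
```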

 

 

Anyway, I see this error when I process these images:

26/04/10 08:16:47 INFO Configuration: resource-types.xml not found
26/04/10 08:16:47 INFO ResourceUtils: Unable to find 'resource-types.xml'.

I've also tried playing with the memory parameters, but to no avail.


[   {   'id': '[1775808997400, 9223372036854775807]',
        'level': 'info',
        'message': "EJR creating job_id='j-2604100816374d3e836a8d5081c1f897' "
                   "created='2026-04-10T08:16:37Z'",
        'time': '2026-04-10T08:16:37.400Z'},
    {   'id': '[1775808998671, 9223372036854775807]',
        'level': 'info',
        'message': "Starting job 'j-2604100816374d3e836a8d5081c1f897' from "
                   "user 'xxx'",
        'time': '2026-04-10T08:16:38.671Z'},
    {   'id': '[1775808998671, 9223372036854775807]',
        'level': 'info',
        'message': "EJR update job_id='j-2604100816374d3e836a8d5081c1f897' "
                   "data={'proxy_user': 'xxx'}",
        'time': '2026-04-10T08:16:38.671Z'},
    {   'id': '[1775808998737, 9223372036854775807]',
        'level': 'info',
        'message': 'Determined container image '
                   "'vito-docker-private.artifactory.vgt.vito.be/openeo-yarn-python311:20260331-702' "
                   'from process graph with set(udf_runtimes)=set()',
        'time': '2026-04-10T08:16:38.737Z'},
    {   'id': '[1775808998737, 9223372036854775807]',
        'level': 'info',
        'message': 'No job_options["image-name"] specified, setting fallback '
                   "'vito-docker-private.artifactory.vgt.vito.be/openeo-yarn-python311:20260331-702'",
        'time': '2026-04-10T08:16:38.737Z'},
    {   'id': '[1775808998737, 9223372036854775807]',
        'level': 'info',
        'message': 'Best image match for runtimes=set(): '
                   "'vito-docker-private.artifactory.vgt.vito.be/openeo-yarn-python311:20260331-702'",
        'time': '2026-04-10T08:16:38.737Z'},
    {   'id': '[1775808998761, 9223372036854775807]',
        'level': 'info',
        'message': 'ZooKeeperUserDefinedProcessRepository with '
                   "self._root='/openeo/udps' self._zk_client_reuse=True",
        'time': '2026-04-10T08:16:38.761Z'},
    {   'id': '[1775809001083, 9223372036854775807]',
        'level': 'warning',
        'message': 'Failed to enrich collection metadata of '
                   'SENTINEL2_L1C_INCD: https://resto.c-scale.zcu.cz',
        'time': '2026-04-10T08:16:41.083Z'},
    {   'id': '[1775809001302, 9223372036854775807]',
        'level': 'warning',
        'message': 'No STAC data available for collection with id '
                   'sentinel-3-olci',
        'time': '2026-04-10T08:16:41.302Z'},
    {   'id': '[1775809001348, 9223372036854775807]',
        'level': 'warning',
        'message': 'No STAC data available for collection with id '
                   'byoc-a9743257-ef21-4fd3-999d-15c5c7fcacbd',
        'time': '2026-04-10T08:16:41.348Z'},
    {   'id': '[1775809003834, 9223372036854775807]',
        'level': 'info',
        'message': "Merging SENTINEL2_L1C from ['SENTINEL2_L1C_SENTINELHUB']",
        'time': '2026-04-10T08:16:43.834Z'},
    {   'id': '[1775809003835, 9223372036854775807]',
        'level': 'info',
        'message': "Merging SENTINEL2_L2A from ['SENTINEL2_L2A', "
                   "'TERRASCOPE_S2_TOC_V2', 'SENTINEL2_L2A_SENTINELHUB']",
        'time': '2026-04-10T08:16:43.835Z'},
    {   'id': '[1775809003860, 9223372036854775807]',
        'level': 'info',
        'message': 'Dry run extracted these source constraints: '
                   "[(('load_collection', ('TERRASCOPE_S2_TOC_V2', "
                   "(('eo:cloud_cover', (('lte', 80.0),)),), ('B02', 'B03', "
                   "'B04', 'B08', 'B11', 'B12', 'SCL'))), {'pixel_buffer': "
                   "{'buffer_size': [100.5, 100.5]}, 'temporal_extent': "
                   "('2026-03-02', '2026-03-09'), 'spatial_extent': {'west': "
                   "2.530489164377812, 'south': 50.646159831055506, 'east': "
                   "5.952960980443944, 'north': 51.503991645411425, 'crs': "
                   "'EPSG:4326'}, 'bands': ['B02', 'B03', 'B04', 'B08', 'B11', "
                   "'B12', 'SCL'], 'process_type': [<ProcessType.FOCAL_SPACE: "
                   "6>], 'properties': {'eo:cloud_cover': {'process_graph': "
                   "{'lte1': {'process_id': 'lte', 'arguments': {'x': "
                   "{'from_parameter': 'value'}, 'y': 80.0}, 'result': "
                   'True}}}}})]',
        'time': '2026-04-10T08:16:43.860Z'},
    {   'id': '[1775809003860, 9223372036854775807]',
        'level': 'info',
        'message': 'Doing setup_kerberos_auth',
        'time': '2026-04-10T08:16:43.860Z'},
    {   'id': '[1775809003885, 9223372036854775807]',
        'level': 'info',
        'message': 'Using '
                   "image_name='vito-docker-private.artifactory.vgt.vito.be/openeo-yarn-python311:20260331-702'",
        'time': '2026-04-10T08:16:43.885Z'},
    {   'id': '[1775809003939, 9223372036854775807]',
        'level': 'warning',
        'message': '/opt/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py:1064: '
                   'InsecureRequestWarning: Unverified HTTPS request is being '
                   "made to host 'ipa01.int-services.rscluster.vito.be'. "
                   'Adding certificate verification is strongly advised. See: '
                   'https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings\n'
                   '  warnings.warn(\n',
        'time': '2026-04-10T08:16:43.939Z'},
    {   'id': '[1775809004019, 9223372036854775807]',
        'level': 'info',
        'message': "_verify_proxy_user: valid 'xxx'",
        'time': '2026-04-10T08:16:44.019Z'},
    {   'id': '[1775809004020, 9223372036854775807]',
        'level': 'info',
        'message': 'Submitting job with command '
                   "['/opt/venv/lib64/python3.8/site-packages/openeogeotrellis/deploy/submit_batch_job_spark3.sh', "
                   "'openEO "
                   'batch_//dg3.be/alp/datagis/satellite_periodic/BEFL/s2-agri-weekly/s2-agri-weekly_2026-03-02_2026-03-08_B02-B03-B04-B08-B11-B12_mean.tif_j-2604100816374d3e836a8d5081c1f897_user '
                   "xxx@egi.eu', "
                   "'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897_4yghh49b.in', "
                   "'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897', "
                   "'out', 'job_metadata.json', 'openeo@VGT.VITO.BE', "
                   "'/opt/openeo.keytab', 'xxx', '1.2.0', '8G', "
                   "'8G', '8G', '5', '2', '8G', 'default', 'false', '[]', "
                   "'custom_processes.py', '80', "
                   "'xxx@egi.eu', "
                   "'j-2604100816374d3e836a8d5081c1f897', '0.1', '1', "
                   "'default', "
                   "'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897_lgfvt9jv.properties', "
                   "'', 'INFO', '/opt/backendconfig_mep.py', "
                   "'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897/udf-py-deps.d', "
                   "'https://jobregistry.vgt.vito.be', 'mep-prod', "
                   "'openeo-elastic-job-registry:344c737d-760e-4649-afb2-b11afe8c718e@https://sso.terrascope.be/auth/realms/terrascope', "
                   "'/var/lib/sss/pipes:/var/lib/sss/pipes:rw,/etc/krb5.conf:/etc/krb5.conf:ro,/data/MTDA:/data/MTDA:ro,/data/projects/OpenEO:/data/projects/OpenEO:rw,/data/users:/data/users:rw,/data/open:/data/open:ro,/data/worldcereal_data:/data/worldcereal_data:ro,/etc/hadoop/conf/:/etc/hadoop/conf/:ro,/tmp_epod:/tmp_epod:rw', "
                   "'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897/udf-py-deps.zip', "
                   "'OPENEO_STAC_OIDC_CLIENT_SECRET_STAC_OPENEO_DEV "
                   'OPENEO_STAC_OIDC_CLIENT_SECRET_STAC_OPENEO SWIFT_URL '
                   'SWIFT_ACCESS_KEY_ID SWIFT_SECRET_ACCESS_KEY '
                   "CORSA_MODEL_DIR', '8589934592', 'hdfs:///spark2-history/', "
                   "'hdfs:///spark2-history/', 'epod-ha.vgt.vito.be:18481', "
                   "'']",
        'time': '2026-04-10T08:16:44.020Z'},
    {   'id': '[1775809010636, 9223372036854775807]',
        'level': 'info',
        'message': 'Submitted job, output was: Warning: Ignore classpath  with '
                   'proxy user specified in Cluster mode when '
                   'spark.submit.proxyUser.allowCustomClasspathInClusterMode '
                   'is disabled\n'
                   '26/04/10 08:16:46 WARN NativeCodeLoader: Unable to load '
                   'native-hadoop library for your platform... using '
                   'builtin-java classes where applicable\n'
                   '26/04/10 08:16:46 WARN DomainSocketFactory: The '
                   'short-circuit local reads feature cannot be used because '
                   'libhadoop cannot be loaded.\n'
                   '26/04/10 08:16:47 INFO Configuration: resource-types.xml '
                   'not found\n'
                   '26/04/10 08:16:47 INFO ResourceUtils: Unable to find '
                   "'resource-types.xml'.\n"
                   '26/04/10 08:16:47 INFO Client: Verifying our application '
                   'has not requested more than the maximum memory capability '
                   'of the cluster (52224 MB per container)\n'
                   '26/04/10 08:16:47 INFO Client: Will allocate AM container, '
                   'with 16384 MB memory including 8192 MB overhead\n'
                   '26/04/10 08:16:47 INFO Client: Setting up container launch '
                   'context for our AM\n'
                   '26/04/10 08:16:47 INFO Client: Setting up the launch '
                   'environment for our AM container\n'
                   '26/04/10 08:16:47 INFO Client: Preparing resources for our '
                   'AM container\n'
                   '26/04/10 08:16:47 INFO Client: Uploading resource '
                   'file:/opt/layercatalog.json -> '
                   'hdfs://hacluster/user/xxx/.sparkStaging/application_1773834702471_112912/layercatalog.json\n'
                   '26/04/10 08:16:47 INFO Client: Uploading resource '
                   'file:/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897_4yghh49b.in '
                   '-> '
                   'hdfs://hacluster/user/xxx/.sparkStaging/application_1773834702471_112912/j-2604100816374d3e836a8d5081c1f897_4yghh49b.in\n'
                   '26/04/10 08:16:47 INFO Client: Uploading resource '
                   'file:/opt/openeo-logging-static.jar -> '
                   'hdfs://hacluster/user/xxx/.sparkStaging/application_1773834702471_112912/openeo-logging-static.jar\n'
                   '26/04/10 08:16:47 INFO Client: Uploading resource '
                   'file:/opt/client.conf -> '
                   'hdfs://hacluster/user/xxx/.sparkStaging/application_1773834702471_112912/client.conf\n'
                   '26/04/10 08:16:48 INFO Client: Uploading resource '
                   'file:/opt/http_credentials.json -> '
                   'hdfs://hacluster/us

Sentinel-1 relative orbit number selection

Hi.

I am using the openEO API in Python to download Sentinel-1 data. I want to download the data for a specific relative orbit number and orbit direction, but I could not get the load_collection method with the properties argument to work with the usual STAC keys for these values. Any ideas how I can do this?
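For reference, this is the kind of predicate-style properties filter I've been building (the property names are my guesses based on STAC; I suspect the backend expects different names, which is part of my question):

```python
# Predicate-style property filters, in the shape load_collection(..., properties=...)
# accepts in the openeo Python client. The property names below are assumptions --
# the collection summaries would show which names the backend actually exposes
# (some backends use e.g. "orbitDirection"/"relativeOrbitNumber" instead of the
# STAC "sat:orbit_state"/"sat:relative_orbit").
properties = {
    "sat:orbit_state": lambda od: od == "ASCENDING",
    "sat:relative_orbit": lambda n: n == 88,
}

# Usage (requires an authenticated backend connection):
# cube = conn.load_collection("SENTINEL1_GRD",
#                             temporal_extent=["2025-05-01", "2025-06-01"],
#                             bands=["VV", "VH"],
#                             properties=properties)
print(properties["sat:orbit_state"]("ASCENDING"))  # True
```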

Error downloading TERRASCOPE_S2_TOC_V2[2025-05-26 / 2025-06-02]

Hello

I keep getting an error when downloading a specific S2 image range. I suspect an image corruption may have occurred, because other time periods work just fine.

I get a pretty huge stacktrace, but these are the highlights:


    {   'id': '[1749817012387, 17005]',
        'level': 'info',
        'message': 'load_collection: Creating raster datacube for '
                   "TERRASCOPE_S2_TOC_V2 with arguments {'temporal_extent': "
                   "('2025-05-26', '2025-06-02'), 'spatial_extent': {'west': "
                   "2.530489164377812, 'south': 50.646159831055506, 'east': "
                   "5.952960980443944, 'north': 51.503991645411425, 'crs': "
                   "'EPSG:4326'}, 'global_extent': {'west': 466800, 'south': "
                   "5610580, 'east': 708770, 'north': 5710010, 'crs': "
                   "'EPSG:32631'}, 'bands': ['B02', 'B03', 'B04', 'B08', "
                   "'B11', 'B12', 'SCL'], 'properties': {'eo:cloud_cover': "
                   "{'process_graph': {'lte1': {'result': True, 'process_id': "
                   "'lte', 'arguments': {'x': {'from_parameter': 'value'}, "
                   "'y': 80}}}}}, 'aggregate_spatial_geometries': None, "
                   "'sar_backscatter': None, 'process_types': "
                   "{<ProcessType.GLOBAL_TIME: 4>}, 'custom_mask': {'method': "
                   "'mask_scl_dilation', 'scl_band_name': None}, 'data_mask': "
                   "None, 'target_crs': None, 'target_resolution': None, "
                   "'resample_method': 'near', 'pixel_buffer': None}, "
                   "environment: {'vault_token': None, "
                   "'sentinel_hub_client_alias': 'default', "
                   "'max_soft_errors_ratio': 0.1, 'dependencies': [], "
                   "'pyramid_levels': 'highest', 'require_bounds': True, "
                   "'correlation_id': 'j-250613121558405e905bcf390fb6547e', "
                   "'user': "
                   "User('xxx@egi.eu', "
                   "None), 'openeo_api_version': '1.2'}",
        'time': '2025-06-13T12:16:52.387Z'},
    ...
    {   'id': '[1749819025201, 45095]',
        'level': 'warning',
        'message': 'A part of the process graph failed, and will be retried, '
                   'the reason was: "Job aborted due to stage failure: Task 0 '
                   'in stage 11.0 failed 4 times, most recent failure: Lost '
                   'task 0.3 in stage 11.0 (TID 1142) (epod189.vgt.vito.be '
                   'executor 23): ExecutorLostFailure (executor 23 exited '
                   'caused by one of the running tasks) Reason: Container from '
                   'a bad node: container_e5156_1749151415540_18817_01_000034 '
                   'on host: epod189.vgt.vito.be. Exit status: 143. '
                   'Diagnostics: [2025-06-13 14:50:24.695]Container killed on '
                   'request. Exit code is 143\n'
                   '[2025-06-13 14:50:24.716]Container exited with a non-zero '
                   'exit code 143. \n'
                   '[2025-06-13 14:50:24.716]Killed by external signal\n'
                   '.\n'
                   'Driver stacktrace:"\n'
                   'Your job may still complete if the failure was caused by a '
                   'transient error, but will take more time. A common cause '
                   'of transient errors is too little executor memory '
                   '(overhead). Too low executor-memory can be seen by a high '
                   "'garbage collection' time, which was: 0.008 seconds.\n",
        'time': '2025-06-13T12:50:25.201Z'},
    ...
    {   'id': '[1749819071990, 25576]',
        'level': 'error',
        'message': 'OpenEO batch job failed: Your batch job failed because '
                   'workers used too much memory. The same task was attempted '
                   'multiple times. Consider increasing executor-memory, '
                   'python-memory or executor-memoryOverhead or contact the '
                   'developers to investigate.',
        'time': '2025-06-13T12:51:11.990Z'}]
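For reference, my understanding is that the knobs the error message refers to are batch job options, something like the following (option names taken from the error message; the values are illustrative guesses, not recommendations):

```python
# Batch job options as a plain dict; the names come from the error message,
# the values are guesses that would need tuning for the actual workload.
job_options = {
    "executor-memory": "4G",          # JVM memory per executor
    "executor-memoryOverhead": "4G",  # off-heap overhead per executor
    "python-memory": "4G",            # memory for the Python processes
}

# Usage (requires an authenticated connection and a cube):
# job = cube.execute_batch(outputfile="result.tif", job_options=job_options)
print(sorted(job_options))  # ['executor-memory', 'executor-memoryOverhead', 'python-memory']
```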

 

Thanks!

Problems downloading Sentinel-2 images

For several days (May 2025) I have been experiencing problems when trying to download images from Sentinel 2. When I click on the download button I get the message “Something went wrong while fetching products”. Is it a known issue?

Increased Load on FreeIPA Servers

We are currently experiencing elevated load on the FreeIPA servers, which may intermittently affect login functionality and Kerberos authentication. Our team is actively working to optimize performance and resolve these issues.

We appreciate your patience and will provide updates as the situation evolves.

Update (07-07-2025):

The cause of the elevated load was found and resolved. Login functionality and Kerberos authentication should be more consistent now.

Firewall maintenance 25th of March

As part of our ongoing infrastructure enhancements, a scheduled service interruption will take place on Tuesday, March 25th, from 12:00 to 14:00 CET.

As a result of this change, some shares will be temporarily unavailable on the user VMs and no new notebooks will be able to start.

We appreciate your patience and understanding.

Possible openEO authentication issues due to EGI Check-in maintenance until approx. 14:00

Please keep in mind that due to EGI Check-in maintenance you may experience some authentication issues on openeo.vito.be until 14:00 today.

 

If you are using the openeo-python-client then the following line might cause an unclear error due to this maintenance:

openeo.connect("openeo.vito.be/openeo/1.1").authenticate_oidc()

Questions about processing of LAI, NDVI, and FAPAR timeseries

Dear Terrascope-forum,

I have a few questions regarding the processing of LAI, NDVI, and FAPAR timeseries. I successfully generated timeseries for my own polygon using the notebooks.

However, I was wondering how clouds and shadows are handled in these timeseries. Are pixels with clouds or shadows automatically excluded from the calculations, or do they count as invalid pixels (from which you can select the percentage)? Additionally, does the timeseries for a polygon calculate the average of all pixels for which data is available, or how does this work?

Furthermore, I would like to know if it is possible to generate a 3D dataset for my polygon, where the NDVI values over time are displayed per pixel. Is this something that can be implemented in the notebooks?

Thank you in advance for your help!

Best regards

Emma

Temporal composite function in the process wizard

I am new to the OpenEO web editor and Copernicus in general, but it seems to be working great for me so far. My question is about the temporal composite function in the process wizard, which also seems to be working great. What does the temporal composite function in the wizard actually do? So for example, if I am doing an NDVI calculation for a one-month temporal coverage, what would the results be for the various temporal function options? I should mention that what I am ultimately interested in doing is finding the average NDVI for a region of about 1,000 acres of wooded terrain every few days over several summer months and over several years. Thanks for any advice you can offer.
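To make my question concrete, here is my understanding of what the different composite options would do for a single pixel's NDVI observations within one window (made-up numbers):

```python
import statistics

# One pixel's valid NDVI observations within a one-month window (made-up values)
ndvi = [0.61, 0.55, 0.70, 0.66]

composites = {
    "mean": statistics.mean(ndvi),     # average of all valid observations
    "median": statistics.median(ndvi), # more robust to remaining outliers
    "max": max(ndvi),                  # classic "greenest pixel" composite
    "first": ndvi[0],                  # earliest valid observation
}
print(round(composites["mean"], 3))    # 0.63
print(round(composites["median"], 3))  # 0.635
print(composites["max"])               # 0.7
```

Is that roughly what the wizard's temporal composite options compute per pixel?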

Downloading data using geometry

Dear all,

I am using the python API. 

When downloading data like ESA WorldCover, does the API always return the whole tile that the area of interest lies in (and multiple tiles when the area of interest intersects several)? For my use case, I want to pass a geometry, such as the administrative boundary of a region, or a bounding box, and only get the data within that boundary rather than the whole tile. I also opened an issue on GitHub (https://github.com/VITObelgium/terracatalogueclient/issues/6). Moreover, when the area of interest intersects multiple tiles, I would like to receive the data already merged, rather than multiple files that I have to merge myself. Is this possible? :)
