Hi,
Since March, I've been trying to download mosaics for the images below, but I keep getting errors.
s2-weekly_2026-02-23_2026-03-01_B02-B03-B04-B08-B11-B12_mean.tif
s2-weekly_2026-03-02_2026-03-08_B02-B03-B04-B08-B11-B12_mean.tif
s2-weekly_2026-03-09_2026-03-15_B02-B03-B04-B08-B11-B12_mean.tif
s2-weekly_2026-03-16_2026-03-22_B02-B03-B04-B08-B11-B12_mean.tif
NOTE: later images do work again, so maybe it's a temporary issue?
s2-weekly_2026-03-23_2026-03-29_B02-B03-B04-B08-B11-B12_mean
s2-weekly_2026-03-30_2026-04-05_B02-B03-B04-B08-B11-B12_mean
Another strange thing: since these images, I've had to add the code below, because the prefix isn't being sent anymore and the field names have changed. Is that to be expected?
collection_info = conn.describe_collection(collection)
summaries = collection_info["summaries"]
if "eo:bands" in summaries:
    band_key = "eo:bands"
    data_type = "type"
elif "raster:bands" in summaries:
    band_key = "raster:bands"
    data_type = "type"
elif "bands" in summaries:
    band_key = "bands"
    data_type = "data_type"
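For reference, the fallback chain above can be condensed into a small self-contained helper. The mocked summaries dict below is an assumption about the newer layout (a plain "bands" list whose entries carry a "data_type" field), not taken from the actual collection metadata:

```python
# Minimal, self-contained version of the band-metadata fallback, covering
# the three layouts: "eo:bands" / "raster:bands" (each with a "type" field)
# and the newer plain "bands" (with "data_type").
def resolve_band_metadata(summaries):
    """Return (band_key, data_type_key) for whichever convention is present."""
    for band_key, data_type_key in (
        ("eo:bands", "type"),
        ("raster:bands", "type"),
        ("bands", "data_type"),
    ):
        if band_key in summaries:
            return band_key, data_type_key
    raise KeyError("no band metadata found in collection summaries")

# Mocked summaries in the assumed newer layout:
summaries = {"bands": [{"name": "B02", "data_type": "uint16"}]}
print(resolve_band_metadata(summaries))  # -> ('bands', 'data_type')
```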
I also saw there was a deprecation warning for the cloud masking we use: mask_scl_dilation
cube = cube.process(
    "mask_scl_dilation",
    data=cube,
    scl_band_name=cloud_filter_band,
)
If we want to use the newer to_scl_dilation_mask, I see in the docs (https://docs.openeo.cloud/processes/#to_scl_dilation_mask) that you can pass mask1_values & mask2_values. Are these used for filtering? And if so, which values did the old process use, so we can match the old functionality?
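For what it's worth, this is a sketch of how those parameters could be passed once the right values are known. The SCL class codes below are only illustrative placeholders taken from the standard Sentinel-2 SCL legend; they are not confirmed to match mask_scl_dilation's old defaults:

```python
# Sketch only: the SCL codes here are ILLUSTRATIVE placeholders
# (3 = cloud shadow, 8/9 = cloud medium/high probability, 10 = thin cirrus,
# 11 = snow), NOT verified against mask_scl_dilation's actual behaviour.
scl_dilation_args = {
    "mask1_values": [3, 8, 9, 10],  # placeholder: shadow + clouds + cirrus
    "mask2_values": [11],           # placeholder: snow
}

# With an SCL cube loaded, the call would then look like:
# mask = scl_cube.process("to_scl_dilation_mask", data=scl_cube, **scl_dilation_args)
# cube = cube.mask(mask)
```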
Anyway, this is the error I see when I run the job for these images:
26/04/10 08:16:47 INFO Configuration: resource-types.xml not found
26/04/10 08:16:47 INFO ResourceUtils: Unable to find 'resource-types.xml'.
I've also tried playing with the memory parameters, but to no avail.
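For context, a sketch of the kind of memory overrides that can be passed, assuming the openEO Geotrellis backend's job_options keys; the sizes below are placeholders, not recommendations:

```python
# Sketch of batch-job memory overrides, assuming the openEO Geotrellis
# backend's job_options convention; sizes are placeholders.
job_options = {
    "driver-memory": "8G",
    "executor-memory": "4G",
    "executor-memoryOverhead": "2G",
    "max-executors": 20,
}

# Passed when starting the batch job, e.g.:
# job = cube.execute_batch("mosaic.tif", job_options=job_options)
```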
[ { 'id': '[1775808997400, 9223372036854775807]',
'level': 'info',
'message': "EJR creating job_id='j-2604100816374d3e836a8d5081c1f897' "
"created='2026-04-10T08:16:37Z'",
'time': '2026-04-10T08:16:37.400Z'},
{ 'id': '[1775808998671, 9223372036854775807]',
'level': 'info',
'message': "Starting job 'j-2604100816374d3e836a8d5081c1f897' from "
"user 'xxx'",
'time': '2026-04-10T08:16:38.671Z'},
{ 'id': '[1775808998671, 9223372036854775807]',
'level': 'info',
'message': "EJR update job_id='j-2604100816374d3e836a8d5081c1f897' "
"data={'proxy_user': 'xxx'}",
'time': '2026-04-10T08:16:38.671Z'},
{ 'id': '[1775808998737, 9223372036854775807]',
'level': 'info',
'message': 'Determined container image '
"'vito-docker-private.artifactory.vgt.vito.be/openeo-yarn-python311:20260331-702' "
'from process graph with set(udf_runtimes)=set()',
'time': '2026-04-10T08:16:38.737Z'},
{ 'id': '[1775808998737, 9223372036854775807]',
'level': 'info',
'message': 'No job_options["image-name"] specified, setting fallback '
"'vito-docker-private.artifactory.vgt.vito.be/openeo-yarn-python311:20260331-702'",
'time': '2026-04-10T08:16:38.737Z'},
{ 'id': '[1775808998737, 9223372036854775807]',
'level': 'info',
'message': 'Best image match for runtimes=set(): '
"'vito-docker-private.artifactory.vgt.vito.be/openeo-yarn-python311:20260331-702'",
'time': '2026-04-10T08:16:38.737Z'},
{ 'id': '[1775808998761, 9223372036854775807]',
'level': 'info',
'message': 'ZooKeeperUserDefinedProcessRepository with '
"self._root='/openeo/udps' self._zk_client_reuse=True",
'time': '2026-04-10T08:16:38.761Z'},
{ 'id': '[1775809001083, 9223372036854775807]',
'level': 'warning',
'message': 'Failed to enrich collection metadata of '
'SENTINEL2_L1C_INCD: https://resto.c-scale.zcu.cz',
'time': '2026-04-10T08:16:41.083Z'},
{ 'id': '[1775809001302, 9223372036854775807]',
'level': 'warning',
'message': 'No STAC data available for collection with id '
'sentinel-3-olci',
'time': '2026-04-10T08:16:41.302Z'},
{ 'id': '[1775809001348, 9223372036854775807]',
'level': 'warning',
'message': 'No STAC data available for collection with id '
'byoc-a9743257-ef21-4fd3-999d-15c5c7fcacbd',
'time': '2026-04-10T08:16:41.348Z'},
{ 'id': '[1775809003834, 9223372036854775807]',
'level': 'info',
'message': "Merging SENTINEL2_L1C from ['SENTINEL2_L1C_SENTINELHUB']",
'time': '2026-04-10T08:16:43.834Z'},
{ 'id': '[1775809003835, 9223372036854775807]',
'level': 'info',
'message': "Merging SENTINEL2_L2A from ['SENTINEL2_L2A', "
"'TERRASCOPE_S2_TOC_V2', 'SENTINEL2_L2A_SENTINELHUB']",
'time': '2026-04-10T08:16:43.835Z'},
{ 'id': '[1775809003860, 9223372036854775807]',
'level': 'info',
'message': 'Dry run extracted these source constraints: '
"[(('load_collection', ('TERRASCOPE_S2_TOC_V2', "
"(('eo:cloud_cover', (('lte', 80.0),)),), ('B02', 'B03', "
"'B04', 'B08', 'B11', 'B12', 'SCL'))), {'pixel_buffer': "
"{'buffer_size': [100.5, 100.5]}, 'temporal_extent': "
"('2026-03-02', '2026-03-09'), 'spatial_extent': {'west': "
"2.530489164377812, 'south': 50.646159831055506, 'east': "
"5.952960980443944, 'north': 51.503991645411425, 'crs': "
"'EPSG:4326'}, 'bands': ['B02', 'B03', 'B04', 'B08', 'B11', "
"'B12', 'SCL'], 'process_type': [<ProcessType.FOCAL_SPACE: "
"6>], 'properties': {'eo:cloud_cover': {'process_graph': "
"{'lte1': {'process_id': 'lte', 'arguments': {'x': "
"{'from_parameter': 'value'}, 'y': 80.0}, 'result': "
'True}}}}})]',
'time': '2026-04-10T08:16:43.860Z'},
{ 'id': '[1775809003860, 9223372036854775807]',
'level': 'info',
'message': 'Doing setup_kerberos_auth',
'time': '2026-04-10T08:16:43.860Z'},
{ 'id': '[1775809003885, 9223372036854775807]',
'level': 'info',
'message': 'Using '
"image_name='vito-docker-private.artifactory.vgt.vito.be/openeo-yarn-python311:20260331-702'",
'time': '2026-04-10T08:16:43.885Z'},
{ 'id': '[1775809003939, 9223372036854775807]',
'level': 'warning',
'message': '/opt/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py:1064: '
'InsecureRequestWarning: Unverified HTTPS request is being '
"made to host 'ipa01.int-services.rscluster.vito.be'. "
'Adding certificate verification is strongly advised. See: '
'https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings\n'
' warnings.warn(\n',
'time': '2026-04-10T08:16:43.939Z'},
{ 'id': '[1775809004019, 9223372036854775807]',
'level': 'info',
'message': "_verify_proxy_user: valid 'xxx'",
'time': '2026-04-10T08:16:44.019Z'},
{ 'id': '[1775809004020, 9223372036854775807]',
'level': 'info',
'message': 'Submitting job with command '
"['/opt/venv/lib64/python3.8/site-packages/openeogeotrellis/deploy/submit_batch_job_spark3.sh', "
"'openEO "
'batch_//dg3.be/alp/datagis/satellite_periodic/BEFL/s2-agri-weekly/s2-agri-weekly_2026-03-02_2026-03-08_B02-B03-B04-B08-B11-B12_mean.tif_j-2604100816374d3e836a8d5081c1f897_user '
"xxx@egi.eu', "
"'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897_4yghh49b.in', "
"'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897', "
"'out', 'job_metadata.json', 'openeo@VGT.VITO.BE', "
"'/opt/openeo.keytab', 'xxx', '1.2.0', '8G', "
"'8G', '8G', '5', '2', '8G', 'default', 'false', '[]', "
"'custom_processes.py', '80', "
"'xxx@egi.eu', "
"'j-2604100816374d3e836a8d5081c1f897', '0.1', '1', "
"'default', "
"'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897_lgfvt9jv.properties', "
"'', 'INFO', '/opt/backendconfig_mep.py', "
"'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897/udf-py-deps.d', "
"'https://jobregistry.vgt.vito.be', 'mep-prod', "
"'openeo-elastic-job-registry:344c737d-760e-4649-afb2-b11afe8c718e@https://sso.terrascope.be/auth/realms/terrascope', "
"'/var/lib/sss/pipes:/var/lib/sss/pipes:rw,/etc/krb5.conf:/etc/krb5.conf:ro,/data/MTDA:/data/MTDA:ro,/data/projects/OpenEO:/data/projects/OpenEO:rw,/data/users:/data/users:rw,/data/open:/data/open:ro,/data/worldcereal_data:/data/worldcereal_data:ro,/etc/hadoop/conf/:/etc/hadoop/conf/:ro,/tmp_epod:/tmp_epod:rw', "
"'/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897/udf-py-deps.zip', "
"'OPENEO_STAC_OIDC_CLIENT_SECRET_STAC_OPENEO_DEV "
'OPENEO_STAC_OIDC_CLIENT_SECRET_STAC_OPENEO SWIFT_URL '
'SWIFT_ACCESS_KEY_ID SWIFT_SECRET_ACCESS_KEY '
"CORSA_MODEL_DIR', '8589934592', 'hdfs:///spark2-history/', "
"'hdfs:///spark2-history/', 'epod-ha.vgt.vito.be:18481', "
"'']",
'time': '2026-04-10T08:16:44.020Z'},
{ 'id': '[1775809010636, 9223372036854775807]',
'level': 'info',
'message': 'Submitted job, output was: Warning: Ignore classpath with '
'proxy user specified in Cluster mode when '
'spark.submit.proxyUser.allowCustomClasspathInClusterMode '
'is disabled\n'
'26/04/10 08:16:46 WARN NativeCodeLoader: Unable to load '
'native-hadoop library for your platform... using '
'builtin-java classes where applicable\n'
'26/04/10 08:16:46 WARN DomainSocketFactory: The '
'short-circuit local reads feature cannot be used because '
'libhadoop cannot be loaded.\n'
'26/04/10 08:16:47 INFO Configuration: resource-types.xml '
'not found\n'
'26/04/10 08:16:47 INFO ResourceUtils: Unable to find '
"'resource-types.xml'.\n"
'26/04/10 08:16:47 INFO Client: Verifying our application '
'has not requested more than the maximum memory capability '
'of the cluster (52224 MB per container)\n'
'26/04/10 08:16:47 INFO Client: Will allocate AM container, '
'with 16384 MB memory including 8192 MB overhead\n'
'26/04/10 08:16:47 INFO Client: Setting up container launch '
'context for our AM\n'
'26/04/10 08:16:47 INFO Client: Setting up the launch '
'environment for our AM container\n'
'26/04/10 08:16:47 INFO Client: Preparing resources for our '
'AM container\n'
'26/04/10 08:16:47 INFO Client: Uploading resource '
'file:/opt/layercatalog.json -> '
'hdfs://hacluster/user/xxx/.sparkStaging/application_1773834702471_112912/layercatalog.json\n'
'26/04/10 08:16:47 INFO Client: Uploading resource '
'file:/data/projects/OpenEO/j-2604100816374d3e836a8d5081c1f897_4yghh49b.in '
'-> '
'hdfs://hacluster/user/xxx/.sparkStaging/application_1773834702471_112912/j-2604100816374d3e836a8d5081c1f897_4yghh49b.in\n'
'26/04/10 08:16:47 INFO Client: Uploading resource '
'file:/opt/openeo-logging-static.jar -> '
'hdfs://hacluster/user/xxx/.sparkStaging/application_1773834702471_112912/openeo-logging-static.jar\n'
'26/04/10 08:16:47 INFO Client: Uploading resource '
'file:/opt/client.conf -> '
'hdfs://hacluster/user/xxx/.sparkStaging/application_1773834702471_112912/client.conf\n'
'26/04/10 08:16:48 INFO Client: Uploading resource '
'file:/opt/http_credentials.json -> '
'hdfs://hacluster/us
Comments
Hi Jeroen,
Some time ago we migrated Sentinel2 to use STAC. This might have had an influence on this process graph. When running with only the load_collection and the to_scl_dilation_mask process, I got a valid result (j-2604101000174f7aa384480073f84443). I don't know the precise root cause yet; I can check further on Monday.
Greetings,
Emile Sonneveld
Hey Emile
And if you run it with the old mask_scl_dilation?
I also get a crash with the new one:
cube = conn.load_collection(
    collection,
    spatial_extent=spatial_extent,
    temporal_extent=period,
    bands=bands_to_load,
    max_cloud_cover=max_cloud_cover,
)
scl_cube = conn.load_collection(
    collection,
    spatial_extent=spatial_extent,
    temporal_extent=period,
    max_cloud_cover=max_cloud_cover,
    bands=["SCL"],
)
mask = scl_cube.process(
    "to_scl_dilation_mask",
    data=scl_cube,
)
cube = cube.mask(mask)
Greetings
Joeri