Error with product path suffix when connecting a local archive

Dear Sen4CAP-Team,

I’m trying to connect a Sen4CAP installation to a local data archive but am running into an error. The gpt commands generated by Sen4CAP reference the product paths as .../Sentinel-1/SAR/SLC/year/month/day/S1ProductID.SAFE and produce errors like this:

Error: [NodeId: ReadOp@sourceProduct] /codede/Sentinel-1/SAR/SLC/2019/09/22/dblbnd.adf (Is a directory)

When I run the same gpt command manually and append /manifest.safe to the paths of the S1 products, the error does not occur. However, changing SciHubDataSource.Sentinel1.path.suffix from .SAFE to .SAFE/manifest.safe in the services.properties file did not solve the problem.
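
For illustration, this is roughly how I call it manually (only a sketch: the graph file name and output path below are placeholders, not the actual command generated by Sen4CAP):

# placeholder graph and target path; only the trailing source path matters here
gpt s1_preprocessing_graph.xml \
    -t /tmp/s1_test_output \
    /codede/Sentinel-1/SAR/SLC/2019/09/22/S1ProductID.SAFE/manifest.safe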

Is there another way to change this product referencing behavior?

I’m using the German CODE-DE archive, which should be similar to CREODIAS if I’m not mistaken. These are the settings I’ve put in the services.properties file:

SciHubDataSource.Sentinel1.enabled = true
SciHubDataSource.Sentinel1.fetch_mode = 4
SciHubDataSource.Sentinel1.scope = 3
SciHubDataSource.Sentinel1.local_archive_path = /codede/Sentinel-1/SAR/SLC
SciHubDataSource.Sentinel1.local.archive.path.format = yyyy/MM/dd
SciHubDataSource.Sentinel1.path.suffix = .SAFE/manifest.safe
SciHubDataSource.Sentinel1.product.format = folder

Best regards,
Felix

Dear Felix,

Normally, you should not add manifest.safe to the path suffix.
Could you please provide an example path for one of your products, e.g. by changing into one particular SLC product directory and typing “pwd”?
Also, if possible, it would be faster if you could give us access to that machine so we can have a look at the structure.

Please let us know.

Best regards,
Cosmin

Dear Cosmin,

Thank you very much for your reply.

This is the full path of an example SLC product:
/codede/Sentinel-1/SAR/SLC/2021/03/04/S1B_IW_SLC__1SDH_20210304T170653_20210304T170718_025868_0315CA_0973.SAFE

Best regards,
Felix

Dear Felix,

Actually, there is no issue with your configuration, and you should not change SciHubDataSource.Sentinel1.path.suffix from .SAFE to .SAFE/manifest.safe (that change is incorrect).
The SLC products were actually ingested correctly by the system; the issue is with the SNAP processing.
The problem is that you have mounted your /codede repository using s3fs, and there is an issue with the s3fs mount. With this kind of mount, if you do:

[eouser@test ~] ls /codede/Sentinel-1/SAR/SLC/2019/03/07/hello.world -l
total 0
[eouser@test ~]

The same happens with any other non-existent name: s3fs reports that it exists when it actually does not.
It seems that during pre-processing SNAP checks whether a certain file (/codede/Sentinel-1/SAR/SLC/2019/09/22/dblbnd.adf) exists; this s3fs behaviour confuses it and it returns an error.
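
You can reproduce this confusing behaviour from the shell (a quick sketch; the .adf file name is simply the one from your error message):

# On a correctly behaving filesystem this prints "missing"; with the problematic
# s3fs mount it can print "exists" even though no such file was ever created.
if [ -e /codede/Sentinel-1/SAR/SLC/2019/09/22/dblbnd.adf ]; then echo "exists"; else echo "missing"; fi
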
To solve this issue, there are several options:

  1. Change the mounting of /codede to NFS or goofys instead of s3fs (a goofys sketch follows after this list)
  2. If the first option is not possible, change the Fetch Mode for your S1 data source (not needed for S2) from “Direct link to product” to “Symbolic link”. After you do that, the following operations are needed:
  • delete the S1 entries from the downloader_history table

psql -U admin sen4cap
sen4cap=# delete from l1_tile_history where downloader_history_id in (select id from downloader_history where satellite_id = 3);
sen4cap=# delete from downloader_history where satellite_id = 3;
sen4cap=# \q

  • restart the sen4cap services

sudo systemctl restart sen2agri-services
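
Regarding the first option, a goofys mount would look roughly like this (just a sketch: the bucket name and endpoint below are placeholders, not the actual CODE-DE values, and credentials are assumed to be configured separately):

# mount the S3 bucket at /codede; bucket name and endpoint are placeholders
goofys --endpoint https://s3.example-cloud.eu codede-bucket /codede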

Also, I noticed that your MAJA processing did not start because apparently Sen4CAP/Sen4CAPDistribution/install_script/config/maja/UserConfiguration was not automatically copied into /mnt/archive/gipp/maja/. Once this was copied and the product statuses were reset, the L2A processing works OK:

psql -U admin sen4cap
sen4cap=# delete from l1_tile_history where downloader_history_id in (select id from downloader_history where satellite_id = 1 and status_id = 6);
sen4cap=# update downloader_history set status_id = 2 where satellite_id = 1 and status_id = 6;
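
For reference, the copy step itself would look something like this (a sketch; the source path depends on where you unpacked the Sen4CAP distribution):

# copy the MAJA user configuration into the GIPP folder; adjust the source prefix
sudo cp -r /path/to/Sen4CAP/Sen4CAPDistribution/install_script/config/maja/UserConfiguration /mnt/archive/gipp/maja/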

Best regards,
Cosmin


Dear Cosmin,

I chose the second option, set the Fetch Mode to “Symbolic link”, and deleted the S1 entries from the downloader_history table. Everything seems to work now.

Thank you so much, I wouldn’t have figured this out by myself!

Best regards,
Felix