Abstract
The SAR2CUBE project, launched in February 2020, pursues two objectives simultaneously: to facilitate the use of SAR products in the scientific EO community and promote them as relevant EO assets, and to make hosting SAR products more feasible for data providers. The Sentinel missions within the Copernicus program have created a new playground with an extraordinary and unique amount of EO information. In particular, the radar pair formed by the twins Sentinel-1A and Sentinel-1B has been delivering a constant stream of SAR data since their launches in April 2014 and April 2016, respectively. However, the interferometric capabilities provided by this source remain underused. The particular nature of complex interferometric data often presents a barrier to incorporating it into processing chains, whereas the more intuitive nature of other sensor types, such as optical or multi-spectral data, eases their integration into different analysis frameworks. To lower the entry barrier of InSAR-derived products, the SAR2CUBE project provides both SAR and InSAR analysis-ready data (ARD) specifically designed for efficient and flexible processing and analysis of this valuable source of information. The first step on the scientific side of the project has been the definition of all the information that has to be stored in the data cubes: the original SLC data from Sentinel-1 as well as the auxiliary data needed during the workflow to compute an analysis-ready data product, for example a digital elevation model (DEM) and precise orbit information. With the developed SAR SLC datacube, all pre-processing steps can be applied without altering the nature of the original data. Further layers for efficient geocoding have been added as well.
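As an illustration of this layered data model, the sketch below shows one possible layout in which the complex SLC values are preserved unaltered (split into in-phase and quadrature bands) alongside auxiliary layers for height and geocoding. All layer names, shapes, and coordinate values are hypothetical and purely illustrative; this is not the project's actual schema.

```python
import numpy as np

def make_slc_cube(n_times=3, n_rows=4, n_cols=5, seed=0):
    """Build a toy SLC datacube: unaltered complex SLC data plus auxiliary layers.

    Layer names ("i", "q", "dem_height", "lat", "lon") are illustrative only.
    """
    rng = np.random.default_rng(seed)
    slc = rng.normal(size=(n_times, n_rows, n_cols)) \
        + 1j * rng.normal(size=(n_times, n_rows, n_cols))
    return {
        # original SLC data, kept losslessly as real/imaginary components
        "i": slc.real,
        "q": slc.imag,
        # auxiliary layers used when computing an ARD product
        "dem_height": rng.uniform(0, 3000, size=(n_rows, n_cols)),
        "lat": np.linspace(46.0, 46.1, n_rows)[:, None].repeat(n_cols, axis=1),
        "lon": np.linspace(11.0, 11.2, n_cols)[None, :].repeat(n_rows, axis=0),
    }

def interferogram(cube, t1, t2):
    """Form a complex interferogram on the fly from two acquisitions.

    Possible only because the complex SLC values were stored unaltered;
    the interferometric phase is np.angle() of the result.
    """
    s1 = cube["i"][t1] + 1j * cube["q"][t1]
    s2 = cube["i"][t2] + 1j * cube["q"][t2]
    return s1 * np.conj(s2)

cube = make_slc_cube()
ifg = interferogram(cube, 0, 1)
print(ifg.shape)  # (4, 5)
```

Because the cube keeps the full complex signal rather than a derived product, interferometric operations remain available at any point in the workflow.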
The second part of the project deals with the implementation of on-the-fly processors for ARD products based on the developed data model. Different types of analysis require different levels of filtering, depending on their robustness to noisy data. In some cases a very strong speckle filter might be desired to produce a smooth image; in other cases, a certain level of noise might be tolerated in order to preserve spatial detail. This requires the adaptation of existing methods to a data-cube-ready implementation, as opposed to working on the traditional file system, but it also opens the door to novel methods that fully exploit access to the complete time series in the cube domain. See figure 1 for a general workflow of the idea. During Fringe we will provide an introduction to the project, the basic ideas behind it, and a first proof-of-concept implementation based on the Open Data Cube and the openEO API for data access and processing.
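The smoothness-versus-detail trade-off described above can be illustrated with a minimal boxcar (moving-average) speckle filter, where the window size plays the role of the filter strength. This is a generic stand-in, not one of the project's actual processors; the function name and parameters are assumptions for illustration.

```python
import numpy as np

def boxcar_filter(intensity, window):
    """Boxcar speckle filter: a larger window suppresses more speckle
    noise but also removes more spatial detail."""
    if window <= 1:
        return intensity.astype(float).copy()
    pad = window // 2
    padded = np.pad(intensity.astype(float), pad, mode="edge")
    rows, cols = intensity.shape
    out = np.zeros((rows, cols))
    # sum over the window by shifting the padded array, then normalize
    for dr in range(window):
        for dc in range(window):
            out += padded[dr:dr + rows, dc:dc + cols]
    return out / (window * window)

# Speckle-like (exponentially distributed) intensity over a constant scene:
# stronger filtering yields a lower residual standard deviation.
rng = np.random.default_rng(1)
scene = rng.exponential(scale=1.0, size=(64, 64))
weak = boxcar_filter(scene, 3)     # mild smoothing, detail preserved
strong = boxcar_filter(scene, 9)   # strong smoothing, detail lost
print(scene.std(), weak.std(), strong.std())
```

In a data-cube setting, a filter like this would be applied on the fly with a user-chosen strength, rather than baked into pre-generated files.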