# Usage

## Input data format
SARXarray works with coregistered SLC or interferogram stacks. It provides a reader to perform lazy loading on data stacks in different file formats, including binary format. However, we recommend storing the coregistered stack in Zarr format and loading it directly as an Xarray object with `xarray.open_zarr`.
## Loading a coregistered SLC stack in binary format
If the stack is saved in binary format, it can be read by SARXarray under two prerequisites:
- All SLCs/interferograms have the same known raster size and data type;
- All SLCs/interferograms have been resampled to the same raster grid.
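The first prerequisite exists because binary rasters are headerless: the bytes on disk carry no shape or dtype information, so the reader must be told both. A self-contained sketch with a hypothetical tiny raster shows why:

```python
import os
import tempfile

import numpy as np

# Hypothetical tiny raster standing in for a real SLC.
shape = (4, 5)  # (azimuth, range)
data = (np.arange(20) + 1j * np.arange(20)).astype(np.complex64).reshape(shape)

# A binary dump stores only the raw bytes: no header, no shape, no dtype.
path = os.path.join(tempfile.mkdtemp(), "slc_example.raw")
data.tofile(path)

# Reading the same bytes back only works because shape and dtype are known.
loaded = np.fromfile(path, dtype=np.complex64).reshape(shape)
print(np.array_equal(loaded, data))  # True
```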
For example, let's consider a stack with three SLCs:

```python
import numpy as np

list_slcs = ['data/slc_1.raw', 'data/slc_2.raw', 'data/slc_3.raw']
shape = (10018, 68656)  # (azimuth, range)
dtype = np.complex64
```
We built a list `list_slcs` with the paths to the SLCs; in this case they are stored in the same directory, called `data`. The shape of each SLC must be provided: 10018 pixels in the azimuth direction and 68656 in the range direction. The data type is `numpy.complex64`.
The coregistered SLC stack can then be read using the `from_binary` function, passing `list_slcs`, `shape`, and optionally `dtype`. The `dtype` argument can be omitted, since it defaults to `numpy.complex64`. The stack is read as an `xarray.Dataset` object, with data variables lazily loaded as Dask arrays:
```
<xarray.Dataset>
Dimensions:    (azimuth: 10018, range: 68656, time: 3)
Coordinates:
  * azimuth    (azimuth) int64 0 1 2 3 4 5 ... 10013 10014 10015 10016 10017
  * range      (range) int64 0 1 2 3 4 5 ... 68651 68652 68653 68654 68655
  * time       (time) int64 0 1 2
Data variables:
    complex    (azimuth, range, time) complex64 dask.array<chunksize=(4000, 4000, 1), meta=np.ndarray>
    amplitude  (azimuth, range, time) float32 dask.array<chunksize=(4000, 4000, 1), meta=np.ndarray>
    phase      (azimuth, range, time) float32 dask.array<chunksize=(4000, 4000, 1), meta=np.ndarray>
```
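A dataset of this shape can also be emulated without the library, which clarifies what the lazy read produces. The sketch below uses hypothetical tiny files and numpy memory maps wrapped in Dask arrays; it is not SARXarray's own implementation:

```python
import os
import tempfile

import dask.array as da
import numpy as np
import xarray as xr

# Hypothetical tiny rasters standing in for real SLCs.
shape = (4, 5)  # (azimuth, range)
dtype = np.complex64
tmpdir = tempfile.mkdtemp()
rng = np.random.default_rng(42)
list_slcs = []
for i in range(3):
    path = os.path.join(tmpdir, f"slc_{i + 1}.raw")
    slc = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)).astype(dtype)
    slc.tofile(path)
    list_slcs.append(path)

# Memory-map each file so nothing is read eagerly, wrap in Dask arrays,
# and stack along a new trailing "time" dimension.
lazy = [
    da.from_array(np.memmap(p, dtype=dtype, mode="r", shape=shape), chunks=(2, 5))
    for p in list_slcs
]
cpx = da.stack(lazy, axis=-1)  # (azimuth, range, time)

# Derive amplitude and phase lazily, as in the dataset shown above.
stack = xr.Dataset(
    {
        "complex": (("azimuth", "range", "time"), cpx),
        "amplitude": (("azimuth", "range", "time"), da.absolute(cpx)),
        "phase": (("azimuth", "range", "time"), da.angle(cpx)),
    }
)
print(stack["complex"].shape)  # (4, 5, 3)
```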
The loading chunk size can also be specified manually:
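The snippet showing manual chunk specification is not reproduced here, but a loaded stack can always be rechunked afterwards with xarray's `Dataset.chunk`. A minimal sketch with a synthetic stand-in dataset and made-up sizes:

```python
import numpy as np
import xarray as xr

# Synthetic stand-in for a loaded stack (hypothetical sizes).
stack = xr.Dataset(
    {"complex": (("azimuth", "range", "time"), np.zeros((100, 80, 3), dtype=np.complex64))}
)

# Rechunk: 50x40 tiles spatially, one acquisition per chunk along "time".
stack = stack.chunk({"azimuth": 50, "range": 40, "time": 1})
print(stack["complex"].data.chunksize)  # (50, 40, 1)
```

Keeping each acquisition in its own chunk along `time` is a common choice, since many SAR processing steps operate on one acquisition at a time.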