
Analyzer

The InSARHub analyzer module provides workflows for InSAR time-series analysis.

  • Import analyzer

    Import the Analyzer class to access all time-series analysis functionality.

    from insarhub import Analyzer
    

  • View Available Analyzers

    List all registered analyzers.

    Analyzer.available()
    

Available Analyzers

Mintpy_SBAS_Base_Analyzer

InSARHub wraps MintPy as one of its analysis backends. The Mintpy_SBAS_Base_Analyzer is implemented on top of a reusable base configuration class and exposes the full smallbaselineApp logic of MintPy. It gives users an experience similar to using MintPy directly, allowing full customization of processing parameters and steps.

Source code in src/insarhub/analyzer/mintpy_base.py
class Mintpy_SBAS_Base_Analyzer(BaseAnalyzer):

    name = 'Mintpy_SBAS_Base_Analyzer'
    default_config = Mintpy_SBAS_Base_Config
    '''
    Base class for Mintpy SBAS analysis. This class provides a template for implementing 
    specific analysis methods using the Mintpy software package.
    '''
    def __init__(self, config: Mintpy_SBAS_Base_Config | None = None):
        super().__init__(config)

        self.workdir = self.config.workdir
        self.tmp_dir = self.workdir.joinpath('tmp')
        self.clip_dir = self.workdir.joinpath('clip')
        self.cfg_path = self.workdir.joinpath('mintpy.cfg')

    def prep_data(self):
        """Write the MintPy config file to workdir."""
        self.config.write_mintpy_config(self.cfg_path)

    def _cds_authorize(self):
        """Prompt for and verify a CDS API token, writing it to ~/.cdsapirc."""
        if self._check_cdsapirc():
            return True
        while True:
            self._cds_token = getpass.getpass("Enter your CDS API token from https://cds.climate.copernicus.eu/profile: ")
            cdsrc_path = Path.home().joinpath(".cdsapirc")
            if cdsrc_path.is_file():
                cdsrc_path.unlink()
            cdsrc_entry = f"url: https://cds.climate.copernicus.eu/api\nkey: {self._cds_token}\n"
            with open(cdsrc_path, 'w') as f:
                f.write(cdsrc_entry)
            print(f"{Fore.GREEN}Credentials saved to {cdsrc_path}.\n")
            try:
                # Verify the token with a small test download, then clean up.
                tmp = Path.home().joinpath(".cdsrc_test")
                tmp.mkdir(exist_ok=True)
                pyaps3.ECMWFdload(['20200601', '20200901'], hr='14', filedir=tmp.as_posix(), model='ERA5', snwe=(30, 40, 120, 140))
                shutil.rmtree(tmp)
                print(f"{Fore.GREEN}Authentication successful.\n")
                return True
            except requests.exceptions.HTTPError as e:
                if e.response.status_code == 401:
                    print(f"{Fore.RED}Authentication failed. Please check your token.")
                    continue
                raise

    def _check_cdsapirc(self):
        """Check whether a .cdsapirc token exists under the home directory."""
        cdsapirc_path = Path.home().joinpath('.cdsapirc')
        if not cdsapirc_path.is_file():
            print(f"{Fore.RED}No .cdsapirc file found in your home directory. Will prompt login.\n")
            return False
        with cdsapirc_path.open() as f:
            if 'key:' in f.read():
                return True
        print(f"{Fore.RED}No API token found under .cdsapirc. Will prompt login.\n")
        return False

    def run(self, steps=None):
        """
        Run the MintPy SBAS time-series analysis workflow.

        This method writes the MintPy configuration file, optionally authorizes
        CDS access for tropospheric correction, and executes the selected
        MintPy processing steps using TimeSeriesAnalysis.

        Args:
            steps (list[str] | None, optional):
                List of MintPy processing steps to execute. If None, the
                default full workflow is executed:
                    [
                        'load_data', 'modify_network', 'reference_point',
                        'invert_network', 'correct_LOD', 'correct_SET',
                        'correct_ionosphere', 'correct_troposphere',
                        'deramp', 'correct_topography', 'residual_RMS',
                        'reference_date', 'velocity', 'geocode',
                        'google_earth', 'hdfeos5'
                    ]

        Raises:
            RuntimeError: If tropospheric delay method requires CDS authorization
                and authorization fails.
            Exception: Propagates exceptions raised during MintPy execution.

        Notes:
            - If `troposphericDelay_method` is set to 'pyaps', CDS
              authorization is performed before running MintPy.
            - The configuration file is written to `self.cfg_path`.
            - Processing is executed inside `self.workdir`.
            - This method wraps MintPy TimeSeriesAnalysis for SBAS workflows.
        """
        if self.config.troposphericDelay_method == 'pyaps':
            self._cds_authorize()

        run_steps = steps or [
            'load_data', 'modify_network', 'reference_point', 'invert_network',
            'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere',
            'deramp', 'correct_topography', 'residual_RMS', 'reference_date',
            'velocity', 'geocode', 'google_earth', 'hdfeos5'
        ]
        print(f'{Style.BRIGHT}{Fore.MAGENTA}Running MintPy Analysis...{Fore.RESET}')
        app = TimeSeriesAnalysis(self.cfg_path.as_posix(), self.workdir.as_posix())
        app.open()
        app.run(steps=run_steps)

    def cleanup(self):
        """
        Remove temporary files and directories generated during processing.

        This method deletes the temporary working directories and any `.zip`
        archives in `self.workdir`. If debug mode is enabled, temporary files
        are preserved and a message is printed instead.

        Behavior:
            - Deletes `self.tmp_dir` and `self.clip_dir` if they exist.
            - Deletes all `.zip` files in `self.workdir`.
            - Prints informative messages for each removal or failure.
            - Respects `self.config.debug`; no files are deleted in debug mode.

        Raises:
            Exception: Propagates any unexpected errors raised during removal.

        Notes:
            - Useful for freeing disk space after large InSAR or MintPy
              processing workflows.
            - Temporary directories should contain only non-essential files
              to avoid accidental data loss.
        """

        if self.config.debug:
            print(f"{Fore.YELLOW}Debug mode is enabled. Keeping temporary files at: {self.workdir}{Fore.RESET}")
            return
        print(f"{Fore.CYAN}Step: Cleaning up temporary directories...{Fore.RESET}")

        for folder in [self.tmp_dir, self.clip_dir]:
            if folder.exists() and folder.is_dir():
                try:
                    shutil.rmtree(folder)
                    print(f"  Removed: {folder.relative_to(self.workdir)}")
                except Exception as e:
                    print(f"{Fore.RED}  Failed to remove {folder}: {e}{Fore.RESET}")

        zips = list(self.workdir.glob('*.zip'))
        if zips:
            print(f"{Fore.CYAN}Step: Removing zip archives...{Fore.RESET}")
            for zf in zips:
                try:
                    zf.unlink()
                    print(f"  Removed: {zf.name}")
                except Exception as e:
                    print(f"{Fore.RED}  Failed to remove {zf.name}: {e}{Fore.RESET}")

        print(f"{Fore.GREEN}Cleanup complete.{Fore.RESET}")

Usage

  • Create Analyzer with Parameters

    Initialize an analyzer instance.

    analyzer = Analyzer.create('Mintpy_SBAS_Base_Analyzer',
                               workdir="/your/work/dir",
                               load_processor="hyp3", ....)

    OR

    params = {"workdir": "/your/work/dir", "load_processor": "hyp3", ....}
    analyzer = Analyzer.create('Mintpy_SBAS_Base_Analyzer', **params)

    OR

    from insarhub.config import Mintpy_SBAS_Base_Config
    cfg = Mintpy_SBAS_Base_Config(workdir="/your/work/dir",
                                  load_processor="hyp3",
                                  ....)
    analyzer = Analyzer.create('Mintpy_SBAS_Base_Analyzer', config=cfg)


    The base configuration class Mintpy_SBAS_Base_Config contains all parameters from the MintPy smallbaselineApp.cfg. For detailed descriptions and usage of each parameter, please refer to the official MintPy config documentation.

    Source code in src/insarhub/config/defaultconfig.py
    @dataclass
    class Mintpy_SBAS_Base_Config:
        '''
        Dataclass containing all configuration options for Mintpy SBAS jobs.
        '''
        name: str = "Mintpy_SBAS_Base_Config"
        workdir: Path | str = field(default_factory=lambda: Path.cwd())
        debug: bool = False 
    
        ## computing resource configuration
        compute_maxMemory: float | int = _env['memory']
        compute_cluster: str = 'local'  # MintPy's slurm parallel processing is buggy, so parallel processing is handled with dask instead. Set to 'none' to turn off parallel processing and save memory.
        compute_numWorker: int = _env['cpu']
        compute_config: str = 'none'
    
        ## Load data
        load_processor: str = 'auto'
        load_autoPath: str = 'auto' 
        load_updateMode: str = 'auto'
        load_compression: str = 'auto'
        ##---------for ISCE only:
        load_metaFile: str = 'auto'
        load_baselineDir: str = 'auto'
        ##---------interferogram stack:
        load_unwFile: str = 'auto'
        load_corFile: str = 'auto'
        load_connCompFile: str = 'auto'
        load_intFile: str = 'auto'
        load_magFile: str = 'auto'
        ##---------ionosphere stack (optional):
        load_ionUnwFile: str = 'auto'
        load_ionCorFile: str = 'auto'
        load_ionConnCompFile: str = 'auto'
        ##---------offset stack (optional):
        load_azOffFile: str = 'auto'
        load_rgOffFile: str = 'auto'
        load_azOffStdFile: str = 'auto'
        load_rgOffStdFile: str = 'auto'
        load_offSnrFile: str = 'auto'
        ##---------geometry:
        load_demFile: str = 'auto'
        load_lookupYFile: str = 'auto'
        load_lookupXFile: str = 'auto'
        load_incAngleFile: str = 'auto'
        load_azAngleFile: str = 'auto'
        load_shadowMaskFile: str = 'auto'
        load_waterMaskFile: str = 'auto'
        load_bperpFile: str = 'auto'
        ##---------subset (optional):
        subset_yx: str = 'auto'
        subset_lalo: str = 'auto'
        ##---------multilook (optional):
        multilook_method: str = 'auto'
        multilook_ystep: str | int = 'auto'
        multilook_xstep: str | int = 'auto'
    
        # 2. Modify Network
        network_tempBaseMax: str | float = 'auto'
        network_perpBaseMax: str | float = 'auto'
        network_connNumMax: str | int = 'auto'
        network_startDate: str = 'auto'
        network_endDate: str = 'auto'
        network_excludeDate: str = 'auto'
        network_excludeDate12: str = 'auto'
        network_excludeIfgIndex: str = 'auto'
        network_referenceFile: str = 'auto'
        ## 2) Data-driven network modification
        network_coherenceBased: str = 'auto'
        network_minCoherence: str | float = 'auto'
        ## b - Effective Coherence Ratio network modification = (threshold + MST) by default
        network_areaRatioBased: str = 'auto'
        network_minAreaRatio: str | float = 'auto'
        ## Additional common parameters for the 2) data-driven network modification
        network_keepMinSpanTree: str = 'auto'
        network_maskFile: str = 'auto'
        network_aoiYX: str = 'auto'
        network_aoiLALO: str = 'auto'
    
        # 3. Reference Point
        reference_yx: str = 'auto'
        reference_lalo: str = 'auto'
        reference_maskFile: str = 'auto'
        reference_coherenceFile: str = 'auto'
        reference_minCoherence: str | float = 'auto'
    
        # 4. Correct Unwrap Error
        unwrapError_method: str = 'auto'
        unwrapError_waterMaskFile: str = 'auto'
        unwrapError_connCompMinArea: str | float = 'auto'
        ## phase_closure options:
        unwrapError_numSample: str | int = 'auto'
        ## bridging options:
        unwrapError_ramp: str = 'auto'
        unwrapError_bridgePtsRadius: str | int = 'auto'
    
        # 5. Invert Network
        networkInversion_weightFunc: str = 'auto'
        networkInversion_waterMaskFile: str = 'auto'
        networkInversion_minNormVelocity: str = 'auto'
        ## mask options for unwrapPhase of each interferogram before inversion (recommend if weightFunct=no):
        networkInversion_maskDataset: str = 'auto'
        networkInversion_maskThreshold: str | float = 'auto'
        networkInversion_minRedundancy: str | float = 'auto'
        ## Temporal coherence is calculated and used to generate the mask as the reliability measure
        networkInversion_minTempCoh: str | float = 'auto'
        networkInversion_minNumPixel: str | int = 'auto'
        networkInversion_shadowMask: str = 'auto'
    
        # 6. Correct SET (Solid Earth Tides)
        solidEarthTides: str = 'auto'
    
        # 7. Correct Ionosphere
        ionosphericDelay_method: str = 'auto'
        ionosphericDelay_excludeDate: str = 'auto'
        ionosphericDelay_excludeDate12: str = 'auto'
    
        # 8. Correct Troposphere
        troposphericDelay_method: str = 'auto'
        ## Notes for pyaps:
        troposphericDelay_weatherModel: str = 'auto'
        troposphericDelay_weatherDir: str = 'auto'
    
        ## Notes for height_correlation:
        troposphericDelay_polyOrder: str | int = 'auto'
        troposphericDelay_looks: str | int = 'auto'
        troposphericDelay_minCorrelation: str | float = 'auto'
        ## Notes for gacos:
        troposphericDelay_gacosDir: str = 'auto'
    
        # 9. Deramp
        deramp: str = 'auto'
        deramp_maskFile: str = 'auto'
    
        # 10. Correct Topography
        topographicResidual: str = 'auto'
        topographicResidual_polyOrder: str = 'auto'
        topographicResidual_phaseVelocity: str = 'auto'
        topographicResidual_stepDate: str = 'auto'
        topographicResidual_excludeDate: str = 'auto'
        topographicResidual_pixelwiseGeometry: str = 'auto'
    
        # 11.1 Residual RMS
        residualRMS_maskFile: str = 'auto'
        residualRMS_deramp: str = 'auto'
        residualRMS_cutoff: str | float = 'auto'
    
        # 11.2 Reference Date
        reference_date: str = 'auto'
    
        # 12. Velocity
        timeFunc_startDate: str = 'auto'
        timeFunc_endDate: str = 'auto'
        timeFunc_excludeDate: str = 'auto'
        ## Fit a suite of time functions
        timeFunc_polynomial: str | int = 'auto'
        timeFunc_periodic: str = 'auto'
        timeFunc_stepDate: str = 'auto'
        timeFunc_exp: str = 'auto'
        timeFunc_log: str = 'auto'
        ## Uncertainty quantification methods:
        timeFunc_uncertaintyQuantification: str = 'auto'
        timeFunc_timeSeriesCovFile: str = 'auto'
        timeFunc_bootstrapCount: str | int = 'auto'
    
        # 13.1 Geocode
        geocode: str = 'auto'
        geocode_SNWE: str = 'auto'
        geocode_laloStep: str = 'auto'
        geocode_interpMethod: str = 'auto'
        geocode_fillValue: str | float = 'auto'
    
        # 13.2 Google Earth
        save_kmz: str = 'auto'
    
        # 13.3 HDFEOS5
        save_hdfEos5: str = 'auto'
        save_hdfEos5_update: str = 'auto'
        save_hdfEos5_subset: str = 'auto'
    
        # 13.4 Plot
        plot: str = 'auto'
        plot_dpi: str | int = 'auto'
        plot_maxMemory: str | int = 'auto'
    
        def __post_init__(self):
            if isinstance(self.workdir, str):
                self.workdir = Path(self.workdir).expanduser().resolve()
    
        def write_mintpy_config(self, outpath: Union[Path, str]):
            """
            Writes the dataclass to a mintpy .cfg file, excluding operational 
            parameters that MintPy doesn't recognize.
            """
            outpath = Path(outpath).expanduser().resolve()
            exclude_fields = ['name', 'workdir', 'debug']
    
            with open(outpath, 'w') as f:
                f.write("## MintPy Config File Generated via InSARHub\n")
    
                for key, value in asdict(self).items():
                    if key in exclude_fields:
                        continue
    
                    parts = key.split('_')
                    if len(parts) > 1:
                        mintpy_key = f"mintpy.{parts[0]}.{'.'.join(parts[1:])}"
                    else:
                        mintpy_key = f"mintpy.{parts[0]}"
    
                    f.write(f"{mintpy_key:<40} = {value}\n")
    
            return Path(outpath).resolve()
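The field-name mangling in `write_mintpy_config` can be sketched on its own: underscores in a dataclass field name split into MintPy's dotted config keys. The helper name `to_mintpy_key` below is illustrative, not part of InSARHub.

```python
def to_mintpy_key(field_name: str) -> str:
    # Mirror of the split logic in write_mintpy_config:
    # the first underscore-separated part becomes the section,
    # the rest become the dotted option name.
    parts = field_name.split('_')
    if len(parts) > 1:
        return f"mintpy.{parts[0]}.{'.'.join(parts[1:])}"
    return f"mintpy.{parts[0]}"

print(to_mintpy_key('load_unwFile'))         # mintpy.load.unwFile
print(to_mintpy_key('network_tempBaseMax'))  # mintpy.network.tempBaseMax
print(to_mintpy_key('deramp'))               # mintpy.deramp
```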
    
  • Run
    Run the MintPy time-series analysis based on the provided configuration.

    analyzer.run()
    

    Parameters:

    steps (list[str] | None, default None):
        List of MintPy processing steps to execute. If None, the default
        full workflow is executed: ['load_data', 'modify_network',
        'reference_point', 'invert_network', 'correct_LOD', 'correct_SET',
        'correct_ionosphere', 'correct_troposphere', 'deramp',
        'correct_topography', 'residual_RMS', 'reference_date', 'velocity',
        'geocode', 'google_earth', 'hdfeos5']

    Raises:

    RuntimeError:
        If the tropospheric delay method requires CDS authorization and
        authorization fails.
    Exception:
        Propagates exceptions raised during MintPy execution.
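The step-defaulting behavior of `run` can be sketched independently of MintPy: a falsy `steps` argument (None or an empty list) falls back to the full default workflow. `DEFAULT_STEPS` mirrors the list above; `resolve_steps` is an illustrative helper, not part of the API.

```python
# Full default workflow, as listed in the run() docstring.
DEFAULT_STEPS = [
    'load_data', 'modify_network', 'reference_point', 'invert_network',
    'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere',
    'deramp', 'correct_topography', 'residual_RMS', 'reference_date',
    'velocity', 'geocode', 'google_earth', 'hdfeos5'
]

def resolve_steps(steps=None):
    # `steps or DEFAULT_STEPS`: any falsy value selects the default workflow.
    return steps or DEFAULT_STEPS

print(resolve_steps(['load_data', 'modify_network']))  # the explicit subset
print(len(resolve_steps()))                            # 16
```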

  • Clean up

    Remove intermediate processing files generated during time-series processing.

    analyzer.cleanup()
    

    Raises:

    Exception:
        Propagates any unexpected errors raised during removal.
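The zip-removal part of `cleanup` boils down to a glob-and-unlink pattern over the working directory. A self-contained sketch using a temporary directory (file names here are illustrative):

```python
import tempfile
from pathlib import Path

# Stand-in for analyzer.workdir; real runs use the configured workdir.
workdir = Path(tempfile.mkdtemp())
(workdir / 'S1_pair.zip').touch()   # downloaded archive (to be removed)
(workdir / 'velocity.h5').touch()   # analysis product (kept)

# Delete every *.zip in the working directory, as cleanup() does.
for zf in workdir.glob('*.zip'):
    zf.unlink()

print(sorted(p.name for p in workdir.iterdir()))  # ['velocity.h5']
```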

Hyp3_SBAS

Hyp3_SBAS is a specialized analyzer that extends Mintpy_SBAS_Base_Analyzer, preconfigured for time-series processing of HyP3 InSAR products.

Source code in src/insarhub/analyzer/hyp3_sbas.py
class Hyp3_SBAS(Mintpy_SBAS_Base_Analyzer):
    name = 'Hyp3_SBAS'
    default_config = Hyp3_SBAS_Config
    required = ['unw_phase.tif', 'corr.tif',  'dem.tif'] # also need meta files to get the date and other info
    optional = ['lv_theta.tif', 'lv_phi.tif', 'water_mask.tif']
    def __init__(self, config: Hyp3_SBAS_Config | None = None):
        super().__init__(config)

    def prep_data(self):
        """
        Prepare input data for analysis by performing unzipping, collection, clipping, and parameter setup.

        This method orchestrates the preprocessing steps required before running the analysis workflow. 
        It ensures that all input files are available, aligned, and properly configured.

        Steps performed:
            1. `_unzip_hyp3()`: Extracts any compressed Hyp3 output files.
            2. `_collect_files()`: Gathers relevant input files (e.g., DEMs, interferograms).
            3. `_get_common_overlap(files['dem'])`: Computes the spatial overlap extent among input rasters.
            4. `_clip_rasters(files, overlap_extent)`: Clips input rasters to the common overlapping area.
            5. `_set_load_parameters()`: Sets parameters required for loading the preprocessed data into memory.

        Raises:
            FileNotFoundError: If required input files are missing.
            ValueError: If no common overlap region can be determined among rasters.
            Exception: Propagates any unexpected errors during preprocessing.

        Notes:
            - This method must be called before running the analysis workflow.
            - Designed for workflows using Hyp3-derived Sentinel-1 products.
            - Ensures consistent spatial coverage across all input datasets.
        """
        self._unzip_hyp3()
        files = self._collect_files()
        overlap_extent = self._get_common_overlap(files['dem'])
        self._clip_rasters(files, overlap_extent)
        self._set_load_parameters()
        super().prep_data()

    def _unzip_hyp3(self):
        print(f'{Fore.CYAN}Unzipping HyP3 Products...{Fore.RESET}')

        hyp3_results = list(self.workdir.rglob('*.zip'))
        self.tmp_dir.mkdir(parents=True, exist_ok=True)

        with tqdm(hyp3_results, desc="Processing", unit="file") as pbar:
            for zip_file in pbar:
                extract_target = self.tmp_dir / zip_file.stem
                with zipfile.ZipFile(zip_file, 'r') as zf:
                    needs_extraction = True
                    if extract_target.is_dir():
                        files_in_zip = {Path(f).name for f in zf.namelist() if not f.endswith('/')}
                        folder_files = {f.name for f in extract_target.iterdir() if f.is_file()}
                        if files_in_zip.issubset(folder_files):
                            needs_extraction = False
                            pbar.set_description(f"File Exist: {zip_file.stem[:30]}...")
                    if needs_extraction:
                        pbar.set_description(f"Extracting: {zip_file.stem[:30]}...")
                        if extract_target.is_dir():
                            shutil.rmtree(extract_target)

                        zf.extractall(self.tmp_dir)
        print(f'\n{Fore.GREEN}Unzipping complete.{Fore.RESET}')

    def _collect_files(self):
        print(f'{Fore.CYAN}Mapping file paths...{Fore.RESET}')
        all_required = {ext.split('.')[0]: ext for ext in self.required}    
        all_optional = {ext.split('.')[0]: ext for ext in self.optional}
        files = defaultdict(list)
        files['meta'] = [m for m in self.tmp_dir.rglob('*.txt') if 'README' not in m.name]
        for cat_name, ext in {**all_required, **all_optional}.items():
            files[cat_name] = list(self.tmp_dir.rglob(f"*_{ext}"))

        missing_req = [name for name, ext in all_required.items() if not files[name]]
        if missing_req or not files['meta']:
            print(f"\033[K", end="\r") # Clear current line
            msg = []
            if missing_req: msg.append(f"Missing rasters: {missing_req}")
            if not files['meta']: msg.append("Missing metadata (.txt) files")

            error_report = f"{Fore.RED}CRITICAL ERROR: {'. '.join(msg)}.{Fore.RESET}\n" \
                           f"MintPy requires these files to extract dates and baselines."
            raise FileNotFoundError(error_report)
        missing_opt = [name for name in all_optional if not files[name]]

        total_pairs = len(files['unw_phase'])
        status_msg = f"{Fore.GREEN}Found {total_pairs} pairs | Metadata: OK"
        if missing_opt:
            status_msg += f" | {Fore.YELLOW}Missing optional: {missing_opt}"

        print(f"\r\033[K{status_msg}{Fore.RESET}")
        return files

    def _get_common_overlap(self, dem_files):
        ulx_l, uly_l, lrx_l, lry_l = [], [], [], []
        for f in dem_files:
            ds = gdal.Open(f.as_posix())
            gt = ds.GetGeoTransform() # (ulx, xres, xrot, uly, yrot, yres)
            ulx, uly = gt[0], gt[3]
            lrx, lry = gt[0] + gt[1] * ds.RasterXSize, gt[3] + gt[5] * ds.RasterYSize
            ulx_l.append(ulx)
            uly_l.append(uly)
            lrx_l.append(lrx)
            lry_l.append(lry)
            ds = None
        return  (max(ulx_l), min(uly_l), min(lrx_l), max(lry_l))

    def _clip_rasters(self, files, overlap_extent):
        print(f'{Fore.CYAN}Clipping rasters to common overlap...{Fore.RESET}')
        self.clip_dir.mkdir(parents=True, exist_ok=True)
        categories = [k for k in files.keys() if k != 'meta']

        with tqdm(categories, desc="Progress", position=0, dynamic_ncols=True) as pbar_out:
            for key in pbar_out:
                file_list = files[key]
                pbar_out.set_description(f"Group: {key}")

                # Inner progress bar for individual files in this group
                # leave=False ensures the inner bar disappears when the group is done
                with tqdm(file_list, desc=f"  -> Clipping", leave=False, position=1, unit="file", dynamic_ncols=True) as pbar_in:
                    for f in pbar_in:
                        out = self.clip_dir / f"{f.stem}_clip.tif"

                        if out.exists():
                            pbar_in.set_postfix_str(f"Skip: {f.name[:15]}...")
                            # Update postfix instead of printing to avoid creating new lines
                            continue

                        pbar_in.set_postfix_str(f"File: {f.name[:15]}...")

                        try:
                            gdal.Translate(
                                destName=out.as_posix(),
                                srcDS=f.as_posix(),
                                projWin=overlap_extent
                            )
                        except Exception as e:
                            tqdm.write(f"{Fore.RED}Error clipping {f.name}: {e}{Fore.RESET}")

            # Handle metadata separately as it's just a file copy (no progress bar needed)
        if 'meta' in files:
            print(f"\r{Fore.CYAN}Step: Copying metadata files... \033[K", end="", flush=True)
            for f in files['meta']:
                shutil.copy(f, self.clip_dir / f.name)

        print(f'\n{Fore.GREEN}Clipping complete.{Fore.RESET}')

    def _set_load_parameters(self):
        self.config.load_unwFile = (self.clip_dir / '*_unw_phase_clip.tif').as_posix()
        self.config.load_corFile = (self.clip_dir / '*_corr_clip.tif').as_posix()
        self.config.load_demFile = (self.clip_dir / '*_dem_clip.tif').as_posix()
        opt_map = {
            'lv_theta': 'load_incAngleFile',
            'lv_phi': 'load_azAngleFile',
            'water_mask': 'load_waterMaskFile'
        }
        for k, cfg_attr in opt_map.items():
            if list(self.clip_dir.glob(f"*_{k}_clip.tif")):
                setattr(self.config, cfg_attr, (self.clip_dir / f"*_{k}_clip.tif").as_posix())
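The common-overlap rule in `_get_common_overlap` can be illustrated with plain numbers: for north-up rasters, the shared extent is the innermost corner in each direction (max ulx, min uly, min lrx, max lry). The extents below are illustrative values, not real scene footprints.

```python
# Per-raster extents as (ulx, uly, lrx, lry), matching the corner order
# derived from the GDAL geotransform in _get_common_overlap.
extents = [
    (100.0, 40.0, 102.0, 38.0),
    (100.5, 39.5, 102.5, 38.5),
]

ulx = max(e[0] for e in extents)  # easternmost left edge
uly = min(e[1] for e in extents)  # southernmost top edge
lrx = min(e[2] for e in extents)  # westernmost right edge
lry = max(e[3] for e in extents)  # northernmost bottom edge

print((ulx, uly, lrx, lry))  # (100.5, 39.5, 102.0, 38.5)
```

This tuple is what gets passed as `projWin` to `gdal.Translate` when clipping each raster to the shared footprint.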

Usage

  • Create Analyzer with Parameters

    Initialize an analyzer instance.

    analyzer = Analyzer.create('Hyp3_SBAS',
                               workdir="/your/work/dir")

    OR

    params = {"workdir": "/your/work/dir"}
    analyzer = Analyzer.create('Hyp3_SBAS', **params)

    OR

    from insarhub.config import Hyp3_SBAS_Config
    cfg = Hyp3_SBAS_Config(workdir="/your/work/dir")
    analyzer = Analyzer.create('Hyp3_SBAS', config=cfg)
    
  • Prepare Data

    Prepare interferogram data downloaded from the HyP3 server for MintPy.

    analyzer.prep_data()
    

    Raises:

    FileNotFoundError:
        If required input files are missing.
    ValueError:
        If no common overlap region can be determined among the rasters.
    Exception:
        Propagates any unexpected errors during preprocessing.
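The missing-file check behind the FileNotFoundError above can be sketched on its own: each suffix in the analyzer's `required` list becomes a lookup category, and any category with no matching files is reported as missing. The `found` dictionary below uses illustrative file names.

```python
# Required suffixes, taken from the Hyp3_SBAS class attribute in the source.
required = ['unw_phase.tif', 'corr.tif', 'dem.tif']

# Stand-in for the rglob results collected by _collect_files.
found = {
    'unw_phase': ['S1_A_unw_phase.tif'],  # illustrative names
    'corr': ['S1_A_corr.tif'],
    'dem': [],                            # simulate a missing category
}

# Category name = suffix up to the first dot ('unw_phase.tif' -> 'unw_phase').
missing = [name for name in (ext.split('.')[0] for ext in required)
           if not found.get(name)]
print(missing)  # ['dem']
```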

  • Run

    Run the MintPy time-series analysis based on the provided configuration.

    analyzer.run()
    

    Parameters:

    steps (list[str] | None, default None):
        List of MintPy processing steps to execute. If None, the default
        full workflow is executed: ['load_data', 'modify_network',
        'reference_point', 'invert_network', 'correct_LOD', 'correct_SET',
        'correct_ionosphere', 'correct_troposphere', 'deramp',
        'correct_topography', 'residual_RMS', 'reference_date', 'velocity',
        'geocode', 'google_earth', 'hdfeos5']

    Raises:

    RuntimeError:
        If the tropospheric delay method requires CDS authorization and
        authorization fails.
    Exception:
        Propagates exceptions raised during MintPy execution.

    • Clean up

    Remove intermediate processing files generated during time-series processing.

    analyzer.cleanup()
    

    Raises:

    Exception:
        Propagates any unexpected errors raised during removal.

⚠️ Major Redesign

InSARScript v1.1.0 changed its APIs; this documentation is not compatible with v1.0.0.