
Analyzer

The InSARHub analyzer module provides workflows for InSAR time-series analysis.

  • Import analyzer

    Import the Analyzer class to access all time-series analysis functionality

    from insarhub import Analyzer
    

  • View Available Analyzers

    List all registered analyzers

    Analyzer.available()
    

Available Analyzers

Mintpy_SBAS_Base_Analyzer

InSARHub wraps MintPy as one of its analysis backends. The Mintpy_SBAS_Base_Analyzer is implemented on top of a reusable base configuration class that exposes the full smallbaselineApp logic of MintPy. It gives users an experience similar to using MintPy directly, allowing full customization of processing parameters and steps.

Source code in src/insarhub/analyzer/mintpy_base.py
class Mintpy_SBAS_Base_Analyzer(BaseAnalyzer):

    description = "Generic MintPy SBAS analyzer, fully customizable configs."
    compatible_processor = 'all'
    default_config = Mintpy_SBAS_Base_Config
    '''
    Base class for Mintpy SBAS analysis. This class provides a template for implementing 
    specific analysis methods using the Mintpy software package.
    '''
    def __init__(self, config: Mintpy_SBAS_Base_Config | None = None):
        super().__init__(config)

        self.workdir = self.config.workdir
        self.tmp_dir = self.workdir.joinpath('tmp')
        self.clip_dir = self.workdir.joinpath('clip')
        self.cfg_path = self.workdir.joinpath('mintpy.cfg')
        write_workflow_marker(self.workdir, analyzer=type(self).name)

    def prep_data(self):
        """Write the MintPy config file to workdir."""
        self.config.write_mintpy_config(self.cfg_path)

    def _validate_cds_token(self, key: str) -> bool:
        """Validate a CDS API token via a lightweight HTTP request (no download)."""
        try:
            import requests as _requests
            resp = _requests.get(
                "https://cds.climate.copernicus.eu/api/retrieve/v1/jobs",
                headers={"PRIVATE-TOKEN": key},
                params={"limit": 1},
                timeout=5,
            )
            return resp.status_code == 200
        except Exception:
            return False

    def _cds_authorize(self):
        """Ensure valid CDS credentials exist, prompting the user if needed."""
        cdsapirc_path = Path.home() / ".cdsapirc"
        # Try existing .cdsapirc first
        if cdsapirc_path.is_file():
            key = None
            for line in cdsapirc_path.read_text().splitlines():
                if line.strip().startswith("key:"):
                    key = line.split(":", 1)[1].strip()
                    break
            if key and self._validate_cds_token(key):
                return True
            print(f"{Fore.YELLOW}CDS token in .cdsapirc is invalid or expired. Will prompt login.\n")

        # Prompt user for a valid token
        while True:
            self._cds_token = getpass.getpass("Enter your CDS api token at https://cds.climate.copernicus.eu/profile: ")
            if not self._validate_cds_token(self._cds_token):
                print(f"{Fore.RED}Authentication failed. Please check your token and try again.\n")
                continue
            cdsapirc_path.write_text(f"url: https://cds.climate.copernicus.eu/api\nkey: {self._cds_token}\n")
            print(f"{Fore.GREEN}Credentials saved to {cdsapirc_path}.\n")
            return True

    def run(self, steps=None):
        """
        Run the MintPy SBAS time-series analysis workflow.

        This method writes the MintPy configuration file, optionally authorizes
        CDS access for tropospheric correction, and executes the selected
        MintPy processing steps using TimeSeriesAnalysis.

        Args:
            steps (list[str] | None, optional):
                List of MintPy processing steps to execute. If None, the
                default full workflow is executed:
                    [
                        'load_data', 'modify_network', 'reference_point',
                        'invert_network', 'correct_LOD', 'correct_SET',
                        'correct_ionosphere', 'correct_troposphere',
                        'deramp', 'correct_topography', 'residual_RMS',
                        'reference_date', 'velocity', 'geocode',
                        'google_earth', 'hdfeos5'
                    ]

        Raises:
            RuntimeError: If tropospheric delay method requires CDS authorization
                and authorization fails.
            Exception: Propagates exceptions raised during MintPy execution.

        Notes:
            - If `troposphericDelay_method` is set to 'pyaps', CDS
            authorization is performed before running MintPy.
            - The configuration file is written to `self.cfg_path`.
            - Processing is executed inside `self.workdir`.
            - This method wraps MintPy TimeSeriesAnalysis for SBAS workflows.
        """
        if self.config.troposphericDelay_method == 'pyaps':
            self._cds_authorize()

        run_steps = steps or [
            'load_data', 'modify_network', 'reference_point', 'invert_network',
            'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere',
            'deramp', 'correct_topography', 'residual_RMS', 'reference_date',
            'velocity', 'geocode', 'google_earth', 'hdfeos5'
        ]
        print(f'{Style.BRIGHT}{Fore.MAGENTA}Running MintPy Analysis...{Fore.RESET}')
        app = TimeSeriesAnalysis(self.cfg_path.as_posix(), self.workdir.as_posix())
        app.open()
        app.run(steps=run_steps)

    def cleanup(self):
        """
        Remove temporary files and directories generated during processing.

        This method deletes the temporary working directories and any `.zip`
        archives in `self.workdir`. If debug mode is enabled, temporary files
        are preserved and a message is printed instead.

        Behavior:
            - Deletes `self.tmp_dir` and `self.clip_dir` if they exist.
            - Deletes all `.zip` files in `self.workdir`.
            - Prints informative messages for each removal or failure.
            - Respects `self.config.debug`; no files are deleted in debug mode.

        Raises:
            Exception: Propagates any unexpected errors raised during removal.

        Notes:
            - Useful for freeing disk space after large InSAR or MintPy
            processing workflows.
            - Temporary directories should contain only non-essential files
            to avoid accidental data loss.
        """

        if self.config.debug:
            print(f"{Fore.YELLOW}Debug mode is enabled. Keeping temporary files at: {self.workdir}{Fore.RESET}")
            return
        print(f"{Fore.CYAN}Step: Cleaning up temporary directories...{Fore.RESET}")

        for folder in [self.tmp_dir, self.clip_dir]:
            if folder.exists() and folder.is_dir():
                try:
                    shutil.rmtree(folder)
                    print(f"  Removed: {folder.relative_to(self.workdir)}")
                except Exception as e:
                    print(f"{Fore.RED}  Failed to remove {folder}: {e}{Fore.RESET}")

        zips = list(self.workdir.glob('*.zip'))
        if zips:
            print(f"{Fore.CYAN}Step: Removing zip archives...{Fore.RESET}")
            for zf in zips:
                try:
                    zf.unlink()
                    print(f"  Removed: {zf.name}")
                except Exception as e:
                    print(f"{Fore.RED}  Failed to remove {zf.name}: {e}{Fore.RESET}")

        print(f"{Fore.GREEN}Cleanup complete.{Fore.RESET}")
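The run() method above falls back to the full MintPy workflow when steps is None, so passing an explicit list is how you re-run or resume individual stages. A minimal standalone sketch of that selection logic (pure Python, independent of InSARHub; the function name select_steps is made up for illustration):

```python
# Default full SBAS workflow, as listed in run() above.
DEFAULT_STEPS = [
    'load_data', 'modify_network', 'reference_point', 'invert_network',
    'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere',
    'deramp', 'correct_topography', 'residual_RMS', 'reference_date',
    'velocity', 'geocode', 'google_earth', 'hdfeos5',
]

def select_steps(steps=None):
    """Return the steps to execute: the user's list if given, else the full workflow.

    Mirrors the `steps or [...]` pattern inside run().
    """
    return list(steps) if steps else DEFAULT_STEPS

# e.g. repeat only the tropospheric correction and the stages that depend on it:
subset = select_steps(['correct_troposphere', 'velocity', 'geocode'])
```

Note that run() does not validate step names; an unknown step is passed straight through to MintPy's TimeSeriesAnalysis, which will raise its own error.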

Usage

  • Create Analyzer with Parameters

    Initialize an analyzer instance

    analyzer = Analyzer.create('Mintpy_SBAS_Base_Analyzer',
                               workdir="/your/work/dir",
                               load_processor="hyp3", ....)
    

    OR

    params = {"workdir": "/your/work/dir", "load_processor": "hyp3", ....}
    analyzer = Analyzer.create('Mintpy_SBAS_Base_Analyzer', **params)
    
    OR

    from insarhub.config import Mintpy_SBAS_Base_Config
    cfg = Mintpy_SBAS_Base_Config(workdir="/your/work/dir",
                                  load_processor="hyp3",
                                  ....)
    analyzer = Analyzer.create('Mintpy_SBAS_Base_Analyzer', config=cfg)
    

    The base configuration class Mintpy_SBAS_Base_Config contains all parameters from the MintPy smallbaselineApp.cfg. For detailed descriptions and usage of each parameter, please refer to the official MintPy configuration documentation.
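    The analyzer's prep_data() step serializes this configuration to disk via write_mintpy_config(). MintPy's smallbaselineApp.cfg uses dotted keys such as mintpy.load.processor = hyp3, which line up with the underscore-separated field names in the dataclass. A minimal sketch of that serialization idea, using a hypothetical two-field mini-config (the real mapping inside write_mintpy_config may differ):

    ```python
    from dataclasses import dataclass, asdict

    # Hypothetical mini-config for illustration only; the real
    # Mintpy_SBAS_Base_Config carries the full smallbaselineApp parameter set.
    @dataclass
    class MiniConfig:
        load_processor: str = "hyp3"
        troposphericDelay_method: str = "pyaps"

        def to_mintpy_lines(self):
            # Assumption: a field name maps to a MintPy key by replacing the
            # first underscore with a dot, e.g. load_processor ->
            # mintpy.load.processor (matching MintPy's key naming).
            lines = []
            for name, value in asdict(self).items():
                key = "mintpy." + name.replace("_", ".", 1)
                lines.append(f"{key} = {value}")
            return lines

    cfg = MiniConfig()
    text = "\n".join(cfg.to_mintpy_lines())
    ```

    The resulting text is what a MintPy-style config file looks like; write_mintpy_config(self.cfg_path) in the analyzer writes the equivalent content to mintpy.cfg in the work directory.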

    Source code in src/insarhub/config/defaultconfig.py
    @dataclass
    class Mintpy_SBAS_Base_Config:
        '''
        Dataclass containing all configuration options for Mintpy SBAS jobs.
    
        UI metadata is stored in ``_ui_groups`` / ``_ui_fields`` and consumed
        by the API layer to auto-generate the settings panel.
        '''
    
        # ── UI metadata consumed by the API / settings panel ─────────────────────
        _ui_groups: ClassVar[list] = [
            {"label": "Compute Resources",
             "fields": ["compute_maxMemory", "compute_cluster", "compute_numWorker", "compute_config"]},
            {"label": "Load Data",
             "fields": ["load_processor", "load_autoPath", "load_updateMode", "load_compression",
                        "load_metaFile", "load_baselineDir",
                        "load_unwFile", "load_corFile", "load_connCompFile", "load_intFile", "load_magFile",
                        "load_ionUnwFile", "load_ionCorFile", "load_ionConnCompFile",
                        "load_azOffFile", "load_rgOffFile", "load_azOffStdFile", "load_rgOffStdFile", "load_offSnrFile",
                        "load_demFile", "load_lookupYFile", "load_lookupXFile",
                        "load_incAngleFile", "load_azAngleFile", "load_shadowMaskFile", "load_waterMaskFile", "load_bperpFile",
                        "subset_yx", "subset_lalo",
                        "multilook_method", "multilook_ystep", "multilook_xstep"]},
            {"label": "Modify Network",
             "fields": ["network_tempBaseMax", "network_perpBaseMax", "network_connNumMax",
                        "network_startDate", "network_endDate", "network_excludeDate", "network_excludeDate12",
                        "network_excludeIfgIndex", "network_referenceFile",
                        "network_coherenceBased", "network_minCoherence",
                        "network_areaRatioBased", "network_minAreaRatio",
                        "network_keepMinSpanTree", "network_maskFile", "network_aoiYX", "network_aoiLALO"]},
            {"label": "Reference Point",
             "fields": ["reference_yx", "reference_lalo", "reference_maskFile",
                        "reference_coherenceFile", "reference_minCoherence"]},
            {"label": "Unwrap Error Correction",
             "fields": ["unwrapError_method", "unwrapError_waterMaskFile", "unwrapError_connCompMinArea",
                        "unwrapError_numSample", "unwrapError_ramp", "unwrapError_bridgePtsRadius"]},
            {"label": "Network Inversion",
             "fields": ["networkInversion_weightFunc", "networkInversion_waterMaskFile",
                        "networkInversion_minNormVelocity", "networkInversion_maskDataset",
                        "networkInversion_maskThreshold", "networkInversion_minRedundancy",
                        "networkInversion_minTempCoh", "networkInversion_minNumPixel", "networkInversion_shadowMask"]},
            {"label": "Solid Earth Tides",
             "fields": ["solidEarthTides"]},
            {"label": "Ionosphere Correction",
             "fields": ["ionosphericDelay_method", "ionosphericDelay_excludeDate", "ionosphericDelay_excludeDate12"]},
            {"label": "Troposphere Correction",
             "fields": ["troposphericDelay_method", "troposphericDelay_weatherModel", "troposphericDelay_weatherDir",
                        "troposphericDelay_polyOrder", "troposphericDelay_looks", "troposphericDelay_minCorrelation",
                        "troposphericDelay_gacosDir"]},
            {"label": "Deramp",
             "fields": ["deramp", "deramp_maskFile"]},
            {"label": "Topography Correction",
             "fields": ["topographicResidual", "topographicResidual_polyOrder", "topographicResidual_phaseVelocity",
                        "topographicResidual_stepDate", "topographicResidual_excludeDate",
                        "topographicResidual_pixelwiseGeometry"]},
            {"label": "Residual RMS",
             "fields": ["residualRMS_maskFile", "residualRMS_deramp", "residualRMS_cutoff"]},
            {"label": "Reference Date",
             "fields": ["reference_date"]},
            {"label": "Velocity",
             "fields": ["timeFunc_startDate", "timeFunc_endDate", "timeFunc_excludeDate",
                        "timeFunc_polynomial", "timeFunc_periodic", "timeFunc_stepDate",
                        "timeFunc_exp", "timeFunc_log",
                        "timeFunc_uncertaintyQuantification", "timeFunc_timeSeriesCovFile",
                        "timeFunc_bootstrapCount"]},
            {"label": "Geocode",
             "fields": ["geocode", "geocode_SNWE", "geocode_laloStep", "geocode_interpMethod", "geocode_fillValue"]},
            {"label": "Google earth",
             "fields": ["save_kmz"]},
            {"label": "Hdfeos5",
             "fields": ["save_hdfEos5", "save_hdfEos5_update", "save_hdfEos5_subset"]},
            {"label": "Plot",
             "fields": ["plot", "plot_dpi", "plot_maxMemory"]},
        ]
        _ui_fields: ClassVar[dict] = {
            # Compute Resources
            "compute_maxMemory":   {"type": "number", "min": 1, "max": 512, "step": 1,
                                    "default": _env['memory'],
                                    "hint": "Maximum memory size in GB for each dask worker"},
            "compute_cluster":     {"type": "select",
                                    "options": ["local", "slurm", "pbs", "lsf", "oar", "sge", "none"],
                                    "hint": "Cluster type for parallel processing (local = dask LocalCluster)"},
            "compute_numWorker":   {"type": "number", "min": 1, "max": 64, "step": 1,
                                    "default": _env['cpu'],
                                    "hint": "Number of workers for parallel processing"},
            "compute_config":      {"type": "text",
                                    "hint": "Configuration file for dask distributed cluster"},
            # Load Data
            "load_processor":      {"type": "select",
                                    "options": ["auto", "isce", "aria", "hyp3", "gmtsar", "snap", "gamma", "roipac"],
                                    "hint": "SAR processor of the input dataset"},
            "load_autoPath":       {"type": "text",
                                    "hint": "Auto-detect input file paths based on processor type (auto)"},
            "load_updateMode":     {"type": "select", "options": ["auto", "yes", "no"],
                                    "hint": "Skip re-loading if file already exists with same dataset and metadata"},
            "load_compression":    {"type": "select", "options": ["auto", "lzf", "gzip", "no"],
                                    "hint": "Data compression for HDF5 files"},
            "load_metaFile":       {"type": "text",
                                    "hint": "Metadata file path (ISCE only), e.g. reference/IW1.xml"},
            "load_baselineDir":    {"type": "text",
                                    "hint": "Baseline directory (ISCE only), e.g. baselines"},
            "load_unwFile":        {"type": "text",
                                    "hint": "Unwrapped interferogram file(s), e.g. ./../pairs/*/filt*.unw"},
            "load_corFile":        {"type": "text",
                                    "hint": "Coherence file(s), e.g. ./../pairs/*/filt*.cor"},
            "load_connCompFile":   {"type": "text",
                                    "hint": "Connected components file(s), e.g. ./../pairs/*/filt*.unw.conncomp"},
            "load_intFile":        {"type": "text",
                                    "hint": "Wrapped interferogram file(s), e.g. ./../pairs/*/filt*.int"},
            "load_magFile":        {"type": "text",
                                    "hint": "Interferogram magnitude file(s), e.g. ./../pairs/*/filt*.int"},
            "load_ionUnwFile":     {"type": "text", "hint": "Unwrapped ionospheric phase file(s)"},
            "load_ionCorFile":     {"type": "text", "hint": "Ionospheric coherence file(s)"},
            "load_ionConnCompFile":{"type": "text", "hint": "Ionospheric connected component file(s)"},
            "load_azOffFile":      {"type": "text", "hint": "Azimuth offset file(s)"},
            "load_rgOffFile":      {"type": "text", "hint": "Range offset file(s)"},
            "load_azOffStdFile":   {"type": "text", "hint": "Azimuth offset standard deviation file(s)"},
            "load_rgOffStdFile":   {"type": "text", "hint": "Range offset standard deviation file(s)"},
            "load_offSnrFile":     {"type": "text", "hint": "Offset SNR file(s)"},
            "load_demFile":        {"type": "text",
                                    "hint": "DEM file in radar/geo coordinates, e.g. ./inputs/geometryRadar.h5"},
            "load_lookupYFile":    {"type": "text",
                                    "hint": "Lookup table lat/y file, e.g. ./inputs/geometryGeo.h5"},
            "load_lookupXFile":    {"type": "text", "hint": "Lookup table lon/x file"},
            "load_incAngleFile":   {"type": "text", "hint": "Incidence angle file"},
            "load_azAngleFile":    {"type": "text", "hint": "Azimuth angle file"},
            "load_shadowMaskFile": {"type": "text", "hint": "Shadow/layover mask file"},
            "load_waterMaskFile":  {"type": "text", "hint": "Water mask file"},
            "load_bperpFile":      {"type": "text", "hint": "Perpendicular baseline file"},
            "subset_yx":           {"type": "text", "hint": "Subset in row/column, e.g. 1200:2000,0:2000"},
            "subset_lalo":         {"type": "text", "hint": "Subset in lat/lon, e.g. 37.5:38.5,-118.5:-117.5"},
            "multilook_method":    {"type": "select", "options": ["auto", "mean", "nearest", "no"],
                                    "hint": "Multilook method: mean, nearest, or no for skip"},
            "multilook_ystep":     {"type": "auto_number", "hint": "Multilook factor in y/azimuth direction"},
            "multilook_xstep":     {"type": "auto_number", "hint": "Multilook factor in x/range direction"},
            # Modify Network
            "network_tempBaseMax":     {"type": "auto_number", "hint": "Maximum temporal baseline in days"},
            "network_perpBaseMax":     {"type": "auto_number", "hint": "Maximum perpendicular baseline in meters"},
            "network_connNumMax":      {"type": "auto_number", "hint": "Maximum number of nearest-neighbor connections"},
            "network_startDate":       {"type": "text", "hint": "Start date in YYYYMMDD format"},
            "network_endDate":         {"type": "text", "hint": "End date in YYYYMMDD format"},
            "network_excludeDate":     {"type": "text", "hint": "Date(s) to exclude in YYYYMMDD, separated by space"},
            "network_excludeDate12":   {"type": "text",
                                        "hint": "Interferogram date pairs to exclude, e.g. 20150115_20150127"},
            "network_excludeIfgIndex": {"type": "text",
                                        "hint": "Index(es) of interferograms to exclude, e.g. 2 8 230"},
            "network_referenceFile":   {"type": "text",
                                        "hint": "Reference network file (pairs in date12_list.txt format)"},
            "network_coherenceBased":  {"type": "select", "options": ["auto", "yes", "no"],
                                        "hint": "Enable coherence-based network modification"},
            "network_minCoherence":    {"type": "number", "min": 0, "max": 1, "step": 0.05,
                                        "hint": "Minimum coherence threshold for coherence-based modification"},
            "network_areaRatioBased":  {"type": "select", "options": ["auto", "yes", "no"],
                                        "hint": "Enable area-ratio-based network modification (ECR method)"},
            "network_minAreaRatio":    {"type": "auto_number",
                                        "hint": "Minimum area ratio for area-ratio-based modification"},
            "network_keepMinSpanTree": {"type": "select", "options": ["auto", "yes", "no"],
                                        "hint": "Keep the minimum spanning tree of the network"},
            "network_maskFile":        {"type": "text",
                                        "hint": "Mask file for coherence-based network modification"},
            "network_aoiYX":           {"type": "text",
                                        "hint": "AOI in row/column for coherence calculation, e.g. 100:200,300:400"},
            "network_aoiLALO":         {"type": "text",
                                        "hint": "AOI in lat/lon for coherence calculation, e.g. 37.5:38.0,-118.0:-117.5"},
            # Reference Point
            "reference_yx":            {"type": "text", "hint": "Reference point in row/column, e.g. 257 151"},
            "reference_lalo":          {"type": "text", "hint": "Reference point in lat/lon, e.g. 37.65 -118.45"},
            "reference_maskFile":      {"type": "text", "hint": "Mask file for reference point selection"},
            "reference_coherenceFile": {"type": "text", "hint": "Coherence file for reference point selection"},
            "reference_minCoherence":  {"type": "auto_number",
                                        "hint": "Minimum coherence for reference point selection"},
            # Unwrap Error
            "unwrapError_method":          {"type": "select",
                                            "options": ["auto", "bridging", "phase_closure",
                                                        "bridging+phase_closure", "no"],
                                            "hint": "Phase unwrapping error correction method"},
            "unwrapError_waterMaskFile":   {"type": "text", "hint": "Water mask file for bridging method"},
            "unwrapError_connCompMinArea": {"type": "auto_number",
                                            "hint": "Minimum area in pixels for a connected component"},
            "unwrapError_numSample":       {"type": "auto_number",
                                            "hint": "Number of randomly sampled triplets for phase_closure method"},
            "unwrapError_ramp":            {"type": "select", "options": ["auto", "linear", "quadratic", "no"],
                                            "hint": "Remove ramp before bridging"},
            "unwrapError_bridgePtsRadius": {"type": "auto_number",
                                            "hint": "Radius in pixels to search for bridge points"},
            # Network Inversion
            "networkInversion_weightFunc":      {"type": "select", "options": ["auto", "var", "fim", "no"],
                                                 "hint": "var = spatial variance, fim = Fisher info matrix, no = uniform"},
            "networkInversion_waterMaskFile":   {"type": "text", "hint": "Water mask file applied before inversion"},
            "networkInversion_minNormVelocity": {"type": "select", "options": ["auto", "yes", "no"],
                                                 "hint": "Minimize L2-norm of velocity (vs. timeseries) in SBAS inversion"},
            "networkInversion_maskDataset":     {"type": "text",
                                                 "hint": "Dataset for masking, e.g. coherence or connectComponent"},
            "networkInversion_maskThreshold":   {"type": "number", "min": 0, "max": 1, "step": 0.05,
                                                 "hint": "Threshold for maskDataset to mask unwrapped phase"},
            "networkInversion_minRedundancy":   {"type": "auto_number",
                                                 "hint": "Minimum redundancy of interferograms per pixel"},
            "networkInversion_minTempCoh":      {"type": "auto_number",
                                                 "hint": "Minimum temporal coherence for pixel masking"},
            "networkInversion_minNumPixel":     {"type": "auto_number",
                                                 "hint": "Minimum number of coherent pixels to proceed"},
            "networkInversion_shadowMask":      {"type": "select", "options": ["auto", "yes", "no"],
                                                 "hint": "Use shadow mask from geometry"},
            # Solid Earth Tides
            "solidEarthTides":  {"type": "select", "options": ["auto", "yes", "no"],
                                 "hint": "Correct for solid earth tides using pysolid"},
            # Ionosphere
            "ionosphericDelay_method":       {"type": "select", "options": ["auto", "split_spectrum", "no"],
                                              "hint": "Ionospheric delay correction method"},
            "ionosphericDelay_excludeDate":  {"type": "text",
                                              "hint": "Dates to exclude from ionospheric correction, e.g. 20180202 20180414"},
            "ionosphericDelay_excludeDate12":{"type": "text",
                                              "hint": "Interferogram date pairs to exclude from ionospheric correction"},
            # Troposphere
            "troposphericDelay_method":         {"type": "select",
                                                 "options": ["auto", "pyaps", "gacos", "height_correlation", "no"],
                                                 "hint": "Tropospheric delay correction method"},
            "troposphericDelay_weatherModel":   {"type": "select",
                                                 "options": ["auto", "ERA5", "ERA5T", "MERRA", "NARR"],
                                                 "hint": "Weather model for pyaps (ERA5 recommended)"},
            "troposphericDelay_weatherDir":     {"type": "text",
                                                 "hint": "Directory of downloaded weather data files for pyaps"},
            "troposphericDelay_polyOrder":      {"type": "auto_number",
                                                 "hint": "Polynomial order for height-correlation method"},
            "troposphericDelay_looks":          {"type": "auto_number",
                                                 "hint": "Extra multilook factor for height-correlation estimation"},
            "troposphericDelay_minCorrelation": {"type": "auto_number",
                                                 "hint": "Minimum correlation between height and phase"},
            "troposphericDelay_gacosDir":       {"type": "text", "hint": "Directory of GACOS delay files"},
            # Deramp
            "deramp":          {"type": "select", "options": ["auto", "linear", "quadratic", "no"],
                                "hint": "Remove phase ramp in x/y direction"},
            "deramp_maskFile": {"type": "text", "hint": "Mask file for ramp estimation"},
            # Topography
            "topographicResidual":                 {"type": "select", "options": ["auto", "yes", "no"],
                                                    "hint": "Correct topographic residuals (DEM error)"},
            "topographicResidual_polyOrder":       {"type": "auto_number",
                                                    "hint": "Polynomial order for DEM error estimation"},
            "topographicResidual_phaseVelocity":   {"type": "select", "options": ["auto", "yes", "no"],
                                                    "hint": "Minimize phase velocity (not phase) in DEM error inversion"},
            "topographicResidual_stepDate":        {"type": "text",
                                                    "hint": "Step function date(s) for co-seismic jumps, e.g. 20140911"},
            "topographicResidual_excludeDate":     {"type": "text",
                                                    "hint": "Dates to exclude in DEM error inversion"},
            "topographicResidual_pixelwiseGeometry":{"type": "select", "options": ["auto", "yes", "no"],
                                                     "hint": "Use pixel-wise geometry in DEM error estimation"},
            # Residual RMS
            "residualRMS_maskFile": {"type": "text", "hint": "Mask file for residual phase quality assessment"},
            "residualRMS_deramp":   {"type": "select", "options": ["auto", "linear", "quadratic", "no"],
                                     "hint": "Remove ramp before RMS calculation"},
            "residualRMS_cutoff":   {"type": "auto_number",
                                     "hint": "Cutoff value in RMS threshold for outlier date detection"},
            # Reference Date
            "reference_date": {"type": "text",
                               "hint": "Reference date in YYYYMMDD; 'auto' = first date with full coherence"},
            # Velocity
            "timeFunc_startDate":                {"type": "text", "hint": "Start date of the time function fit"},
            "timeFunc_endDate":                  {"type": "text", "hint": "End date of the time function fit"},
            "timeFunc_excludeDate":              {"type": "text",
                                                  "hint": "Date(s) to exclude from time function fitting"},
            "timeFunc_polynomial":               {"type": "auto_number",
                                                  "hint": "Polynomial order: 1 = linear velocity, 2 = acceleration"},
            "timeFunc_periodic":                 {"type": "text",
                                                  "hint": "Periodic periods in years, e.g. 1.0 0.5 for annual+semi-annual"},
            "timeFunc_stepDate":                 {"type": "text",
                                                  "hint": "Step function date(s), e.g. 20161231 for co-seismic jump"},
            "timeFunc_exp":                      {"type": "text",
                                                  "hint": "Exponential decay: onset_date char_time, e.g. 20181026 60"},
            "timeFunc_log":                      {"type": "text",
                                                  "hint": "Logarithmic relaxation: onset_date char_time, e.g. 20181026 60"},
            "timeFunc_uncertaintyQuantification":{"type": "select", "options": ["auto", "bootstrap", "residue"],
                                                  "hint": "Method for velocity uncertainty quantification"},
            "timeFunc_timeSeriesCovFile":        {"type": "text",
                                                  "hint": "Time-series covariance file for uncertainty propagation"},
            "timeFunc_bootstrapCount":           {"type": "auto_number",
                                                  "hint": "Number of bootstrap iterations"},
            # Geocode
            "geocode":              {"type": "select", "options": ["auto", "yes", "no"],
                                     "hint": "Geocode datasets in radar coordinates to geo coordinates"},
            "geocode_SNWE":         {"type": "text",
                                     "hint": "Bounding box: south north west east, e.g. 31 40 -115 -100"},
            "geocode_laloStep":     {"type": "text",
                                     "hint": "Output pixel size in lat/lon, e.g. -0.000833 0.000833 (≈90 m)"},
            "geocode_interpMethod": {"type": "select", "options": ["auto", "nearest", "linear"],
                                     "hint": "Interpolation method for geocoding"},
            "geocode_fillValue":    {"type": "text",
                                     "hint": "Fill value for pixels outside coverage, e.g. nan or 0"},
            # Google Earth
            "save_kmz":            {"type": "select", "options": ["auto", "yes", "no"],
                                    "hint": "Save geocoded velocity to Google Earth KMZ file"},
            # HDF-EOS5
            "save_hdfEos5":        {"type": "select", "options": ["auto", "yes", "no"],
                                    "hint": "Save time-series to HDF-EOS5 format"},
            "save_hdfEos5_update": {"type": "select", "options": ["auto", "yes", "no"],
                                    "hint": "Update HDF-EOS5 file if already exists"},
            "save_hdfEos5_subset": {"type": "select", "options": ["auto", "yes", "no"],
                                    "hint": "Save subset of HDF-EOS5 file"},
            # Plot
            "plot":                {"type": "select", "options": ["auto", "yes", "no"],
                                    "hint": "Plot results during processing"},
            "plot_dpi":            {"type": "auto_number", "hint": "Figure DPI for saved plots"},
            "plot_maxMemory":      {"type": "auto_number",
                                    "hint": "Maximum memory in GB for plot_smallbaseline.py"},
        }
        # ─────────────────────────────────────────────────────────────────────────
    
        name: str = "Mintpy_SBAS_Base_Config"
        workdir: Path | str = field(default_factory=lambda: Path.cwd())
        debug: bool = False 
    
        ## computing resource configuration
        compute_maxMemory: float | int = _env['memory']
        compute_cluster: str = 'local'  # MintPy's SLURM parallelism is buggy; parallel processing is handled with dask instead. Set to 'none' to disable parallel processing and save memory.
        compute_numWorker: int = _env['cpu']
        compute_config: str = 'none'
    
        ## Load data
        load_processor: str = 'auto'
        load_autoPath: str = 'auto' 
        load_updateMode: str = 'auto'
        load_compression: str = 'auto'
        ##---------for ISCE only:
        load_metaFile: str = 'auto'
        load_baselineDir: str = 'auto'
        ##---------interferogram stack:
        load_unwFile: str = 'auto'
        load_corFile: str = 'auto'
        load_connCompFile: str = 'auto'
        load_intFile: str = 'auto'
        load_magFile: str = 'auto'
        ##---------ionosphere stack (optional):
        load_ionUnwFile: str = 'auto'
        load_ionCorFile: str = 'auto'
        load_ionConnCompFile: str = 'auto'
        ##---------offset stack (optional):
        load_azOffFile: str = 'auto'
        load_rgOffFile: str = 'auto'
        load_azOffStdFile: str = 'auto'
        load_rgOffStdFile: str = 'auto'
        load_offSnrFile: str = 'auto'
        ##---------geometry:
        load_demFile: str = 'auto'
        load_lookupYFile: str = 'auto'
        load_lookupXFile: str = 'auto'
        load_incAngleFile: str = 'auto'
        load_azAngleFile: str = 'auto'
        load_shadowMaskFile: str = 'auto'
        load_waterMaskFile: str = 'auto'
        load_bperpFile: str = 'auto'
        ##---------subset (optional):
        subset_yx: str = 'auto'
        subset_lalo: str = 'auto'
        ##---------multilook (optional):
        multilook_method: str = 'auto'
        multilook_ystep: str | int = 'auto'
        multilook_xstep: str | int = 'auto'
    
        # 2. Modify Network
        network_tempBaseMax: str | float = 'auto'
        network_perpBaseMax: str | float = 'auto'
        network_connNumMax: str | int = 'auto'
        network_startDate: str = 'auto'
        network_endDate: str = 'auto'
        network_excludeDate: str = 'auto'
        network_excludeDate12: str = 'auto'
        network_excludeIfgIndex: str = 'auto'
        network_referenceFile: str = 'auto'
        ## 2) Data-driven network modification
        network_coherenceBased: str = 'auto'
        network_minCoherence: str | float = 'auto'
        ## b - Effective Coherence Ratio network modification = (threshold + MST) by default
        network_areaRatioBased: str = 'auto'
        network_minAreaRatio: str | float = 'auto'
        ## Additional common parameters for the 2) data-driven network modification
        network_keepMinSpanTree: str = 'auto'
        network_maskFile: str = 'auto'
        network_aoiYX: str = 'auto'
        network_aoiLALO: str = 'auto'
    
        # 3. Reference Point
        reference_yx: str = 'auto'
        reference_lalo: str = 'auto'
        reference_maskFile: str = 'auto'
        reference_coherenceFile: str = 'auto'
        reference_minCoherence: str | float = 'auto'
    
        # 4. Correct Unwrap Error
        unwrapError_method: str = 'auto'
        unwrapError_waterMaskFile: str = 'auto'
        unwrapError_connCompMinArea: str | float = 'auto'
        ## phase_closure options:
        unwrapError_numSample: str | int = 'auto'
        ## bridging options:
        unwrapError_ramp: str = 'auto'
        unwrapError_bridgePtsRadius: str | int = 'auto'
    
        # 5. Invert Network
        networkInversion_weightFunc: str = 'auto'
        networkInversion_waterMaskFile: str = 'auto'
        networkInversion_minNormVelocity: str = 'auto'
        ## mask options for unwrapPhase of each interferogram before inversion (recommend if weightFunct=no):
        networkInversion_maskDataset: str = 'auto'
        networkInversion_maskThreshold: str | float = 'auto'
        networkInversion_minRedundancy: str | float = 'auto'
        ## Temporal coherence is calculated and used to generate the mask as the reliability measure
        networkInversion_minTempCoh: str | float = 'auto'
        networkInversion_minNumPixel: str | int = 'auto'
        networkInversion_shadowMask: str = 'auto'
    
        # 6. Correct SET (Solid Earth Tides)
        solidEarthTides: str = 'auto'
    
        # 7. Correct Ionosphere
        ionosphericDelay_method: str = 'auto'
        ionosphericDelay_excludeDate: str = 'auto'
        ionosphericDelay_excludeDate12: str = 'auto'
    
        # 8. Correct Troposphere
        troposphericDelay_method: str = 'auto'
        ## Notes for pyaps:
        troposphericDelay_weatherModel: str = 'auto'
        troposphericDelay_weatherDir: str = 'auto'
    
        ## Notes for height_correlation:
        troposphericDelay_polyOrder: str | int = 'auto'
        troposphericDelay_looks: str | int = 'auto'
        troposphericDelay_minCorrelation: str | float = 'auto'
        ## Notes for gacos:
        troposphericDelay_gacosDir: str = 'auto'
    
        # 9. Deramp
        deramp: str = 'auto'
        deramp_maskFile: str = 'auto'
    
        # 10. Correct Topography
        topographicResidual: str = 'auto'
        topographicResidual_polyOrder: str = 'auto'
        topographicResidual_phaseVelocity: str = 'auto'
        topographicResidual_stepDate: str = 'auto'
        topographicResidual_excludeDate: str = 'auto'
        topographicResidual_pixelwiseGeometry: str = 'auto'
    
        # 11.1 Residual RMS
        residualRMS_maskFile: str = 'auto'
        residualRMS_deramp: str = 'auto'
        residualRMS_cutoff: str | float = 'auto'
    
        # 11.2 Reference Date
        reference_date: str = 'auto'
    
        # 12. Velocity
        timeFunc_startDate: str = 'auto'
        timeFunc_endDate: str = 'auto'
        timeFunc_excludeDate: str = 'auto'
        ## Fit a suite of time functions
        timeFunc_polynomial: str | int = 'auto'
        timeFunc_periodic: str = 'auto'
        timeFunc_stepDate: str = 'auto'
        timeFunc_exp: str = 'auto'
        timeFunc_log: str = 'auto'
        ## Uncertainty quantification methods:
        timeFunc_uncertaintyQuantification: str = 'auto'
        timeFunc_timeSeriesCovFile: str = 'auto'
        timeFunc_bootstrapCount: str | int = 'auto'
    
        # 13.1 Geocode
        geocode: str = 'auto'
        geocode_SNWE: str = 'auto'
        geocode_laloStep: str = 'auto'
        geocode_interpMethod: str = 'auto'
        geocode_fillValue: str | float = 'auto'
    
        # 13.2 Google Earth
        save_kmz: str = 'auto'
    
        # 13.3 HDFEOS5
        save_hdfEos5: str = 'auto'
        save_hdfEos5_update: str = 'auto'
        save_hdfEos5_subset: str = 'auto'
    
        # 13.4 Plot
        plot: str = 'auto'
        plot_dpi: str | int = 'auto'
        plot_maxMemory: str | int = 'auto'
    
        def __post_init__(self):
            if isinstance(self.workdir, str):
                self.workdir = Path(self.workdir).expanduser().resolve()
    
        def write_mintpy_config(self, outpath: Union[Path, str]):
            """
            Writes the dataclass to a mintpy .cfg file, excluding operational 
            parameters that MintPy doesn't recognize.
            """
            outpath = Path(outpath).expanduser().resolve()
            exclude_fields = ['name', 'workdir', 'debug']
    
            with open(outpath, 'w') as f:
                f.write("## MintPy Config File Generated via InSARHub\n")
    
                for key, value in asdict(self).items():
                    if key in exclude_fields:
                        continue
    
                    parts = key.split('_')
                    if len(parts) > 1:
                        mintpy_key = f"mintpy.{parts[0]}.{'.'.join(parts[1:])}"
                    else:
                        mintpy_key = f"mintpy.{parts[0]}"
    
                    f.write(f"{mintpy_key:<40} = {value}\n")
    
            return Path(outpath).resolve()
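The field-to-key conversion inside `write_mintpy_config` can be sketched standalone: an underscore-separated dataclass field name becomes a dotted `mintpy.*` option key.

```python
# Standalone sketch of the mapping used by write_mintpy_config:
# the first underscore-separated part becomes the section, the rest the option.
def to_mintpy_key(field_name: str) -> str:
    parts = field_name.split('_')
    if len(parts) > 1:
        return f"mintpy.{parts[0]}.{'.'.join(parts[1:])}"
    return f"mintpy.{parts[0]}"

print(to_mintpy_key("network_tempBaseMax"))  # mintpy.network.tempBaseMax
print(to_mintpy_key("deramp"))               # mintpy.deramp
```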
    
  • Run
    Run the MintPy time-series analysis based on the provided configuration

    analyzer.run()
    

    Parameters:

    steps (list[str] | None, default None):
        List of MintPy processing steps to execute. If None, the default full
        workflow is executed:
        ['load_data', 'modify_network', 'reference_point', 'invert_network',
         'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere',
         'deramp', 'correct_topography', 'residual_RMS', 'reference_date',
         'velocity', 'geocode', 'google_earth', 'hdfeos5']

    Raises:

    RuntimeError: If the tropospheric delay method requires CDS authorization
        and authorization fails.
    Exception: Propagates exceptions raised during MintPy execution.
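The default-versus-custom step handling described above reduces to a simple fallback, sketched here standalone (`DEFAULT_STEPS` and `resolve_steps` are illustrative names; the real `run()` delegates the selected steps to MintPy's `TimeSeriesAnalysis`):

```python
# Standalone sketch of run()'s step selection: a falsy `steps` argument
# falls back to the full default MintPy workflow.
DEFAULT_STEPS = [
    'load_data', 'modify_network', 'reference_point', 'invert_network',
    'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere',
    'deramp', 'correct_topography', 'residual_RMS', 'reference_date',
    'velocity', 'geocode', 'google_earth', 'hdfeos5'
]

def resolve_steps(steps=None):
    return steps or DEFAULT_STEPS

print(resolve_steps(['load_data', 'invert_network', 'velocity']))  # custom subset
print(len(resolve_steps()))  # 16
```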

  • Clean up

    Remove intermediate files generated during time-series processing

    analyzer.cleanup()
    

    Raises:

    Exception: Propagates any unexpected errors raised during removal.

Hyp3_SBAS

Hyp3_SBAS is a specialized analyzer that extends Mintpy_SBAS_Base_Analyzer, preconfigured for time-series analysis of HyP3 InSAR products.

Source code in src/insarhub/analyzer/hyp3_sbas.py
class Hyp3_SBAS(Mintpy_SBAS_Base_Analyzer):
    name = 'Hyp3_SBAS'
    description = "SBAS time-series analysis of HyP3 InSAR outputs using MintPy."
    compatible_processor = "Hyp3_InSAR"
    default_config = Hyp3_SBAS_Config
    required = ['unw_phase.tif', 'corr.tif',  'dem.tif'] # also need meta files to get the date and other info
    optional = ['lv_theta.tif', 'lv_phi.tif', 'water_mask.tif']
    def __init__(self, config: Hyp3_SBAS_Config | None = None):
        super().__init__(config)

    def prep_data(self):
        """
        Prepare input data for analysis by performing unzipping, collection, clipping, and parameter setup.

        This method orchestrates the preprocessing steps required before running the analysis workflow. 
        It ensures that all input files are available, aligned, and properly configured.

        Steps performed:
            1. `_unzip_hyp3()`: Extracts any compressed Hyp3 output files.
            2. `_collect_files()`: Gathers relevant input files (e.g., DEMs, interferograms).
            3. `_get_common_overlap(files['dem'])`: Computes the spatial overlap extent among input rasters.
            4. `_clip_rasters(files, overlap_extent)`: Clips input rasters to the common overlapping area.
            5. `_set_load_parameters()`: Sets parameters required for loading the preprocessed data into memory.

        Raises:
            FileNotFoundError: If required input files are missing.
            ValueError: If no common overlap region can be determined among rasters.
            Exception: Propagates any unexpected errors during preprocessing.

        Notes:
            - This method must be called before running the analysis workflow.
            - Designed for workflows using Hyp3-derived Sentinel-1 products.
            - Ensures consistent spatial coverage across all input datasets.
        """
        self._unzip_hyp3()
        files = self._collect_files()
        overlap_extent = self._get_common_overlap(files['dem'])
        self._clip_rasters(files, overlap_extent)
        self._set_load_parameters()
        super().prep_data()

    def _unzip_hyp3(self):
        print(f'{Fore.CYAN}Unzipping HyP3 Products...{Fore.RESET}')

        hyp3_results = list(self.workdir.rglob('*.zip'))
        self.tmp_dir.mkdir(parents=True, exist_ok=True)

        with tqdm(hyp3_results, desc="Processing", unit="file") as pbar:
            for zip_file in pbar:
                extract_target = self.tmp_dir / zip_file.stem
                with zipfile.ZipFile(zip_file, 'r') as zf:
                    needs_extraction = True
                    if extract_target.is_dir():
                        files_in_zip = {Path(f).name for f in zf.namelist() if not f.endswith('/')}
                        folder_files = {f.name for f in extract_target.iterdir() if f.is_file()}
                        if files_in_zip.issubset(folder_files):
                            needs_extraction = False
                            pbar.set_description(f"File Exist: {zip_file.stem[:30]}...")
                    if needs_extraction:
                        pbar.set_description(f"Extracting: {zip_file.stem[:30]}...")
                        if extract_target.is_dir():
                            shutil.rmtree(extract_target)

                        zf.extractall(self.tmp_dir)
        print(f'\n{Fore.GREEN}Unzipping complete.{Fore.RESET}')

    def _collect_files(self):
        print(f'{Fore.CYAN}Mapping file paths...{Fore.RESET}')
        all_required = {ext.split('.')[0]: ext for ext in self.required}    
        all_optional = {ext.split('.')[0]: ext for ext in self.optional}
        files = defaultdict(list)
        files['meta'] = [m for m in self.tmp_dir.rglob('*.txt') if 'README' not in m.name]
        for cat_name, ext in {**all_required, **all_optional}.items():
            files[cat_name] = list(self.tmp_dir.rglob(f"*_{ext}"))

        missing_req = [name for name, ext in all_required.items() if not files[name]]
        if missing_req or not files['meta']:
            print(f"\033[K", end="\r") # Clear current line
            msg = []
            if missing_req: msg.append(f"Missing rasters: {missing_req}")
            if not files['meta']: msg.append("Missing metadata (.txt) files")

            error_report = f"{Fore.RED}CRITICAL ERROR: {'. '.join(msg)}.{Fore.RESET}\n" \
                           f"MintPy requires these files to extract dates and baselines."
            raise FileNotFoundError(error_report)
        missing_opt = [name for name in all_optional if not files[name]]

        total_pairs = len(files['unw_phase'])
        status_msg = f"{Fore.GREEN}Found {total_pairs} pairs | Metadata: OK"
        if missing_opt:
            status_msg += f" | {Fore.YELLOW}Missing optional: {missing_opt}"

        print(f"\r\033[K{status_msg}{Fore.RESET}")
        return files

    def _get_common_overlap(self, dem_files):
        ulx_l, uly_l, lrx_l, lry_l = [], [], [], []
        for f in dem_files:
            ds = gdal.Open(f.as_posix())
            gt = ds.GetGeoTransform() # (ulx, xres, xrot, uly, yrot, yres)
            ulx, uly = gt[0], gt[3]
            lrx, lry = gt[0] + gt[1] * ds.RasterXSize, gt[3] + gt[5] * ds.RasterYSize
            ulx_l.append(ulx)
            uly_l.append(uly)
            lrx_l.append(lrx)
            lry_l.append(lry)
            ds = None
        return  (max(ulx_l), min(uly_l), min(lrx_l), max(lry_l))

    def _clip_rasters(self, files, overlap_extent):
        print(f'{Fore.CYAN}Clipping rasters to common overlap...{Fore.RESET}')
        self.clip_dir.mkdir(parents=True, exist_ok=True)
        categories = [k for k in files.keys() if k != 'meta']

        with tqdm(categories, desc="Progress", position=0, dynamic_ncols=True) as pbar_out:
            for key in pbar_out:
                file_list = files[key]
                pbar_out.set_description(f"Group: {key}")

                # Inner progress bar for individual files in this group
                # leave=False ensures the inner bar disappears when the group is done
                with tqdm(file_list, desc=f"  -> Clipping", leave=False, position=1, unit="file", dynamic_ncols=True) as pbar_in:
                    for f in pbar_in:
                        out = self.clip_dir / f"{f.stem}_clip.tif"

                        if out.exists():
                            pbar_in.set_postfix_str(f"Skip: {f.name[:15]}...")
                            # Update postfix instead of printing to avoid creating new lines
                            continue

                        pbar_in.set_postfix_str(f"File: {f.name[:15]}...")

                        try:
                            gdal.Translate(
                                destName=out.as_posix(),
                                srcDS=f.as_posix(),
                                projWin=overlap_extent
                            )
                        except Exception as e:
                            tqdm.write(f"{Fore.RED}Error clipping {f.name}: {e}{Fore.RESET}")

            # Handle metadata separately as it's just a file copy (no progress bar needed)
        if 'meta' in files:
            print(f"\r{Fore.CYAN}Step: Copying metadata files... \033[K", end="", flush=True)
            for f in files['meta']:
                shutil.copy(f, self.clip_dir / f.name)

        print(f'\n{Fore.GREEN}Clipping complete.{Fore.RESET}')

    def _set_load_parameters(self):
        self.config.load_unwFile = (self.clip_dir / '*_unw_phase_clip.tif').as_posix()
        self.config.load_corFile = (self.clip_dir / '*_corr_clip.tif').as_posix()
        self.config.load_demFile = (self.clip_dir / '*_dem_clip.tif').as_posix()
        opt_map = {
            'lv_theta': 'load_incAngleFile',
            'lv_phi': 'load_azAngleFile',
            'water_mask': 'load_waterMaskFile'
        }
        for k, cfg_attr in opt_map.items():
            if list(self.clip_dir.glob(f"*_{k}_clip.tif")):
                setattr(self.config, cfg_attr, (self.clip_dir / f"*_{k}_clip.tif").as_posix())
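The skip-if-already-extracted check in `_unzip_hyp3` reduces to a set-subset test: extraction is skipped only when every file named in the zip archive already exists in the target folder. A minimal standalone sketch:

```python
# Standalone sketch of the re-extraction check used by _unzip_hyp3:
# re-extract unless all zip members are already present on disk.
def needs_extraction(names_in_zip, names_on_disk):
    return not set(names_in_zip).issubset(set(names_on_disk))

print(needs_extraction({'a.tif', 'b.txt'}, {'a.tif', 'b.txt', 'extra'}))  # False
print(needs_extraction({'a.tif', 'b.txt'}, {'a.tif'}))                    # True
```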

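The intersection rule in `_get_common_overlap` can also be sketched standalone: for north-up rasters (y decreasing downward), the common extent keeps the largest upper-left x, the smallest upper-left y, the smallest lower-right x, and the largest lower-right y.

```python
# Standalone sketch of _get_common_overlap's extent intersection,
# with extents given as (ulx, uly, lrx, lry) tuples.
def common_overlap(extents):
    ulx = max(e[0] for e in extents)
    uly = min(e[1] for e in extents)
    lrx = min(e[2] for e in extents)
    lry = max(e[3] for e in extents)
    return (ulx, uly, lrx, lry)

print(common_overlap([(0, 10, 10, 0), (2, 9, 12, 1)]))  # (2, 9, 10, 1)
```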
Usage

  • Create Analyzer with Parameters

    Initialize an analyzer instance

    analyzer = Analyzer.create('Hyp3_SBAS',
                               workdir="/your/work/dir")

    OR

    params = {"workdir": "/your/work/dir"}
    analyzer = Analyzer.create('Hyp3_SBAS', **params)

    OR

    from insarhub.config import Mintpy_SBAS_Base_Config
    cfg = Mintpy_SBAS_Base_Config(workdir="/your/work/dir")
    analyzer = Analyzer.create('Hyp3_SBAS', config=cfg)
    
    • Prepare data

      Prepare interferogram data downloaded from the HyP3 server for use with MintPy

      analyzer.prep_data()
    

    Raises:

    FileNotFoundError: If required input files are missing.
    ValueError: If no common overlap region can be determined among rasters.
    Exception: Propagates any unexpected errors during preprocessing.

    Source code in src/insarhub/analyzer/hyp3_sbas.py
    def prep_data(self):
        """
        Prepare input data for analysis by performing unzipping, collection, clipping, and parameter setup.
    
        This method orchestrates the preprocessing steps required before running the analysis workflow. 
        It ensures that all input files are available, aligned, and properly configured.
    
        Steps performed:
            1. `_unzip_hyp3()`: Extracts any compressed Hyp3 output files.
            2. `_collect_files()`: Gathers relevant input files (e.g., DEMs, interferograms).
            3. `_get_common_overlap(files['dem'])`: Computes the spatial overlap extent among input rasters.
            4. `_clip_rasters(files, overlap_extent)`: Clips input rasters to the common overlapping area.
            5. `_set_load_parameters()`: Sets parameters required for loading the preprocessed data into memory.
    
        Raises:
            FileNotFoundError: If required input files are missing.
            ValueError: If no common overlap region can be determined among rasters.
            Exception: Propagates any unexpected errors during preprocessing.
    
        Notes:
            - This method must be called before running the analysis workflow.
            - Designed for workflows using Hyp3-derived Sentinel-1 products.
            - Ensures consistent spatial coverage across all input datasets.
        """
        self._unzip_hyp3()
        files = self._collect_files()
        overlap_extent = self._get_common_overlap(files['dem'])
        self._clip_rasters(files, overlap_extent)
        self._set_load_parameters()
        super().prep_data()
    
    • Run
      Run the MintPy time-series analysis based on the provided configuration
    analyzer.run()
    

    Parameters:

    steps (list[str] | None, default None):
        List of MintPy processing steps to execute. If None, the default full
        workflow is executed:
        ['load_data', 'modify_network', 'reference_point', 'invert_network',
         'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere',
         'deramp', 'correct_topography', 'residual_RMS', 'reference_date',
         'velocity', 'geocode', 'google_earth', 'hdfeos5']

    Raises:

    RuntimeError: If the tropospheric delay method requires CDS authorization
        and authorization fails.
    Exception: Propagates exceptions raised during MintPy execution.

    Source code in src/insarhub/analyzer/mintpy_base.py
    def run(self, steps=None):
        """
        Run the MintPy SBAS time-series analysis workflow.
    
        This method writes the MintPy configuration file, optionally authorizes
        CDS access for tropospheric correction, and executes the selected
        MintPy processing steps using TimeSeriesAnalysis.
    
        Args:
            steps (list[str] | None, optional):
                List of MintPy processing steps to execute. If None, the
                default full workflow is executed:
                    [
                        'load_data', 'modify_network', 'reference_point',
                        'invert_network', 'correct_LOD', 'correct_SET',
                        'correct_ionosphere', 'correct_troposphere',
                        'deramp', 'correct_topography', 'residual_RMS',
                        'reference_date', 'velocity', 'geocode',
                        'google_earth', 'hdfeos5'
                    ]
    
        Raises:
            RuntimeError: If tropospheric delay method requires CDS authorization
                and authorization fails.
            Exception: Propagates exceptions raised during MintPy execution.
    
        Notes:
            - If `troposphericDelay_method` is set to 'pyaps', CDS
            authorization is performed before running MintPy.
            - The configuration file is written to `self.cfg_path`.
            - Processing is executed inside `self.workdir`.
            - This method wraps MintPy TimeSeriesAnalysis for SBAS workflows.
        """
        if self.config.troposphericDelay_method == 'pyaps':
            self._cds_authorize()
    
        run_steps = steps or [
            'load_data', 'modify_network', 'reference_point', 'invert_network',
            'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere',
            'deramp', 'correct_topography', 'residual_RMS', 'reference_date',
            'velocity', 'geocode', 'google_earth', 'hdfeos5'
        ]
        print(f'{Style.BRIGHT}{Fore.MAGENTA}Running MintPy Analysis...{Fore.RESET}')
        app = TimeSeriesAnalysis(self.cfg_path.as_posix(), self.workdir.as_posix())
        app.open()
        app.run(steps=run_steps)
    
    • Clean up

    Remove intermediate files generated during time-series processing

    analyzer.cleanup()
    

    Raises:

    Exception: Propagates any unexpected errors raised during removal.

⚠️ Major Redesign

InSARScript v1.1.0 introduced API changes; this documentation is not compatible with v1.0.0.