diff --git a/contrib/stack/topsStack/README.md b/contrib/stack/topsStack/README.md
index f0ebfaa..488f805 100644
--- a/contrib/stack/topsStack/README.md
+++ b/contrib/stack/topsStack/README.md
@@ -10,7 +10,7 @@ To use the sentinel stack processor, make sure to add the path of your `contrib/

 The scripts provide support for Sentinel-1 TOPS stack processing. Currently supported workflows include a coregistered stack of SLC, interferograms, offsets, and coherence.

-`stackSentinel.py` generates all configuration and run files required to be executed on a stack of Sentinel-1 TOPS data. When stackSentinel.py is executed for a given workflow (-W option) a **configs** and **run_files** folder is generated. No processing is performed at this stage. Within the run_files folder different run\_#\_description files are contained which are to be executed as shell scripts in the run number order. Each of these run scripts call specific configure files contained in the “configs” folder which call ISCE in a modular fashion. The configure and run files will change depending on the selected workflow. To make run_# files executable, change the file permission accordingly (e.g., `chmod +x run_1_unpack_slc`).
+`stackSentinel.py` generates all configuration and run files required to be executed on a stack of Sentinel-1 TOPS data. When stackSentinel.py is executed for a given workflow (-W option), a **configs** and **run_files** folder is generated. No processing is performed at this stage. The run_files folder contains numbered run\_#\_description files, which are to be executed as shell scripts in run-number order. Each of these run scripts calls specific configure files in the “configs” folder, which invoke ISCE in a modular fashion. The configure and run files change depending on the selected workflow. To make the run_# files executable, change the file permissions accordingly (e.g., `chmod +x run_01_unpack_slc`).

 ```bash
 stackSentinel.py -H #To see workflow examples,
@@ -74,53 +74,53 @@ stackSentinel.py -s ../SLC/ -d ../DEM/demLat_N18_N20_Lon_W100_W097.dem.wgs84 -a

 By running the command above, the configs and run_files folders are created. The user needs to execute each run file in order. The order is specified by the index number in the run file name. For the example above, the run_files folder includes the following files:

-- run_1_unpack_slc_topo_master
-- run_2_average_baseline
-- run_3_extract_burst_overlaps
-- run_4_overlap_geo2rdr_resample
-- run_5_pairs_misreg
-- run_6_timeseries_misreg
-- run_7_geo2rdr_resample
-- run_8_extract_stack_valid_region
-- run_9_merge
+- run_01_unpack_slc_topo_master
+- run_02_average_baseline
+- run_03_extract_burst_overlaps
+- run_04_overlap_geo2rdr_resample
+- run_05_pairs_misreg
+- run_06_timeseries_misreg
+- run_07_geo2rdr_resample
+- run_08_extract_stack_valid_region
+- run_09_merge
 - run_10_grid_baseline

 The generated run files are self descriptive. Below is a short explanation of what each run_file does:

-**run_1_unpack_slc_topo_master:**
+**run_01_unpack_slc_topo_master:**

 Includes commands to unpack Sentinel-1 TOPS SLCs using ISCE readers. For older SLCs which need the antenna elevation pattern correction, the file is extracted and written to disk. For newer versions of SLCs which don’t need the elevation antenna pattern correction, only a gdal virtual “vrt” file (and isce xml file) is generated. The “.vrt” file points to the Sentinel SLC file and reads it whenever required during the processing. If a user wants to write the “.vrt” SLC file to disk, it can be done easily using gdal_translate (e.g. gdal_translate -of ENVI File.vrt File.slc).

-The “run_1_unpack_slc_topo_master” also includes a command that refers to the config file of the stack master, which includes configuration for running topo for the stack master. Note that in the pair-wise processing strategy one should run topo (mapping from range-Doppler to geo coordinate) for all pairs. However, with stackSentinel, topo needs to be run only one time for the master in the stack.
+The “run_01_unpack_slc_topo_master” file also includes a command that refers to the config file of the stack master, which contains the configuration for running topo for the stack master. Note that in the pair-wise processing strategy one should run topo (mapping from range-Doppler to geo coordinates) for all pairs. However, with stackSentinel, topo needs to be run only once, for the master of the stack.

-**run_2_average_baseline:**
+**run_02_average_baseline:**

 Computes the average baseline for the stack. These baselines are not used for processing anywhere. They are only an approximation and can be used for plotting purposes. A more precise baseline grid is estimated later in run_10.

-**run_3_extract_burst_overlaps:**
+**run_03_extract_burst_overlaps:**

 Burst overlaps are extracted for estimating azimuth misregistration using the NESD technique. If the coregistration method is chosen to be “geometry”, then this run file won’t exist and the overlaps are not extracted.

-**run_4_overlap_geo2rdr_resample:***
+**run_04_overlap_geo2rdr_resample:**

 Runs geo2rdr to estimate geometrical offsets between the slave burst overlaps and the stack master burst overlaps. The slave burst overlaps are then resampled to the stack master burst overlaps.

-**run_5_pairs_misreg:**
+**run_05_pairs_misreg:**

 Using the coregistered stack burst overlaps generated in the previous step, differential overlap interferograms are generated and used for estimating azimuth misregistration with the Enhanced Spectral Diversity (ESD) technique.

-**run_6_timeseries_misreg:**
+**run_06_timeseries_misreg:**

 A time-series of azimuth and range misregistration is estimated with respect to the stack master. The time-series is a least squares estimation from the pair misregistration of the previous step.

-**run_7_geo2rdr_resample:**
+**run_07_geo2rdr_resample:**

 Using orbit and DEM, the geometrical offsets between all slave SLCs and the stack master are computed. The geometrical offsets, together with the misregistration time-series (from the previous step), are used for precise coregistration of each burst SLC.

-**run_8_extract_stack_valid_region:**
+**run_08_extract_stack_valid_region:**

 The valid region between burst SLCs at the overlap area of the bursts slightly changes for different acquisitions. Therefore, we need to keep track of these overlaps, which will be used when merging bursts. Without this knowledge, lines of invalid data may appear in the merged products at the burst overlaps.

-**run_9_merge:**
+**run_09_merge:**

 Merges all bursts for the master and coregistered SLCs. The geometry files are also merged, including longitude, latitude, shadow and layover mask, line-of-sight files, etc.
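The run_05/run_06 descriptions above compress a fair amount of machinery: pairwise ESD measurements are inverted into a per-date misregistration time-series by least squares. The sketch below illustrates the idea with made-up numbers; the dates, pairs, and design-matrix construction are illustrative assumptions, not the exact code behind run_06.

```python
import numpy as np

# Each pair (i, j) measures m[j] - m[i], where m is the per-date
# misregistration and the stack master (index 0) is fixed to zero.
dates = ['master', '20230101', '20230113', '20230125']
pairs = [(0, 1), (1, 2), (2, 3), (0, 2)]
measured = np.array([0.010, 0.012, 0.008, 0.021])  # offsets in pixels (made up)

# Design matrix over the unknown dates; the master column is dropped
# because its misregistration is defined to be zero.
A = np.zeros((len(pairs), len(dates) - 1))
for row, (i, j) in enumerate(pairs):
    if i > 0:
        A[row, i - 1] = -1.0
    if j > 0:
        A[row, j - 1] = 1.0

# Least squares solution for the misregistration time-series.
m, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(dict(zip(dates[1:], m.round(4))))
```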
@@ -166,4 +166,4 @@ stackSentinel.py -s ../SLC/ -d ../../MexicoCity/demLat_N18_N20_Lon_W100_W097.dem

 This workflow is similar to the previous one. The difference is that the interferograms are not unwrapped.

-#### 5. Execute the commands in run files (run_1*, run_2*, etc) in the "run_files" folder ####
+#### 5. Execute the commands in the run files (run_01*, run_02*, etc.) in the "run_files" folder ####
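Section 5 above asks the user to execute the run files one by one, in index order. A minimal sketch of that outer loop is shown below; it assumes the `run_files` layout described in the README and executable run files, and is an illustration rather than part of this patch.

```python
import glob
import subprocess

# With zero-padded names (run_01_*, ..., run_10_*), lexicographic order
# matches execution order, so a plain sorted() is enough.
for run_file in sorted(glob.glob('run_files/run_*')):
    print('executing:', run_file)
    # Each run file must finish before the next one starts, since later
    # steps depend on earlier results.
    subprocess.run(['bash', run_file], check=True)
```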
diff --git a/contrib/stack/topsStack/stackSentinel.py b/contrib/stack/topsStack/stackSentinel.py
index c70b819..afb75e5 100755
--- a/contrib/stack/topsStack/stackSentinel.py
+++ b/contrib/stack/topsStack/stackSentinel.py
@@ -4,35 +4,43 @@
 #######################

-import os, imp, sys, glob
+import os, sys, glob
 import argparse
 import configparser
-import datetime 
+import datetime
 import time
 import numpy as np
+
+# suppress matplotlib DEBUG messages
+from matplotlib.path import Path as Path
+import logging
+mpl_logger = logging.getLogger('matplotlib')
+mpl_logger.setLevel(logging.WARNING)
+
 import isce
 import isceobj
 from isceobj.Sensor.TOPS.Sentinel1 import Sentinel1
 from Stack import config, run, sentinelSLC

+
 helpstr= '''

 Stack processor for Sentinel-1 data using ISCE software.

 For a full list of different options, try stackSentinel.py -h

-stackSentinel.py generates all configuration and run files required to be executed for a stack of Sentinel-1 TOPS data. 
+stackSentinel.py generates all configuration and run files required to be executed for a stack of Sentinel-1 TOPS data.

 The following are required to start processing:

-1) a folder that includes Sentinel-1 SLCs, 
-2) a DEM (Digital Elevation Model) 
-3) a folder that includes precise orbits (use dloadOrbits.py to download/ update your orbit folder. Missing orbits downloaded on the fly.)
-4) a folder for Sentinel-1 Aux files (which is used for correcting the Elevation Antenna Pattern). 
+1) a folder that includes Sentinel-1 SLCs,
+2) a DEM (Digital Elevation Model),
+3) a folder that includes precise orbits (use dloadOrbits.py to download/update your orbit folder; missing orbits are downloaded on the fly),
+4) a folder for Sentinel-1 Aux files (which is used for correcting the Elevation Antenna Pattern).

 Note that stackSentinel.py does not process any data. It only prepares a lot of input files for processing and a lot of run files. Then you need to execute all those generated run files in order. To know what is really going on, after running stackSentinel.py, look at each run file generated by stackSentinel.py. Each run file actually has several commands that are independent from each other and can be executed in parallel. The config files for each run file include the processing options to execute a specific command/function.

-Note also that run files need to be executed in order, i.e., running run_3 needs results from run_2, etc.
+Note also that run files need to be executed in order, i.e., running run_03 needs results from run_02, etc.

 ##############################################
@@ -60,7 +68,7 @@ stackSentinel.py -s ../SLC/ -d ../../MexicoCity/demLat_N18_N20_Lon_W100_W097.dem

 %%%%%%%%%%%%%%%
 Example 4:
-# slc workflow that produces a coregistered stack of SLCs 
+# slc workflow that produces a coregistered stack of SLCs
 stackSentinel.py -s ../SLC/ -d ../../MexicoCity/demLat_N18_N20_Lon_W100_W097.dem.wgs84 -b '19 20 -99.5 -98.5' -a ../../AuxDir/ -o ../../Orbits -C NESD -W slc

@@ -96,10 +104,10 @@ def createParser():
     parser.add_argument('-a', '--aux_directory', dest='aux_dirname', type=str, required=True,
             help='Directory with all aux files')
-    
+
     parser.add_argument('-w', '--working_directory', dest='work_dir', type=str, default='./',
             help='Working directory ')
-    
+
     parser.add_argument('-d', '--dem', dest='dem', type=str, required=True,
             help='Directory with the DEM')

@@ -128,7 +136,7 @@ def createParser():
     parser.add_argument('-z', '--azimuth_looks', dest='azimuthLooks', type=str, default='3'
             , help='Number of looks in azimuth for interferogram multi-looking. -- Default : 3')
-    
+
     parser.add_argument('-r', '--range_looks', dest='rangeLooks', type=str, default='9'
             , help='Number of looks in range for interferogram multi-looking. -- Default : 9')

@@ -158,7 +166,7 @@ def createParser():
     parser.add_argument('--stop_date', dest='stopDate', type=str, default=None
             , help='Stop date for stack processing. Acquisitions after stop date are ignored. format should be YYYY-MM-DD e.g., 2017-02-26')
-    
+
     parser.add_argument('-useGPU', '--useGPU', dest='useGPU',action='store_true', default=False,
             help='Allow App to use GPU when available')

@@ -183,7 +191,7 @@ def cmdLineParse(iargs = None):

 ####################################
 def get_dates(inps):
     # Given the SLC directory, this function extracts the acquisition dates
-    # and prepares a dictionary of sentinel slc files such that keys are 
+    # and prepares a dictionary of sentinel slc files such that keys are
     # acquisition dates and values are object instances of sentinelSLC class
     # which is defined in Stack.py

@@ -205,7 +213,7 @@ def get_dates(inps):
         SAFE_files = []
         for line in open(inps.slc_dirname):
             SAFE_files.append(str.replace(line,'\n','').strip())
-    
+
     else:
         SAFE_files = glob.glob(os.path.join(inps.slc_dirname,'S1*_IW_SLC*zip')) # changed to zip file by Minyan Zhong

@@ -232,8 +240,8 @@ def get_dates(inps):

     ################################
     # write down the list of SAFE files in a txt file which will be used:
-    f = open('SAFE_files.txt','w') 
-    safe_count=0 
+    f = open('SAFE_files.txt','w')
+    safe_count=0
     safe_dict={}
     bbox_poly = [[bbox[0],bbox[2]],[bbox[0],bbox[3]],[bbox[1],bbox[3]],[bbox[1],bbox[2]]]
     for safe in SAFE_files:
@@ -258,14 +266,13 @@ def get_dates(inps):
             for pnt in pnts:
                 lon = float(pnt.split(',')[0])
                 lat = float(pnt.split(',')[1])
-                
+
                 # keep track of all the corners to see if the product is larger than the bbox
                 lats.append(lat)
                 lons.append(lon)

-                import matplotlib
-                from matplotlib.path import Path as Path
+
                 # bbox = SNWE
                 # polygon = bbox[0] bbox[2]      SW
@@ -273,21 +280,21 @@ def get_dates(inps):
                 #           bbox[1] bbox[3]      NE
                 #           bbox[1] bbox[2]      NW

-                poly =Path(bbox_poly)
+                poly = Path(bbox_poly)
                 point = (lat,lon)
                 in_bbox = poly.contains_point(point)
-                
+
                 # product corner falls within BBOX (SNWE)
                 if in_bbox:
                     reject_SAFE=False
-                
+
                 # If the product is still being rejected, check if the BBOX corners fall within the frame
                 if reject_SAFE:
                     for point in bbox_poly:
-                        frame= [[a,b] for a,b in zip(lats,lons)]
-                        poly=Path(frame)
+                        frame = [[a,b] for a,b in zip(lats,lons)]
+                        poly = Path(frame)
                         in_frame = poly.contains_point(point)
                         if in_frame:
                             reject_SAFE=False
@@ -328,7 +335,7 @@ def get_dates(inps):
     for date in dateList:
         #safe_dict[date].get_lat_lon()
         safe_dict[date].get_lat_lon_v2()
-    
+
         #safe_dict[date].get_lat_lon_v3(inps)
         S.append(safe_dict[date].SNWE[0])
         N.append(safe_dict[date].SNWE[1])
@@ -339,7 +346,7 @@ def get_dates(inps):
         if safe_dict[date].SNWE[0] <= bbox[0] and safe_dict[date].SNWE[1] >= bbox[1]:
             safe_dict_bbox[date] = safe_dict[date]
             safe_dict_bbox_finclude[date] = safe_dict[date]
-        elif date in includeList: 
+        elif date in includeList:
             safe_dict_finclude[date] = safe_dict[date]
             safe_dict_bbox_finclude[date] = safe_dict[date]
@@ -353,7 +360,7 @@ def get_dates(inps):
     print (max(S),min(N),max(W),min(E))
     print ("*****************************************")
     if max(S) > min(N):
-        print ("""WARNING: 
+        print ("""WARNING:
                There might not be overlap between some dates""")
     print ("*****************************************")
     ################################
@@ -396,10 +403,10 @@ def get_dates(inps):
             sys.exit(1)
         inps.master_date = dateList[0]
         print ("The master date was not chosen. The first date is considered as the master date.")
-    
+
     print ("")
     print ("All SLCs will be coregistered to : " + inps.master_date)
-    
+
     slaveList = [key for key in safe_dict.keys()]
     slaveList.sort()
     slaveList.remove(inps.master_date)
@@ -408,7 +415,7 @@ def get_dates(inps):
     print ("")

     return dateList, inps.master_date, slaveList, safe_dict
-    
+

 def selectNeighborPairs(dateList, num_connections, updateStack=False):  # should be changed to be able to account for the existing acquisitions -- Minyan Zhong
     pairs = []
@@ -427,7 +434,7 @@ def selectNeighborPairs(dateList, num_connections, updateStack=False):  # should

 ########################################
-# Below are few workflow examples. 
+# Below are a few workflow examples.


 def slcStack(inps, acquisitionDates, stackMasterDate, slaveDates, safe_dict, updateStack, mergeSLC=False):
     #############################
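The hunks that follow are the core of this change: every `'run_' + str(i)` file name becomes `'run_{:02d}'.format(i)`, presumably so that shell globs and `sorted()` return the run files in execution order once the stack reaches ten or more steps. The difference is easy to demonstrate:

```python
# Without padding, lexicographic order puts run_10 before run_2:
print(sorted('run_' + str(i) for i in (1, 2, 10)))
# ['run_1', 'run_10', 'run_2']

# With two-digit padding, lexicographic order matches numeric order:
print(sorted('run_{:02d}'.format(i) for i in (1, 2, 10)))
# ['run_01', 'run_02', 'run_10']
```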
@@ -436,19 +443,19 @@ def slcStack(inps, acquisitionDates, stackMasterDate, slaveDates, safe_dict, upd
     if not updateStack:
         i += 1
         runObj = run()
-        runObj.configure(inps, 'run_' + str(i) + "_unpack_topo_master")
+        runObj.configure(inps, 'run_{:02d}_unpack_topo_master'.format(i))
         runObj.unpackStackMasterSLC(safe_dict)
         runObj.finalize()

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_unpack_slave_slc")
+    runObj.configure(inps, 'run_{:02d}_unpack_slave_slc'.format(i))
     runObj.unpackSlavesSLC(stackMasterDate, slaveDates, safe_dict)
     runObj.finalize()
-    
+
     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_average_baseline")
+    runObj.configure(inps, 'run_{:02d}_average_baseline'.format(i))
     runObj.averageBaseline(stackMasterDate, slaveDates)
     runObj.finalize()

@@ -456,25 +463,25 @@ def slcStack(inps, acquisitionDates, stackMasterDate, slaveDates, safe_dict, upd
         if not updateStack:
             i+=1
             runObj = run()
-            runObj.configure(inps, 'run_' + str(i) + "_extract_burst_overlaps")
+            runObj.configure(inps, 'run_{:02d}_extract_burst_overlaps'.format(i))
             runObj.extractOverlaps()
             runObj.finalize()

         i += 1
         runObj = run()
-        runObj.configure(inps, 'run_' + str(i) + "_overlap_geo2rdr")
+        runObj.configure(inps, 'run_{:02d}_overlap_geo2rdr'.format(i))
         runObj.geo2rdr_offset(slaveDates)
         runObj.finalize()

         i += 1
         runObj = run()
-        runObj.configure(inps, 'run_' + str(i) + "_overlap_resample")
+        runObj.configure(inps, 'run_{:02d}_overlap_resample'.format(i))
         runObj.resample_with_carrier(slaveDates)
         runObj.finalize()

         i+=1
         runObj = run()
-        runObj.configure(inps, 'run_' + str(i) + "_pairs_misreg")
+        runObj.configure(inps, 'run_{:02d}_pairs_misreg'.format(i))
         if updateStack:
             runObj.pairs_misregistration(slaveDates, safe_dict)
         else:
@@ -483,39 +490,39 @@ def slcStack(inps, acquisitionDates, stackMasterDate, slaveDates, safe_dict, upd

         i+=1
         runObj = run()
-        runObj.configure(inps, 'run_' + str(i) + "_timeseries_misreg")
+        runObj.configure(inps, 'run_{:02d}_timeseries_misreg'.format(i))
         runObj.timeseries_misregistration()
         runObj.finalize()

     i += 1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_fullBurst_geo2rdr")
+    runObj.configure(inps, 'run_{:02d}_fullBurst_geo2rdr'.format(i))
     runObj.geo2rdr_offset(slaveDates, fullBurst='True')
     runObj.finalize()

     i += 1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_fullBurst_resample")
+    runObj.configure(inps, 'run_{:02d}_fullBurst_resample'.format(i))
     runObj.resample_with_carrier(slaveDates, fullBurst='True')
     runObj.finalize()

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_extract_stack_valid_region")
+    runObj.configure(inps, 'run_{:02d}_extract_stack_valid_region'.format(i))
     runObj.extractStackValidRegion()
     runObj.finalize()

     if mergeSLC:
         i+=1
         runObj = run()
-        runObj.configure(inps, 'run_' + str(i) + "_merge")
+        runObj.configure(inps, 'run_{:02d}_merge'.format(i))
         runObj.mergeMaster(stackMasterDate, virtual = 'False')
         runObj.mergeSlaveSLC(slaveDates, virtual = 'False')
-        runObj.finalize() 
-        
+        runObj.finalize()
+
         i+=1
         runObj = run()
-        runObj.configure(inps, 'run_' + str(i) + "_grid_baseline")
+        runObj.configure(inps, 'run_{:02d}_grid_baseline'.format(i))
         runObj.gridBaseline(stackMasterDate, slaveDates)
         runObj.finalize()

@@ -530,20 +537,20 @@ def correlationStack(inps, acquisitionDates, stackMasterDate, slaveDates, safe_d

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_merge_master_slave_slc")
+    runObj.configure(inps, 'run_{:02d}_merge_master_slave_slc'.format(i))
     runObj.mergeMaster(stackMasterDate, virtual = 'True')
     runObj.mergeSlaveSLC(slaveDates, virtual = 'True')
     runObj.finalize()

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_merge_burst_igram")
+    runObj.configure(inps, 'run_{:02d}_merge_burst_igram'.format(i))
     runObj.burstIgram_mergeBurst(acquisitionDates, safe_dict, pairs)
     runObj.finalize()

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_filter_coherence")
+    runObj.configure(inps, 'run_{:02d}_filter_coherence'.format(i))
     runObj.filter_coherence(pairs)
     runObj.finalize()

@@ -554,32 +561,32 @@ def interferogramStack(inps, acquisitionDates, stackMasterDate, slaveDates, safe

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_merge_master_slave_slc")
+    runObj.configure(inps, 'run_{:02d}_merge_master_slave_slc'.format(i))
     runObj.mergeMaster(stackMasterDate, virtual = 'True')
     runObj.mergeSlaveSLC(slaveDates, virtual = 'True')
     runObj.finalize()

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_generate_burst_igram")
+    runObj.configure(inps, 'run_{:02d}_generate_burst_igram'.format(i))
     runObj.generate_burstIgram(acquisitionDates, safe_dict, pairs)
     runObj.finalize()

     i += 1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_merge_burst_igram")
+    runObj.configure(inps, 'run_{:02d}_merge_burst_igram'.format(i))
     runObj.igram_mergeBurst(acquisitionDates, safe_dict, pairs)
     runObj.finalize()

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_filter_coherence")
+    runObj.configure(inps, 'run_{:02d}_filter_coherence'.format(i))
     runObj.filter_coherence(pairs)
     runObj.finalize()

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_unwrap")
+    runObj.configure(inps, 'run_{:02d}_unwrap'.format(i))
     runObj.unwrap(pairs)
     runObj.finalize()

@@ -590,15 +597,15 @@ def offsetStack(inps, acquisitionDates, stackMasterDate, slaveDates, safe_dict,

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_merge_master_slave_slc")
+    runObj.configure(inps, 'run_{:02d}_merge_master_slave_slc'.format(i))
     runObj.mergeMaster(stackMasterDate, virtual = 'False')
     runObj.mergeSlaveSLC(slaveDates, virtual = 'False')
     runObj.finalize()

     i+=1
     runObj = run()
-    runObj.configure(inps, 'run_' + str(i) + "_dense_offsets")
-    runObj.denseOffsets(pairs) 
+    runObj.configure(inps, 'run_{:02d}_dense_offsets'.format(i))
+    runObj.denseOffsets(pairs)
     runObj.finalize()

@@ -669,7 +676,7 @@ def checkCurrentStatus(inps):

             slaveDates = latestCoregSLCs + newAcquisitions
             slaveDates.sort()
-            
+
             acquisitionDates = slaveDates.copy()
             acquisitionDates.append(stackMasterDate)
             acquisitionDates.sort()
@@ -710,7 +717,7 @@ def main(iargs=None):
         print('')
         print('**************************')
         print('run_files folder exists.')
-        print(os.path.join(inps.work_dir, 'run_files'), ' already exists.') 
+        print(os.path.join(inps.work_dir, 'run_files'), ' already exists.')
         print('Please remove or rename this folder and try again.')
         print('')
         print('**************************')
@@ -744,8 +751,8 @@ def main(iargs=None):
     print ('Workflow: ', inps.workflow)
     print ('*****************************************')
     if inps.workflow == 'interferogram':
-    
-        interferogramStack(inps, acquisitionDates, stackMasterDate, slaveDates, safe_dict, pairs, updateStack) 
+
+        interferogramStack(inps, acquisitionDates, stackMasterDate, slaveDates, safe_dict, pairs, updateStack)

     elif inps.workflow == 'offset':
@@ -761,6 +768,5 @@ def main(iargs=None):

 if __name__ == "__main__":

-    # Main engine 
+    # Main engine
     main()
-