
Stuck in Floodpy_app.create_S1_stack(overwrite=False) and missing LAI error #78

Closed
JacobJeppesen opened this issue Jul 22, 2024 · 6 comments


@JacobJeppesen

I'm trying to run the example notebooks, but I can't get any further than the pre-processing. When running Floodpy_app.create_S1_stack(overwrite=False), I get this message under the cell in Jupyter:

Coregistrating the primary image (2 GRDs): 
 S1A_IW_GRDH_1SDV_20230906T043947_20230906T044012_050202_060AD8_440E.SAFE.zip 
 S1A_IW_GRDH_1SDV_20230906T044012_20230906T044037_050202_060AD8_49FE.SAFE.zip
with secondary image (2 GRDs): 
 S1A_IW_GRDH_1SDV_20230708T043943_20230708T044008_049327_05EE7E_A3D4.SAFE.zip 
 S1A_IW_GRDH_1SDV_20230708T044008_20230708T044033_049327_05EE7E_C11B.SAFE.zip

It seems to be stuck there, and letting it run for ~2 hours doesn't result in any progress. Looking at htop, gpt from SNAP is running with lots of cores active. I've only changed paths and credentials at the top, without any further changes to the notebook. I've tried SNAP 9.0 and 10.0, running Ubuntu 22.04 on an AMD 5950x with 128GB RAM.

Then I tried a different region, by changing the location to:

LONMIN = 9.16
LATMIN = 56.08
LONMAX = 9.96
LATMAX = 56.50
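(As an illustration, not Floodpy internals: the four bounds above describe an axis-aligned AOI, which corresponds to a closed five-point WKT polygon like the one that later gets passed to gpt via -Ppolygon.)

```python
# Illustration only: build a closed WKT polygon (first point == last point)
# from the axis-aligned AOI bounds above.
LONMIN, LATMIN = 9.16, 56.08
LONMAX, LATMAX = 9.96, 56.50

aoi_wkt = (
    f"POLYGON (({LONMIN} {LATMIN}, {LONMIN} {LATMAX}, "
    f"{LONMAX} {LATMAX}, {LONMAX} {LATMIN}, {LONMIN} {LATMIN}))"
)
print(aoi_wkt)
```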

I kept the flood and pre-flood start/end dates the same, and used sel_flood_date = '2023-09-08T17:07:51.075000000'. This gave me the following error instead in Floodpy_app.create_S1_stack(overwrite=False):

---------------------------------------------------------------------------
NoDataInBounds                            Traceback (most recent call last)
Cell In[13], line 1
----> 1 Floodpy_app.create_S1_stack(overwrite=False)

File [~/Downloads/FLOODPY/floodpy/FLOODPYapp.py:168](http://localhost:8888/lab/tree/floodpy/FLOODPYapp.py#line=167), in FloodwaterEstimation.create_S1_stack(self, overwrite)
    165 self.DEM_filename = os.path.join(self.Preprocessing_dir,'DEM.nc')
    166 self.LIA_filename = os.path.join(self.Preprocessing_dir,'LIA.nc')
--> 168 Run_Preprocessing(self, overwrite = overwrite)

File [~/Downloads/FLOODPY/floodpy/Preprocessing_S1_data/Preprocessing_S1_data.py:94](http://localhost:8888/lab/tree/floodpy/Preprocessing_S1_data/Preprocessing_S1_data.py#line=93), in Run_Preprocessing(Floodpy_app, overwrite)
     89     print("Please use a smaller AOI.")
     91 if not os.path.exists(flood_xarray_outfile):
     92 
     93     # create xarray for flood (primary) image
---> 94     create_coreg_xarray(netcdf4_out_filename = flood_xarray_outfile,
     95                         snap_hdf5_in_filename = flood_hdf5_outfile+ '.h5',
     96                         geojson_bbox = Floodpy_app.geojson_bbox,
     97                         ref_tif_file = Floodpy_app.lc_mosaic_filename,
     98                         primary_image = True,
     99                         delete_hdf5 = True)
    101 for Pre_flood_date in Pre_flood_dates:
    102     S1_pre_flood_rows = Floodpy_app.query_S1_sel_df.loc[pd.to_datetime(Pre_flood_date): pd.to_datetime(Pre_flood_date) + pd.Timedelta(hours=24)]

File [~/Downloads/FLOODPY/floodpy/Preprocessing_S1_data/xarray_funcs.py:162](http://localhost:8888/lab/tree/floodpy/Preprocessing_S1_data/xarray_funcs.py#line=161), in create_coreg_xarray(netcdf4_out_filename, snap_hdf5_in_filename, geojson_bbox, ref_tif_file, primary_image, delete_hdf5)
    159 LIA = S1_file['localIncidenceAngle'][:]
    160 LIA[LIA==0] = np.nan
--> 162 save_LIA_xarray(LIA = LIA,
    163                 S1_lon_vector = S1_lon_vector,
    164                 S1_lat_vector = S1_lat_vector,
    165                 geojson_bbox = geojson_bbox,
    166                 ref_xarray = ref_xarray,
    167                 LIA_xarray_outfile = os.path.join(out_dir,'LIA.nc'))
    169 save_DEM_xarray(DEM = Elevation,
    170                 S1_lon_vector = S1_lon_vector,
    171                 S1_lat_vector = S1_lat_vector,
    172                 geojson_bbox = geojson_bbox,
    173                 ref_xarray = ref_xarray,
    174                 DEM_xarray_outfile = os.path.join(out_dir,'DEM.nc'))
    176 np.save(os.path.join(out_dir,'S1_lat_vector.npy'), S1_lat_vector)

File [~/Downloads/FLOODPY/floodpy/Preprocessing_S1_data/xarray_funcs.py:74](http://localhost:8888/lab/tree/floodpy/Preprocessing_S1_data/xarray_funcs.py#line=73), in save_LIA_xarray(LIA, S1_lon_vector, S1_lat_vector, geojson_bbox, ref_xarray, LIA_xarray_outfile)
     70 df.y.attrs['axis'] = 'Y'
     72 df.rio.write_crs("epsg:4326", inplace=True)
---> 74 df_coreg = df.rio.clip(gpd.read_file(geojson_bbox)['geometry'])
     76 S1_coreg = df_coreg.rio.reproject_match(ref_xarray)
     77 S1_coreg.to_netcdf(LIA_xarray_outfile, format='NETCDF4')

File [~/anaconda3/envs/floodpy_gpu/lib/python3.8/site-packages/rioxarray/raster_dataset.py:380](http://localhost:8888/home/jhj/anaconda3/envs/floodpy_gpu/lib/python3.8/site-packages/rioxarray/raster_dataset.py#line=379), in RasterDataset.clip(self, geometries, crs, all_touched, drop, invert, from_disk)
    377 try:
    378     x_dim, y_dim = _get_spatial_dims(self._obj, var)
    379     clipped_dataset[var] = (
--> 380         self._obj[var]
    381         .rio.set_spatial_dims(x_dim=x_dim, y_dim=y_dim, inplace=True)
    382         .rio.clip(
    383             geometries,
    384             crs=crs,
    385             all_touched=all_touched,
    386             drop=drop,
    387             invert=invert,
    388             from_disk=from_disk,
    389         )
    390     )
    391 except MissingSpatialDimensionError:
    392     if len(self._obj[var].dims) >= 2 and not get_option(
    393         SKIP_MISSING_SPATIAL_DIMS
    394     ):

File [~/anaconda3/envs/floodpy_gpu/lib/python3.8/site-packages/rioxarray/raster_array.py:943](http://localhost:8888/home/jhj/anaconda3/envs/floodpy_gpu/lib/python3.8/site-packages/rioxarray/raster_array.py#line=942), in RasterArray.clip(self, geometries, crs, all_touched, drop, invert, from_disk)
    931     cropped_ds = _clip_xarray(
    932         self._obj,
    933         geometries=geometries,
   (...)
    936         invert=invert,
    937     )
    939 if (
    940     cropped_ds.coords[self.x_dim].size < 1
    941     or cropped_ds.coords[self.y_dim].size < 1
    942 ):
--> 943     raise NoDataInBounds(
    944         f"No data found in bounds.{_get_data_var_message(self._obj)}"
    945     )
    947 # make sure correct attributes preserved & projection added
    948 _add_attrs_proj(cropped_ds, self._obj)

NoDataInBounds: No data found in bounds. Data variable: LIA

I tried a couple of different regions too, but got the exact same error.

Has anyone encountered similar issues? Any ideas on how to fix them? 🙂
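(Editor's aside, plain Python rather than Floodpy code: NoDataInBounds from rioxarray's clip usually means the clip geometry and the raster footprint don't overlap at all, so a quick bounding-box intersection check can narrow the problem down. The footprint values below are hypothetical.)

```python
# Sanity check: do two geographic bounding boxes intersect?
# Boxes are (lonmin, latmin, lonmax, latmax).
def bboxes_overlap(a, b):
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

aoi = (9.16, 56.08, 9.96, 56.50)       # Denmark AOI from this comment
footprint = (21.5, 39.0, 23.0, 40.0)   # hypothetical raster footprint over Thessaly

print(bboxes_overlap(aoi, footprint))  # disjoint boxes -> clip would find no data
```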

@kleok
Owner

kleok commented Jul 22, 2024

Hello @JacobJeppesen
Thanks for using Floodpy. Firstly, I would like to focus on the issue you have with the provided example notebook. It seems strange that you don't have a result after 2 hours given your configuration.
Could you please verify that your downloaded S1 images are not corrupted? You can open them in SNAP Desktop and check whether you can visualize the intensity bands.
Please let me know if this is the case.

@JacobJeppesen
Author

Hi @kleok

Thank you for making Floodpy! 🙂

I just opened a couple of the Sentinel-1 files in SNAP. They seem to be fine, and I can visualize the bands. All Sentinel-1 zip files are 1.8GB.

Just to be sure I did everything correctly, I just tried again from scratch:

  1. Installed SNAP 9.0 without updating any packages.
  2. Cloned the Floodpy repo, created conda environment from FLOODPY_gpu_env.yml, and activated the environment.
  3. Installed Jupyter lab and opened the Floodpyapp_notebook.ipynb notebook.
  4. Changed paths to:
# The path of your project. Make sure you have enough free space disk on the specific location.
projectfolder = '/home/jhj/Downloads/floodpy_issue_test/FLOODPY/projects/Thessalia_Floods_2023'

# The location of floodpy code 
src_dir = '/home/jhj/Downloads/floodpy_issue_test/FLOODPY/floodpy/'

# SNAP ORBIT DIRECTORY
snap_orbit_dir = '/home/jhj/.snap/auxdata/Orbits/Sentinel-1'

# SNAP GPT full path
GPTBIN_PATH = '/home/jhj/esa-snap/bin/gpt'
  5. Created the project folder /home/jhj/Downloads/floodpy_issue_test/FLOODPY/projects/Thessalia_Floods_2023
  6. Ran all cells in the notebook.

The issue was the same: it got stuck on the coregistration step. Approximately how long should the processing take when it works?

By the way, I'd normally change the credentials in the notebook, but this time I wanted to make as few changes as possible, so I just used the defaults.

@JacobJeppesen
Author

I continued debugging, and made this change to enable logging from ESA SNAP:

import os
import subprocess

def perform_pair_preprocessing_2GRD_2GRD(gptcommand, primary1, primary2, secondary1, secondary2, outfile, Subset_AOI, xml_file, overwrite):
    
    """
    This function performs the preprocessing of the given pair from the
    input S1 SLC stack. 
    """
    if not overwrite:
        if os.path.exists(outfile+'.h5'):
            return 0
        
    argvs=[gptcommand, '-e',
            xml_file,
            '-Pfilein1='+primary1,
            '-Pfilein2='+primary2,
            '-Pfilein3='+secondary1,
            '-Pfilein4='+secondary2,
            '-Ppolygon='+_shapely_to_snap_polygon(Subset_AOI),
            '-Pfileout='+outfile,]

    print_esa_snap_logs = True
    if print_esa_snap_logs:
        # Append argument for logging from ESA SNAP
        argvs.append('-J-Dsnap.log.level=FINE')

        # Print the command for manual execution
        print("Running command:")
        print(" ".join(argvs))

        # Run the command while printing ESA SNAP logs to stdout in real-time
        with subprocess.Popen(argvs, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True) as process:
            for stdout_line in iter(process.stdout.readline, ""):
                print(stdout_line, end="")
            process.stdout.close()
            return_code = process.wait()
            if return_code:
                raise subprocess.CalledProcessError(return_code, argvs)
    else:
        subprocess.check_call(argvs, stdout=subprocess.DEVNULL, stderr=subprocess.STDOUT)

    return 0

It gave me the following command for the graph processing:

/home/jhj/esa-snap/bin/gpt -e /home/jhj/Downloads/floodpy_issue_test/FLOODPY/floodpy/Preprocessing_S1_data/Graphs/preprocessing_pair_primary_2GRD_secondary_2GRD.xml -Pfilein1=/home/jhj/Downloads/floodpy_issue_test/FLOODPY/projects/Thessalia_Floods_2023/Sentinel_1_data/S1A_IW_GRDH_1SDV_20230906T043947_20230906T044012_050202_060AD8_440E.SAFE.zip -Pfilein2=/home/jhj/Downloads/floodpy_issue_test/FLOODPY/projects/Thessalia_Floods_2023/Sentinel_1_data/S1A_IW_GRDH_1SDV_20230906T044012_20230906T044037_050202_060AD8_49FE.SAFE.zip -Pfilein3=/home/jhj/Downloads/floodpy_issue_test/FLOODPY/projects/Thessalia_Floods_2023/Sentinel_1_data/S1A_IW_GRDH_1SDV_20230708T043943_20230708T044008_049327_05EE7E_A3D4.SAFE.zip -Pfilein4=/home/jhj/Downloads/floodpy_issue_test/FLOODPY/projects/Thessalia_Floods_2023/Sentinel_1_data/S1A_IW_GRDH_1SDV_20230708T044008_20230708T044033_049327_05EE7E_C11B.SAFE.zip -Ppolygon=POLYGON ((21.82 39.35, 21.82 39.65, 22.3 39.65, 22.3 39.35, 21.82 39.35, 21.82 39.35)) -Pfileout=/home/jhj/Downloads/floodpy_issue_test/FLOODPY/projects/Thessalia_Floods_2023/Preprocessed_20230906T043947/20230708T043956 -J-Dsnap.log.level=FINE
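(One caveat about that printed command, as an editor's aside: the -Ppolygon value contains spaces, so pasting the plain `" ".join(argvs)` output into a shell would split it into several arguments. Quoting each element, e.g. with `shlex.join` (Python 3.8+), gives a copy-pasteable line. The argument list below is a truncated, hypothetical version of the one built above.)

```python
import shlex

# Truncated, hypothetical argument list like the one built in the snippet above.
argvs = [
    "/home/jhj/esa-snap/bin/gpt", "-e", "graph.xml",
    "-Ppolygon=POLYGON ((21.82 39.35, 21.82 39.65, 22.3 39.65, 21.82 39.35))",
]
print(shlex.join(argvs))  # the polygon argument comes out shell-quoted
```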

Running it showed that it's getting stuck on RemoveGRDBorderNoiseOp:

.
.
.
FINE: org.esa.snap.core.gpf.internal.OperatorContext: Tile cache assigned to OperatorImageTileStack[Write,Sigma0_VV_mst_06Sep2023]
FINE: org.esa.snap.core.gpf.internal.OperatorContext: Tile cache assigned to OperatorImageTileStack[Write,Sigma0_VH_slv1_08Jul2023]
FINE: org.esa.snap.core.gpf.internal.OperatorContext: Tile cache assigned to OperatorImageTileStack[Write,Sigma0_VV_slv2_08Jul2023]
..20%....30%FINE: org.esa.snap.core.dataio.AbstractProductReader: Finish reading the product: input: /home/jhj/.snap/auxdata/dem/Copernicus 30m Global DEM/Copernicus_DSM_COG_10_N39_00_E021_00_DEM.tif, reader: org.esa.snap.dataio.geotiff.GeoTiffProductReader, size: 3600x3600, elapsed time: 0.095 seconds.
FINE: org.esa.snap.core.dataio.AbstractProductReader: Finish reading the product: input: /home/jhj/.snap/auxdata/dem/Copernicus 30m Global DEM/Copernicus_DSM_COG_10_N39_00_E022_00_DEM.tif, reader: org.esa.snap.dataio.geotiff.GeoTiffProductReader, size: 3600x3600, elapsed time: 0.008 seconds.
..FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: topBorder = 433
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: bottomBorder = 16262
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: leftBorder = 268
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: rightBorder = 26469
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: topBorder = 436
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: bottomBorder = 16252
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: leftBorder = 268
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: rightBorder = 26469
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: topBorder = 219
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: bottomBorder = 16671
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: leftBorder = 189
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: rightBorder = 26459
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: topBorder = 272
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: bottomBorder = 16670
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: leftBorder = 189
FINE: org.esa.s1tbx.calibration.gpf.RemoveGRDBorderNoiseOp: rightBorder = 26459
..40%....50%....60%..

I don't think I've encountered this before; the noise removal normally runs very fast. Have you seen issues similar to this? It's basically just sitting there, running at full load on all cores. It is very slowly progressing, though, so I'll let it run for longer tomorrow and see what happens. It looks like this step alone will take ~4 hours, but I'll let it have it 🙂

@kleok
Owner

kleok commented Jul 23, 2024

Hey @JacobJeppesen,
Thanks for taking the time to dive into this issue. Your steps look correct, and I believe the issue is related to processing with ESA SNAP. For now, let's try to resolve it using SNAP 9, even though the latest version is SNAP 10.
Given that the expected completion time for each pair (with 2 GRDs for our AOI) is about 10 minutes, I believe the huge running time is related to SNAP's maximum memory limit. I suggest you open /home/jhj/esa-snap/bin/gpt.vmoptions and raise the maximum heap from 512 MB to 32 GB: ensure that -Xmx512m is commented out and add -Xmx32G at the end of the file.
Let me know if this worked. Happy to help 😄
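(That edit can be sketched as shell commands. This is hypothetical: it writes a stand-in file in the current directory, while the real file lives at something like ~/esa-snap/bin/gpt.vmoptions, and GNU sed is assumed.)

```shell
VMOPTIONS=gpt.vmoptions                          # real path: ~/esa-snap/bin/gpt.vmoptions
printf '%s\n' '-Xmx512m' > "$VMOPTIONS"          # stand-in for the default contents
sed -i 's/^-Xmx512m/# -Xmx512m/' "$VMOPTIONS"    # comment out the 512 MB heap cap
echo '-Xmx32G' >> "$VMOPTIONS"                   # raise the maximum heap to 32 GB
cat "$VMOPTIONS"
```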

kleok added a commit that referenced this issue Jul 23, 2024
@JacobJeppesen
Author

The -Xmx32G parameter didn't do the trick, but it got me on the right track. I opened SNAP and went through Tools -> Options -> Performance -> System -> Compute -> Apply. That updated /home/jhj/esa-snap/bin/gpt.vmoptions to:

# Enter one VM parameter per line
# For example, to adjust the maximum memory usage to 512 MB, uncomment the following line:
# -Xmx512m
# To include another file, uncomment the following line:
# -include-options [path to other .vmoption file]
-Xmx90099m
-Xms2048m
-XX:+AggressiveOpts
-Xverify:none
-Dnetbeans.mainclass=org.esa.snap.main.Main
-Dsun.java2d.noddraw=true
-Dsun.awt.nopixfmt=true
-Dsun.java2d.dpiaware=false
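(Editor's note, and an assumption about SNAP's heuristic: -Xmx90099m is roughly 88 GiB, about 70% of this machine's 128 GB RAM, which is consistent with the Compute button sizing the heap from total system memory.)

```python
# Convert the generated -Xmx value to GiB and to a fraction of 128 GiB RAM.
xmx_mb = 90099
print(round(xmx_mb / 1024, 1))           # heap in GiB
print(round(xmx_mb / (128 * 1024), 2))   # fraction of 128 GiB total RAM
```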

Now it all runs smoothly.

Thanks for the help! 🙂

@JacobJeppesen
Author

JacobJeppesen commented Jul 24, 2024

Just tested the other region, where I had the LIA error, and it works perfectly after updating with the code from your latest commit 👍

Thanks again! 😃

@kleok kleok closed this as completed Jul 26, 2024