master
Kilian Vos 6 years ago
parent fa41272395
commit 1b0ca75d66

@ -1,23 +1,23 @@
# CoastSat
This software enables the users to extract time-series of shoreline change over the last 30+ years at their site of interest.
CoastSat is an open-source software toolkit written in Python that enables users to obtain time-series of shoreline position at any coastline worldwide from 30+ years (and growing) of publicly available satellite imagery.
![Alt text](https://github.com/kvos/CoastSat/blob/development/classifiers/doc/example.gif?raw=true)
![Alt text](https://github.com/kvos/CoastSat/blob/development/classifiers/doc/example.gif)
The algorithms used in this software are described in:
The underlying approach and application of the CoastSat toolkit are described in detail in:
*Vos K., Splinter K.D., Harley M.D., Simmons J.A., Turner I.L. (submitted). CoastSat: a Google Earth Engine-enabled software to extract shorelines from publicly available satellite imagery, Environmental Modelling and Software*.
*Vos K., Splinter K.D., Harley M.D., Simmons J.A., Turner I.L. (submitted). CoastSat: a Google Earth Engine-enabled Python toolkit to extract shorelines from publicly available satellite imagery, Environmental Modelling and Software*.
There are two main steps:
- retrieval of the satellite images of the region of interest from Google Earth Engine
- extraction of the shorelines from the images using a sub-pixel resolution technique
- assisted retrieval from Google Earth Engine of all available satellite images spanning the user-defined region of interest and time period
- automated extraction of shorelines from all the selected images using a sub-pixel resolution technique
### Description
Satellite remote sensing can provide low-cost long-term shoreline data capable of resolving the temporal scales of interest to coastal scientists and engineers at sites where no in-situ measurements are available. Satellite imagery spanning the last 30 years with constant revisit periods is publicly available and suitable to extract repeated measurements of the shoreline position.
CoastSat is an open-source Python module that allows the user to extract shorelines from Landsat 5, Landsat 7, Landsat 8 and Sentinel-2 images.
The shoreline detection algorithm implemented in CoastSat combines a sub-pixel border segmentation and an image classification component, which refines the segmentation into four distinct categories such that the shoreline detection is specific to the sand/water interface.
Satellite remote sensing can provide low-cost long-term shoreline data capable of resolving the temporal scales of interest to coastal scientists and engineers at sites where no in-situ field measurements are available. Satellite imagery spanning the last 30 years with constant revisit periods is publicly available and suitable to extract repeated measurements of the shoreline position.
CoastSat enables the non-expert user to extract shorelines from Landsat 5, Landsat 7, Landsat 8 and Sentinel-2 images.
The shoreline detection algorithm implemented in CoastSat is optimised for sandy beach coastlines. It combines a sub-pixel border segmentation and an image classification component, which refines the segmentation into four distinct categories such that the shoreline detection is specific to the sand/water interface.
## 1. Installation
@ -26,12 +26,12 @@ CoastSat requires the following Python packages to run:
conda-forge: python=3.6 | matplotlib | scikit-image | scikit-learn | gdal | earthengine-api | oauth2client | spyder | jupyter | simplekml
PyPi: shapely
```
If you are not a regular Python user and are not sure how to install these packages from *conda-forge* and *PyPi*, the section below shows how to install them step-by-step using Anaconda. Otherwise, install the packages and go directly to section **1.2 Activating Google Earth Engine Python API**.
If you are not a regular Python user and are not sure how to install these packages from *conda-forge* and *PyPi*, the section below explains how to install them step-by-step using Anaconda. More experienced Python users can install these packages directly and go to section **1.2 Activating Google Earth Engine Python API**.
### 1.1 Installing the packages (Anaconda)
If Anaconda is not already installed on your PC, you can get it at https://www.anaconda.com/download/.
Open the *Anaconda prompt* (in Mac and Linux, open a terminal window) and drive to the folder where you have downloaded/cloned this repository.
If Anaconda is not already installed on your PC, it can be freely downloaded at https://www.anaconda.com/download/.
Open the *Anaconda prompt* (in Mac and Linux, open a terminal window) and navigate to the folder where you have downloaded/cloned this repository.
Create a new environment named *coastsat*:
@ -47,9 +47,9 @@ conda activate coastsat
On Linux systems, type `source activate coastsat` instead.
To know if you have activated coastsat, your terminal command line prompt should now start with (coastsat) when it is activated.
To confirm that you have successfully activated the `coastsat` environment, your terminal command line prompt should now start with (coastsat).
Now you need to populate the environment with the packages needed to run CoastSat. All the necessary packages are contained in three platform specific files: `requirements_win64.txt`, `requirements_osx64.txt`, `requirements_linux64.txt`. To install the packages, run one of the following commands, depending on which platform you are operating:
Now you need to populate the environment with the packages needed to run the CoastSat toolkit. All the necessary packages are contained in three platform-specific files: `requirements_win64.txt`, `requirements_osx64.txt`, `requirements_linux64.txt`. To install the packages, run the command below that corresponds to your platform:
#### Windows 64 bits (win64)
@ -69,7 +69,7 @@ conda install --name coastsat --file requirements_osx64.txt
conda install --name coastsat --file requirements_linux64.txt
```
This might take a few minutes... once it is finished run the following command:
This might take a few minutes... once it is finished, run the following command:
```
pip install shapely
@ -79,7 +79,7 @@ All the packages have now been installed in an environment called `coastsat`.
### 1.2 Activating Google Earth Engine Python API
Go to https://earthengine.google.com and sign up to Google Earth Engine.
Go to https://earthengine.google.com and sign up to Google Earth Engine (GEE).
![gee_capture](https://user-images.githubusercontent.com/7217258/49348457-a9271300-f6f9-11e8-8c0b-407383940e94.jpg)
@ -89,55 +89,52 @@ Once you have created a Google Earth Engine account, go back to Anaconda and lin
earthengine authenticate
```
A web browser will open, login with your GEE credentials, accept the terms and conditions and copy the authorization code into the Anaconda terminal.
A web browser will open. Login with your GEE credentials, read and accept the terms and conditions, and copy the authorization code into the Anaconda terminal.
Now you are ready to start using the CoastSat toolkit!
## 2. Usage
**Note**: remeber to always activate the `coastsat` environment with `conda activate coastsat` each time you wish to use it.
Your terminal command line prompt should start with (coastsat) when it is activated.
**Note**: remember to always activate the `coastsat` environment with `conda activate coastsat` each time you prepare to use it.
Your terminal command line prompt should always start with (coastsat) to confirm that it is activated.
An example of how to run the software in a Jupyter Notebook is provided in the repository (`example_jupyter.ipynb`). To run it, first activate your `coastsat` environment with `conda activate coastsat` (if not already active), and then type:
An example of how to run the software in a Jupyter Notebook is provided in the repository (`example_jupyter.ipynb`). To run this, first activate your `coastsat` environment with `conda activate coastsat` (if not already active), and then type:
```
jupyter notebook
```
A web browser window will open, drive to the directory where you downloaded/cloned this repository and click on `example_jupyter.ipynb`.
The following sections guide the reader through the different functionalities of CoastSat with an example at Narrabeen beach (Australia).
A web browser window will open. Navigate to the directory where you downloaded/cloned this repository and click on `example_jupyter.ipynb`.
The following sections guide the reader through the different functionalities of CoastSat with an example at Narrabeen-Collaroy beach (Australia). If you prefer to use Spyder, PyCharm or another integrated development environment (IDE), a Python script (`main.py`) is also included in the repository.
To run a Jupyter Notebook, put your cursor inside one of the code sections and then hit the 'run' button up in the top menu to run that section and progress forward (as shown in the animation below).
To run a Jupyter Notebook, place your cursor inside one of the code sections and then click on the 'run' button in the top menu to run that section and progress forward (as shown in the animation below).
![example_jupyter](https://user-images.githubusercontent.com/7217258/49705486-8dc88480-fc72-11e8-8300-c342baaf54eb.gif)
### 2.1 Retrieval of the satellite images
To retrieve the satellite images cropped around the the region of interest from Google Earth Engine servers the following user-defined variables are needed:
To retrieve from the GEE server the available satellite images cropped around the required region of coastline for the particular time period of interest, the following user-defined variables are required:
- `polygon`: the coordinates of the region of interest (longitude/latitude pairs)
- `dates`: dates over which the images will be retrieved (e.g., `dates = ['2017-12-01', '2018-01-01']`)
- `sat_list`: satellite missions to consider (e.g., `sat_list = ['L5', 'L7', 'L8', 'S2']` for Landsat 5, 7, 8 and Sentinel-2 collections).
- `sitename`: name of the site (defines the name of the subfolder where the files will be stored)
- `sat_list`: satellite missions to consider (e.g., `sat_list = ['L5', 'L7', 'L8', 'S2']` for Landsat 5, 7, 8 and Sentinel-2 collections)
- `sitename`: name of the site (user-defined name of the subfolder where the images and other accompanying files will be stored)
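Putting these variables together, the `inputs` dictionary passed to `SDS_download.retrieve_images` could look like the sketch below. The polygon coordinates are illustrative (not a real site), and the dictionary key names are assumed to match the variable names listed above:

```python
# illustrative inputs dictionary; coordinates are made up for this sketch
polygon = [[[151.3014, -33.7008],
            [151.3115, -33.7021],
            [151.3072, -33.7398],
            [151.2942, -33.7363],
            [151.3014, -33.7008]]]  # lon/lat pairs, first point repeated to close
dates = ['2017-12-01', '2018-01-01']  # time period of interest
sat_list = ['L5', 'L7', 'L8', 'S2']   # satellite missions to include
sitename = 'NARRA'                    # name of the output subfolder

inputs = {'polygon': polygon, 'dates': dates,
          'sat_list': sat_list, 'sitename': sitename}
```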
The call `metadata = SDS_download.retrieve_images(inputs)` will launch the retrieval of the images and store them as .TIF files (under *.data\sitename*). The metadata contains the exact time of acquisition (UTC) and geometric accuracy of each downloaded image and is saved as `metadata_sitename.pkl`. If the images have already been downloaded previously and the user only wants to run the shoreline detection, the metadata can be loaded directly from this file. The screenshot below shows an example where all the images of Narrabeen-Collaroy (Australia) acquired in December 2017 are retrieved.
The call `metadata = SDS_download.retrieve_images(inputs)` will launch the retrieval of the images and store them as .TIF files (under *.data\sitename*). The metadata contains the exact time of acquisition (UTC) and geometric accuracy of each downloaded image and is saved as `metadata_sitename.pkl`. If the images have already been downloaded previously and the user only wants to run the shoreline detection, the metadata can be loaded directly from this file. The screenshot below shows an example where all the images of Narrabeen-Collaroy (Australia) acquired in December 2017 are retrieved.
![retrieval](https://user-images.githubusercontent.com/7217258/49353105-0037e280-f710-11e8-9454-c03ce6116c54.PNG)
### 2.2 Shoreline detection
It is finally time to map shorelines!
It is now time to map the sandy shorelines!
The following user-defined settings are required:
- `cloud_thresh`: threshold on maximum cloud cover that is acceptable on the images (value between 0 and 1)
- `cloud_thresh`: threshold on the maximum cloud cover that is acceptable on the images (value between 0 and 1; this may require some initial experimentation)
- `output_epsg`: epsg code defining the spatial reference system of the shoreline coordinates
- `check_detection`: if set to `True` allows the user to quality control each shoreline detection
See http://spatialreference.org/ to find the EPSG number corresponding to your local coordinate system. If the user wants to quality control the mapped shorelines and manually validate each detection, the parameter `check_detection` should be set to `True`.
In addition, there are extra parameters (`min_beach_size`, `buffer_size`, `min_length_sl`) that can be tuned to optimise the shoreline detection (for Advanced users only). For the moment leave those parameters to their default values, we will see later how they can be modified.
In addition, there are extra parameters (`min_beach_area`, `buffer_size`, `min_length_sl`) that can be tuned to optimise the shoreline detection (for advanced users only). For the moment, leave these parameters at their default values; how to modify them is explained below.
An example of settings is provided here:
@ -149,24 +146,24 @@ output = SDS_shoreline.extract_shorelines(metadata, settings)
```
When `check_detection` is set to `True`, a figure like the one below appears and asks the user to manually accept/reject each detection by clicking on `keep` or `skip`.
![Alt text](https://github.com/kvos/CoastSat/blob/development/classifiers/doc/batch_detection.gif?raw=true)
![Alt text](https://github.com/kvos/CoastSat/blob/development/classifiers/doc/batch_detection.gif)
Once all the shorelines have been mapped, the output is available in two different formats (saved under *.\data\sitename*):
- `sitename_output.pkl`: contains a list with the shoreline coordinates and the exact timestamp at which the image was captured (UTC time) as well as the geometric accuracy and the cloud cover of the image. The list can be manipulated with Python, a snippet of code to plot the results is provided in the main script.
- `sitename_output.pkl`: contains a list with the shoreline coordinates and the exact timestamp at which each image was captured (UTC time), as well as the geometric accuracy and the cloud cover of each individual image. This list can be manipulated with Python; a snippet of code to plot the results is provided in the main script.
- `sitename_output.kml`: this output can be visualised in a GIS software (e.g., QGIS, ArcGIS).
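As a quick sanity check that the saved `.pkl` output can be read back with Python's `pickle` module, here is a minimal round-trip sketch; the dictionary keys and values below are mocked-up assumptions based on the description above, not real toolkit output:

```python
import os
import pickle
import tempfile
from datetime import datetime

import numpy as np

# mock of the saved output structure (keys assumed from the description above)
output = {'dates': [datetime(2017, 12, 10, 23, 45)],
          'shorelines': [np.array([[342000.0, 6266000.0],
                                   [342010.0, 6266005.0]])]}

# round-trip through a .pkl file, mimicking sitename_output.pkl
with tempfile.TemporaryDirectory() as tmpdir:
    fn = os.path.join(tmpdir, 'sitename_output.pkl')
    with open(fn, 'wb') as f:
        pickle.dump(output, f)
    with open(fn, 'rb') as f:
        loaded = pickle.load(f)

print(len(loaded['shorelines']))  # → 1
```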
The figure below shows how the satellite-derived shorelines can be opened in GIS software using the `.kml` output.
The figure below shows how the satellite-derived shorelines can be opened in GIS software (e.g., QGIS) using the `.kml` output.
![gis_output](https://user-images.githubusercontent.com/7217258/49361401-15bd0480-f730-11e8-88a8-a127f87ca64a.jpeg)
### Advanced shoreline detection parameters
#### Advanced shoreline detection parameters
As mentioned above, there are extra parameters that can be modified to optimise the shoreline detection:
- `min_beach_area`: minimum allowable object area (in metres^2) for the class sand. During the image classification, some building roofs may be incorrectly labelled as sand. To correct this, all the objects classified as sand containing less than a certain number of connected pixels are removed from the sand class. The default value of `min_beach_area` is 4500 m^2, which corresponds to 20 connected pixels of 15 m^2. If you are looking at a very small beach (<20 connected pixels on the images), decrease the value of this parameter.
As mentioned above, there are some additional parameters that can be modified to optimise the shoreline detection:
- `min_beach_area`: minimum allowable object area (in metres^2) for the class 'sand'. During the image classification, some features (for example, building roofs) may be incorrectly labelled as sand. To correct this, all the objects classified as sand containing less than a certain number of connected pixels are removed from the sand class. The default value of `min_beach_area` is 4500 m^2, which corresponds to 20 connected pixels of 15 m by 15 m. If you are looking at a very small beach (<20 connected pixels on the images), try decreasing the value of this parameter.
- `buffer_size`: radius (in metres) that defines the buffer around sandy pixels that is considered for the shoreline detection. The default value of `buffer_size` is 150 m. This parameter should be increased if you have a very wide (>150 m) surf zone or inter-tidal zone.
- `min_length_sl`: minimum length (in metres) of shoreline perimeter to be valid. This allows to discard small contours that are detected but do not correspond to the actual shoreline. The default value is 200 m. If the shoreline that you are trying to map is shorter than 200 m, decrease the value of this parameter.
- `min_length_sl`: minimum length (in metres) of shoreline perimeter to be valid. This can be used to discard small features that are detected but do not correspond to the sand-water shoreline. The default value is 200 m. If the shoreline that you are trying to map is shorter than 200 m, decrease the value of this parameter.
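These defaults can be overridden by setting the corresponding keys in the `settings` dictionary before running the shoreline extraction. A minimal sketch (the values shown are the documented defaults):

```python
# override the advanced detection parameters (values shown are the defaults)
settings = {}  # in practice, reuse the settings dictionary defined earlier
settings['min_beach_area'] = 4500  # minimum sand object area (m^2)
settings['buffer_size'] = 150      # buffer radius around sandy pixels (m)
settings['min_length_sl'] = 200    # minimum shoreline perimeter length (m)
```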
### Reference shoreline
#### Reference shoreline
There is also an option to manually digitize a reference shoreline before running the batch shoreline detection on all the images. This reference shoreline helps to reject outliers and false detections when mapping shorelines, as only the points located within a given distance of this reference shoreline are considered valid.
@ -181,6 +178,27 @@ This function allows the user to click points along the shoreline on one of the
The maximum distance (in metres) allowed from the reference shoreline is defined by the parameter `max_dist_ref`. This parameter is set to a default value of 100 m. If you think that your shoreline will move more than 100 m, please change this parameter to an appropriate distance. This may be the case for large nourishments or eroding/accreting coastlines.
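The role of `max_dist_ref` can be illustrated with a few lines of NumPy. This is a simplified illustration using a straight reference line, not the toolkit's actual implementation: candidate shoreline points further than `max_dist_ref` from the reference line are rejected.

```python
import numpy as np

max_dist_ref = 100  # maximum allowed distance from the reference shoreline (m)

# straight reference shoreline from p1 to p2 (illustrative)
p1 = np.array([0.0, 0.0])
p2 = np.array([1000.0, 0.0])

# candidate shoreline points: two close to the reference, one 250 m away
candidates = np.array([[100.0, 20.0],
                       [500.0, -60.0],
                       [700.0, 250.0]])

# perpendicular distance from each candidate point to the reference line
dx, dy = p2 - p1
d = np.abs(dx * (candidates[:, 1] - p1[1])
           - dy * (candidates[:, 0] - p1[0])) / np.hypot(dx, dy)

# keep only the points within max_dist_ref of the reference shoreline
valid = candidates[d <= max_dist_ref]
print(len(valid))  # → 2: the point 250 m away is rejected
```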
### 2.3 Shoreline change analysis
This section shows how to obtain time-series of shoreline change along shore-normal transects.
The user can draw shore-normal transects by calling:
```
settings['transect_length'] = 500 # defines the length of the transects in metres
transects = SDS_transects.draw_transects(output, settings)
```
Once the shore-normal transects have been defined, the intersection between the 2D shorelines and the transects is computed with the following function:
```
settings['along_dist'] = 25
cross_distance = SDS_transects.compute_intersection(output, transects, settings)
```
The parameter `along_dist` defines the alongshore distance around the transect over which shoreline points are selected to compute the intersection. The default value is 25 m, which means that the intersection is computed as the median of the shoreline points located within 25 m of the transect on either side (a 50 m alongshore window).
An example is illustrated below:
![transects](https://user-images.githubusercontent.com/7217258/49990925-8b985a00-ffd3-11e8-8c54-57e4bf8082dd.gif)
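The alongshore-median computation can be sketched as follows. This is a simplified illustration in a transect-aligned coordinate frame, not the exact implementation in `SDS_transects.compute_intersection`:

```python
import numpy as np

along_dist = 25  # alongshore search distance around the transect (m)

# shoreline points in a transect-aligned frame (made up for this sketch):
# column 0 = alongshore offset from the transect, column 1 = cross-shore chainage
points = np.array([[-30.0, 52.0],   # more than 25 m alongshore: ignored
                   [-10.0, 48.0],
                   [  5.0, 50.0],
                   [ 20.0, 55.0],
                   [ 40.0, 60.0]])  # more than 25 m alongshore: ignored

# keep points within along_dist of the transect and take the median chainage
close = points[np.abs(points[:, 0]) <= along_dist]
chainage = np.median(close[:, 1])
print(chainage)  # → 50.0
```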
## Issues and Contributions
Having a problem or looking to contribute to the code? Please see the [Issues page](https://github.com/kvos/coastsat/issues).

@ -717,7 +717,7 @@ def get_reference_sl_manual(metadata, settings):
pts_pix_interp = np.append(pts_pix_interp,
np.transpose(np.array([xvals,yinterp])), axis=0)
pts_pix_interp = np.delete(pts_pix_interp,0,axis=0)
plt.plot(pts_pix_interp[:,0], pts_pix_interp[:,1], 'r.', markersize=5)
plt.plot(pts_pix_interp[:,0], pts_pix_interp[:,1], 'r.', markersize=3)
plt.title('Saving reference shoreline as ' + sitename + '_reference_shoreline.pkl ...')
plt.draw()
ginput(n=1, timeout=5, show_clicks=True)

@ -616,8 +616,6 @@ def extract_shorelines(metadata, settings):
os.makedirs(filepath_jpg)
except:
print('')
# loop through satellite list
for satname in metadata.keys():
@ -714,6 +712,9 @@ def extract_shorelines(metadata, settings):
'idxkeep': 'indices of the images that were kept to extract a shoreline'
}
# change the format to have one list sorted by date with all the shorelines (easier to use)
output = SDS_tools.merge_output(output)
# save output structure as output.pkl
filepath = os.path.join(os.getcwd(), 'data', sitename)
with open(os.path.join(filepath, sitename + '_output.pkl'), 'wb') as f:
@ -721,17 +722,16 @@ def extract_shorelines(metadata, settings):
# save output as kml for GIS applications
kml = simplekml.Kml()
for satname in metadata.keys():
for i in range(len(output[satname]['shoreline'])):
if len(output[satname]['shoreline'][i]) == 0:
continue
sl = output[satname]['shoreline'][i]
date = output[satname]['timestamp'][i]
newline = kml.newlinestring(name= date.strftime('%Y-%m-%d'))
newline.coords = sl
newline.description = satname + ' shoreline' + '\n' + 'acquired at ' + date.strftime('%H:%M:%S') + ' UTC'
kml.save(os.path.join(filepath, sitename + '_shorelines.kml'))
for i in range(len(output['shorelines'])):
if len(output['shorelines'][i]) == 0:
continue
sl = output['shorelines'][i]
date = output['dates'][i]
newline = kml.newlinestring(name= date.strftime('%Y-%m-%d'))
newline.coords = sl
newline.description = satname + ' shoreline' + '\n' + 'acquired at ' + date.strftime('%H:%M:%S') + ' UTC'
kml.save(os.path.join(filepath, sitename + '_output.kml'))
return output

@ -368,5 +368,40 @@ def mask_raster(fn, mask):
# close dataset and flush cache
raster = None
def merge_output(output):
"""
Function to merge the output dictionary, which has one key per satellite mission, into a
single dictionary containing all the shorelines and dates sorted chronologically.
Arguments:
-----------
output: dict
contains the extracted shorelines and corresponding dates.
Returns:
-----------
output_all_sorted: dict
contains the extracted shorelines sorted by date in a single list
"""
output_all = {'dates':[], 'shorelines':[], 'geoaccuracy':[], 'satname':[], 'image_filename':[]}
for satname in list(output.keys()):
if satname == 'meta':
continue
output_all['dates'] = output_all['dates'] + output[satname]['timestamp']
output_all['shorelines'] = output_all['shorelines'] + output[satname]['shoreline']
output_all['geoaccuracy'] = output_all['geoaccuracy'] + output[satname]['geoaccuracy']
output_all['satname'] = output_all['satname'] + [_ for _ in np.tile(satname,
len(output[satname]['timestamp']))]
output_all['image_filename'] = output_all['image_filename'] + output[satname]['filename']
# sort chronologically
output_all_sorted = {'dates':[], 'shorelines':[], 'geoaccuracy':[], 'satname':[], 'image_filename':[]}
idx_sorted = sorted(range(len(output_all['dates'])), key=output_all['dates'].__getitem__)
output_all_sorted['dates'] = [output_all['dates'][i] for i in idx_sorted]
output_all_sorted['shorelines'] = [output_all['shorelines'][i] for i in idx_sorted]
output_all_sorted['geoaccuracy'] = [output_all['geoaccuracy'][i] for i in idx_sorted]
output_all_sorted['satname'] = [output_all['satname'][i] for i in idx_sorted]
output_all_sorted['image_filename'] = [output_all['image_filename'][i] for i in idx_sorted]
return output_all_sorted

@ -0,0 +1,232 @@
"""This module contains functions to analyze the shoreline data along transects.
Author: Kilian Vos, Water Research Laboratory, University of New South Wales
"""
# load modules
import os
import numpy as np
import matplotlib.pyplot as plt
import pdb
# other modules
import skimage.transform as transform
from pylab import ginput
import pickle
import simplekml
def find_indices(lst, condition):
"imitation of MATLAB find function"
return [i for i, elem in enumerate(lst) if condition(elem)]
def create_transect(origin, orientation, length):
"""
Create a 2D transect of points with 1m interval.
Arguments:
-----------
origin: np.array
contains the X and Y coordinates of the origin of the transect
orientation: int
angle of the transect (anti-clockwise from North) in degrees
length: int
length of the transect in metres
Returns:
-----------
transect: np.array
contains the X and Y coordinates of the transect
"""
x0 = origin[0]
y0 = origin[1]
# orientation of the transect
phi = (90 - orientation)*np.pi/180
# create a vector with points at 1 m intervals
x = np.linspace(0,length,length+1)
y = np.zeros(len(x))
coords = np.zeros((len(x),2))
coords[:,0] = x
coords[:,1] = y
# translate and rotate the vector using the origin and orientation
tf = transform.EuclideanTransform(rotation=phi, translation=(x0,y0))
transect = tf(coords)
return transect
def draw_transects(output, settings):
"""
Allows the user to draw shore-normal transects over the mapped shorelines.
Arguments:
-----------
output: dict
contains the extracted shorelines and corresponding dates.
settings: dict
contains parameters defining :
transect_length: length of the transect in metres
Returns:
-----------
transects: dict
contains the X and Y coordinates of all the transects drawn. These are also saved
as a .pkl and .kml (+ a .jpg figure showing the location of the transects)
"""
sitename = settings['inputs']['sitename']
length = settings['transect_length']
filepath = os.path.join(os.getcwd(), 'data', sitename)
# plot all shorelines
fig1 = plt.figure()
ax1 = fig1.add_subplot(111)
ax1.axis('equal')
ax1.set_xlabel('Eastings [m]')
ax1.set_ylabel('Northings [m]')
ax1.grid(linestyle=':', color='0.5')
for i in range(len(output['shorelines'])):
sl = output['shorelines'][i]
date = output['dates'][i]
ax1.plot(sl[:, 0], sl[:, 1], '.', markersize=3, label=date.strftime('%d-%m-%Y'))
# ax1.legend()
fig1.set_tight_layout(True)
mng = plt.get_current_fig_manager()
mng.window.showMaximized()
ax1.set_title('Click two points to define each transect (first point is the origin of the transect).\n'+
'When all transects have been defined, click on <ENTER>', fontsize=16)
# initialise variable
transects = dict([])
counter = 0
# loop until user breaks it by click <enter>
while 1:
try:
pts = ginput(n=2, timeout=1e9)
origin = pts[0]
except:
fig1.gca().set_title('Transect locations', fontsize=16)
fig1.savefig(os.path.join(filepath, sitename + '_transects.jpg'), dpi=200)
break
counter = counter + 1
# create the transect using the origin, orientation and length
temp = np.array(pts[1]) - np.array(origin)
phi = np.arctan2(temp[1], temp[0])
orientation = -(phi*180/np.pi - 90)
transect = create_transect(origin, orientation, length)
transects[str(counter)] = transect
# plot the transects on the figure
ax1.plot(transect[:,0], transect[:,1], 'b.', markersize=4)
ax1.plot(transect[0,0], transect[0,1], 'rx', markersize=10)
ax1.text(transect[-1,0], transect[-1,1], str(counter), size=16,
bbox=dict(boxstyle="square", ec='k',fc='w'))
plt.draw()
# save as transects.pkl
with open(os.path.join(filepath, sitename + '_transects.pkl'), 'wb') as f:
pickle.dump(transects, f)
# save as transects.kml (for GIS)
kml = simplekml.Kml()
for key in transects.keys():
newline = kml.newlinestring(name=key)
newline.coords = transects[key]
newline.description = 'user-defined cross-shore transect'
kml.save(os.path.join(filepath, sitename + '_transects.kml'))
return transects
def compute_intersection(output, transects, settings):
"""
Computes the intersection between the 2D mapped shorelines and the transects, to generate
time-series of cross-shore distance along each transect.
Arguments:
-----------
output: dict
contains the extracted shorelines and corresponding dates.
settings: dict
contains parameters defining :
along_dist: alongshore distance to calculate the intersection (median of points
within this distance).
Returns:
-----------
cross_dist: dict
time-series of cross-shore distance along each of the transects. These are not tidally
corrected.
"""
shorelines = output['shorelines']
along_dist = settings['along_dist']
# initialise variables
chainage_mtx = np.zeros((len(shorelines),len(transects),6))
idx_points = []
for i in range(len(shorelines)):
sl = shorelines[i]
idx_points_all = []
for j,key in enumerate(list(transects.keys())):
# compute rotation matrix
X0 = transects[key][0,0]
Y0 = transects[key][0,1]
temp = np.array(transects[key][-1,:]) - np.array(transects[key][0,:])
phi = np.arctan2(temp[1], temp[0])
Mrot = np.array([[np.cos(phi), np.sin(phi)],[-np.sin(phi), np.cos(phi)]])
# calculate point to line distance between shoreline points and the transect
p1 = np.array([X0,Y0])
p2 = transects[key][-1,:]
d_line = np.abs(np.cross(p2-p1,sl-p1)/np.linalg.norm(p2-p1))
# calculate the distance between shoreline points and the origin of the transect
d_origin = np.array([np.linalg.norm(sl[k,:] - p1) for k in range(len(sl))])
# find the shoreline points that are close to the transects and to the origin
# the distance to the origin is hard-coded here to 1 km
logic_close = np.logical_and(d_line <= along_dist, d_origin <= 1000)
idx_close = find_indices(logic_close, lambda e: e == True)
idx_points_all.append(idx_close)
# in case there are no shoreline points close to the transect
if not idx_close:
chainage_mtx[i,j,:] = np.tile(np.nan,(1,6))
else:
# change of base to shore-normal coordinate system
xy_close = np.array([sl[idx_close,0],sl[idx_close,1]]) - np.tile(np.array([[X0],
[Y0]]), (1,len(sl[idx_close])))
xy_rot = np.matmul(Mrot, xy_close)
# compute mean, median, max, min and std of chainage position
n_points = len(xy_rot[0,:])
mean_cross = np.nanmean(xy_rot[0,:])
median_cross = np.nanmedian(xy_rot[0,:])
max_cross = np.nanmax(xy_rot[0,:])
min_cross = np.nanmin(xy_rot[0,:])
std_cross = np.nanstd(xy_rot[0,:])
# store all statistics
chainage_mtx[i,j,:] = np.array([mean_cross, median_cross, max_cross,
min_cross, n_points, std_cross])
# store the indices of the shoreline points that were used
idx_points.append(idx_points_all)
# format into dictionary
chainage = dict([])
chainage['mean'] = chainage_mtx[:,:,0]
chainage['median'] = chainage_mtx[:,:,1]
chainage['max'] = chainage_mtx[:,:,2]
chainage['min'] = chainage_mtx[:,:,3]
chainage['npoints'] = chainage_mtx[:,:,4]
chainage['std'] = chainage_mtx[:,:,5]
chainage['idx_points'] = idx_points
# only return the median
cross_dist = dict([])
for j,key in enumerate(list(transects.keys())):
cross_dist[key] = chainage['median'][:,j]
return cross_dist

@ -19,7 +19,7 @@
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
@ -28,7 +28,7 @@
"import warnings\n",
"warnings.filterwarnings(\"ignore\")\n",
"import matplotlib.pyplot as plt\n",
"import SDS_download, SDS_preprocess, SDS_shoreline, SDS_tools"
"import SDS_download, SDS_preprocess, SDS_shoreline, SDS_tools, SDS_transects"
]
},
{
@ -42,7 +42,7 @@
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
@ -87,7 +87,7 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
@ -107,7 +107,7 @@
},
{
"cell_type": "code",
"execution_count": 4,
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
@ -155,17 +155,9 @@
},
{
"cell_type": "code",
"execution_count": 12,
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Reference shoreline already exists and was loaded\n"
]
}
],
"outputs": [],
"source": [
"%matplotlib qt\n",
"settings['reference_shoreline'] = SDS_preprocess.get_reference_sl_manual(metadata, settings)\n",
@ -182,19 +174,11 @@
},
{
"cell_type": "code",
"execution_count": 13,
"execution_count": null,
"metadata": {
"scrolled": true
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n"
]
}
],
"outputs": [],
"source": [
"%matplotlib qt\n",
"output = SDS_shoreline.extract_shorelines(metadata, settings)"
@ -204,40 +188,124 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## 4. Plot the shorelines\n",
"\n",
"Simple plot of the mapped shorelines"
]
},
{
"cell_type": "code",
"execution_count": 16,
"execution_count": null,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<matplotlib.legend.Legend at 0x23a4f91b6a0>"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"outputs": [],
"source": [
"plt.figure()\n",
"fig = plt.figure()\n",
"plt.axis('equal')\n",
"plt.xlabel('Eastings')\n",
"plt.ylabel('Northings')\n",
"for satname in output.keys():\n",
" if satname == 'meta':\n",
" continue\n",
" for i in range(len(output[satname]['shoreline'])):\n",
" sl = output[satname]['shoreline'][i]\n",
" date = output[satname]['timestamp'][i]\n",
" plt.plot(sl[:,0], sl[:,1], '.', label=date.strftime('%d-%m-%Y'))\n",
"plt.legend()"
"plt.grid(linestyle=':', color='0.5')\n",
"for i in range(len(output['shorelines'])):\n",
" sl = output['shorelines'][i]\n",
" date = output['dates'][i]\n",
" plt.plot(sl[:,0], sl[:,1], '.', label=date.strftime('%d-%m-%Y'))\n",
"plt.legend()\n",
"mng = plt.get_current_fig_manager() \n",
"mng.window.showMaximized() \n",
"fig.set_size_inches([15.76, 8.52])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 4. Shoreline analysis\n",
"\n",
"Shows how to plot the mapped shorelines and draw shore-normal transects and compute time-series of cross-shore distance along the transects."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**If you have already mapped the shorelines**, just load the output file by only running the section below"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"filepath = os.path.join(os.getcwd(), 'data', sitename)\n",
"with open(os.path.join(filepath, sitename + '_output' + '.pkl'), 'rb') as f:\n",
" output = pickle.load(f) "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Create shore-normal transects along the beach"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%matplotlib qt\n",
"settings['transect_length'] = 500 # defines the length of the transects in metres\n",
"transects = SDS_transects.draw_transects(output, settings)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Intersect the transects with the 2D shorelines to obtain time-series of cross-shore distance"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# defines the along-shore distance over which to consider shoreline points to compute the median intersection (robust to outliers)\n",
"settings['along_dist'] = 25 \n",
"cross_distance = SDS_transects.compute_intersection(output, transects, settings) "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Plot the time-series of shoreline change along each transect"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from matplotlib import gridspec\n",
"import numpy as np\n",
"fig = plt.figure()\n",
"gs = gridspec.GridSpec(len(cross_distance),1)\n",
"gs.update(left=0.05, right=0.95, bottom=0.05, top=0.95, hspace=0.05)\n",
"for i,key in enumerate(cross_distance.keys()):\n",
" ax = fig.add_subplot(gs[i,0])\n",
" ax.grid(linestyle=':', color='0.5')\n",
" ax.set_ylim([-50,50])\n",
" if not i == len(cross_distance.keys())-1:\n",
" ax.set_xticks([])\n",
" ax.plot(output['dates'], cross_distance[key]- np.nanmedian(cross_distance[key]), '-^', markersize=6)\n",
" ax.set_ylabel('distance [m]', fontsize=12)\n",
" ax.text(0.5,0.95,'Transect ' + key, bbox=dict(boxstyle=\"square\", ec='k',fc='w'), ha='center',\n",
" va='top', transform=ax.transAxes, fontsize=14)\n",
"mng = plt.get_current_fig_manager() \n",
"mng.window.showMaximized() \n",
"fig.set_size_inches([15.76, 8.52])"
]
}
],

@@ -2,13 +2,17 @@
# Shoreline extraction from satellite images
#==========================================================#
# Kilian Vos WRL 2018
#%% 1. Initial settings
# load modules
import os
import pickle
import warnings
warnings.filterwarnings("ignore")
import matplotlib.pyplot as plt
import SDS_download, SDS_preprocess, SDS_shoreline, SDS_tools
import SDS_download, SDS_preprocess, SDS_shoreline, SDS_tools, SDS_transects
# region of interest (longitude, latitude), can also be loaded from a .kml polygon
polygon = SDS_tools.coords_from_kml('NARRA.kml')
@@ -19,10 +23,10 @@ polygon = SDS_tools.coords_from_kml('NARRA.kml')
# [151.301454, -33.700754]]]
# date range
dates = ['2017-12-01', '2018-01-01']
dates = ['2017-12-01', '2018-06-01']
# satellite missions
sat_list = ['S2']
sat_list = ['L8','S2']
# name of the site
sitename = 'NARRA'
@@ -35,59 +39,92 @@ inputs = {
'sitename': sitename
}
#%% 2. Retrieve images
# retrieve satellite images from GEE
metadata = SDS_download.retrieve_images(inputs)
#metadata = SDS_download.retrieve_images(inputs)
# if you have already downloaded the images, just load the metadata file
filepath = os.path.join(os.getcwd(), 'data', sitename)
with open(os.path.join(filepath, sitename + '_metadata' + '.pkl'), 'rb') as f:
    metadata = pickle.load(f)
#%%
    metadata = pickle.load(f)
#%% 3. Batch shoreline detection
# settings for the shoreline extraction
settings = {
settings = {
# general parameters:
'cloud_thresh': 0.2, # threshold on maximum cloud cover
'output_epsg': 28356, # epsg code of spatial reference system desired for the output
'output_epsg': 28356, # epsg code of spatial reference system desired for the output
# quality control:
'check_detection': True, # if True, shows each shoreline detection to the user for validation
# add the inputs defined previously
'inputs': inputs,
# [ONLY FOR ADVANCED USERS] shoreline detection parameters:
'min_beach_area': 4500, # minimum area (in metres^2) for an object to be labelled as a beach
'buffer_size': 150, # radius (in pixels) of disk for buffer around sandy pixels
'min_length_sl': 200, # minimum length of shoreline perimeter to be kept
# quality control:
'check_detection': True, # if True, shows each shoreline detection and lets the user
# decide which ones are correct and which ones are false due to
# the presence of clouds
# add the inputs
'inputs': inputs
'buffer_size': 150, # radius (in metres) of the buffer around sandy pixels considered in the shoreline detection
'min_length_sl': 200, # minimum length (in metres) of shoreline perimeter to be valid
}
# [OPTIONAL] preprocess images (cloud masking, pansharpening/down-sampling)
SDS_preprocess.save_jpg(metadata, settings)
#SDS_preprocess.save_jpg(metadata, settings)
# [OPTIONAL] create a reference shoreline (helps to identify outliers and false detections)
settings['reference_shoreline'] = SDS_preprocess.get_reference_sl_manual(metadata, settings)
# set the max distance (in meters) allowed from the reference shoreline for a detected shoreline to be valid
settings['max_dist_ref'] = 100
# extract shorelines from all images (also saves output.pkl and output.kml)
# extract shorelines from all images (also saves output.pkl and shorelines.kml)
output = SDS_shoreline.extract_shorelines(metadata, settings)
#%%
# basic figure plotting the mapped shorelines
plt.figure()
# plot the mapped shorelines
fig = plt.figure()
plt.axis('equal')
plt.xlabel('Eastings [m]')
plt.ylabel('Northings [m]')
for satname in output.keys():
    if satname == 'meta':
        continue
    for i in range(len(output[satname]['shoreline'])):
        sl = output[satname]['shoreline'][i]
        date = output[satname]['timestamp'][i]
        plt.plot(sl[:, 0], sl[:, 1], '.', label=date.strftime('%d-%m-%Y'))
plt.xlabel('Eastings')
plt.ylabel('Northings')
plt.grid(linestyle=':', color='0.5')
for i in range(len(output['shorelines'])):
    sl = output['shorelines'][i]
    date = output['dates'][i]
    plt.plot(sl[:,0], sl[:,1], '.', label=date.strftime('%d-%m-%Y'))
plt.legend()
mng = plt.get_current_fig_manager()
mng.window.showMaximized()
fig.set_size_inches([15.76, 8.52])
#%% 4. Shoreline analysis
# if you have already mapped the shorelines, just load them
filepath = os.path.join(os.getcwd(), 'data', sitename)
with open(os.path.join(filepath, sitename + '_output' + '.pkl'), 'rb') as f:
    output = pickle.load(f)
# create shore-normal transects along the beach
settings['transect_length'] = 500
transects = SDS_transects.draw_transects(output, settings)
# intersect the transects with the 2D shorelines to obtain time-series of cross-shore distance
settings['along_dist'] = 25
cross_distance = SDS_transects.compute_intersection(output, transects, settings)
# plot the time-series
from matplotlib import gridspec
import numpy as np
fig = plt.figure()
gs = gridspec.GridSpec(len(cross_distance),1)
gs.update(left=0.05, right=0.95, bottom=0.05, top=0.95, hspace=0.05)
for i,key in enumerate(cross_distance.keys()):
    ax = fig.add_subplot(gs[i,0])
    ax.grid(linestyle=':', color='0.5')
    ax.set_ylim([-50,50])
    if not i == len(cross_distance.keys())-1:
        ax.set_xticks([])
    ax.plot(output['dates'], cross_distance[key] - np.nanmedian(cross_distance[key]), '-^', markersize=6)
    ax.set_ylabel('distance [m]', fontsize=12)
    ax.text(0.5, 0.95, 'Transect ' + key, bbox=dict(boxstyle="square", ec='k', fc='w'), ha='center',
            va='top', transform=ax.transAxes, fontsize=14)
mng = plt.get_current_fig_manager()
mng.window.showMaximized()
fig.set_size_inches([15.76, 8.52])
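The script ends with a per-transect plot of the time-series. For further analysis it is often convenient to tabulate the same data; a minimal sketch follows (assuming pandas is available; `to_dataframe` and the CSV filename are hypothetical helpers, and `output['dates']` is assumed aligned with the arrays in `cross_distance`, as in the script above):

```python
import numpy as np
import pandas as pd

def to_dataframe(dates, cross_distance):
    # one column per transect, indexed by image date
    df = pd.DataFrame({key: np.asarray(val) for key, val in cross_distance.items()},
                      index=pd.to_datetime(dates))
    df.index.name = 'date'
    return df

# df = to_dataframe(output['dates'], cross_distance)
# df.to_csv('transect_time_series.csv')
```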