# 2016 Narrabeen Storm EWS Performance

This repository investigates whether the storm impacts (i.e. Sallenger, 2000) of the June 2016 Narrabeen Storm could have been forecasted in advance.

## Repository and analysis format

This repository follows the [Cookiecutter Data Science](https://drivendata.github.io/cookiecutter-data-science/) structure where possible. The analysis is done in Python (see the `/src/` folder), with some interactive, exploratory notebooks located at `/notebooks`.

Development is conducted using a [gitflow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow) approach: mainly, the `master` branch stores the official release history and the `develop` branch serves as an integration branch for features. Other `hotfix` and `feature` branches should be created and merged as necessary.

## Where to start?

Check the `.env` file, and note that this project uses `pipenv` to manage its Python environment.

1. Clone this repository.
2. Pull data from the WRL coastal J drive with `make pull-data`.
3. Check out the Jupyter notebook `./notebooks/01_exploration.ipynb`, which has an example of how to import the data and some interactive widgets (a minimal import sketch is shown below).
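
As a rough illustration of step 3, the sketch below loads a MATLAB-format profile file and plots a single cross-section. The file name, location and variable names are assumptions (the TODO list mentions a `profiles.mat`); the notebook `01_exploration.ipynb` remains the authoritative example.

```python
# Minimal sketch only; see ./notebooks/01_exploration.ipynb for the real example.
# The file name and variable names below are hypothetical.
from pathlib import Path

import matplotlib.pyplot as plt
import scipy.io

DATA_DIR = Path("data/raw/processed_shorelines")  # assumed data location

mat = scipy.io.loadmat(str(DATA_DIR / "profiles.mat"))  # hypothetical file name
print(mat.keys())  # inspect which variables the file actually contains

# If the file exposed cross-shore chainage and elevation arrays for one
# profile, a cross-section could be plotted like this:
x = mat["x"].squeeze()  # hypothetical: cross-shore chainage (m)
z = mat["z"].squeeze()  # hypothetical: elevation (m)
plt.plot(x, z)
plt.xlabel("Cross-shore chainage (m)")
plt.ylabel("Elevation (m)")
plt.title("Example pre-storm profile")
plt.show()
```
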
## Requirements

The following requirements are needed to run various parts of the analysis:

- [Python 3.6+](https://conda.io/docs/user-guide/install/windows.html): Used for processing and analysing data. Jupyter notebooks are used for exploratory analysis and communication.
- [QGIS](https://www.qgis.org/en/site/forusers/download): Used for looking at raw LIDAR pre/post storm surveys and extracting dune crests/toes.
- [rclone](https://rclone.org/downloads/): Data is not tracked by this repository, but is backed up to a remote Chris Leaman working directory located on the WRL coastal drive. rclone is used to sync local and remote copies. Ensure `rclone.exe` is located on your `PATH`.
- [gnuMake](http://gnuwin32.sourceforge.net/packages/make.htm): A list of commands for processing data is provided in the `./Makefile`. Use gnuMake to launch these commands. Ensure `make.exe` is located on your `PATH`.

## Available data

Raw, interim and processed data used in this analysis is kept in the `/data/` folder. Data is not tracked in the repository due to size constraints, but is stored locally. A mirror is kept on the WRL coastal J drive, which you can push to and pull from using rclone. To get the data, run `make pull-data`.

List of data:

- `/data/raw/processed_shorelines`: This data was received from Tom Beuzen in October 2018. It consists of pre/post storm profiles at 100 m sections along beaches ranging from Dee Why to Nambucca. Profiles are based on raw aerial LIDAR and were processed by Mitch Harley. Tides and waves (10 m contour and reverse shoaled deepwater) for each individual 100 m section are also provided.
- `/data/raw/raw_lidar`: This is the raw pre/post storm aerial LIDAR which was taken for the June 2016 storm. `.las` files are the raw files, which have been processed into `.tiff` files using `PDAL`. Note that these files have not been corrected for systematic errors, so actual elevations should be taken from the `processed_shorelines` folder. Obtained in November 2018 from Mitch Harley, from the black external HDD labeled "UNSW LIDAR".
- `/data/raw/profile_features`: Dune toe and crest locations based on pre-storm LIDAR. Refer to `/notebooks/qgis.qgz`, which shows how they were manually extracted. Note that the shapefiles only show the location (lat/lon) of the dune crest and toe; for actual elevations, these locations need to be related to the processed shorelines (a hedged sketch of this lookup is shown after this list).
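
Since the crest/toe shapefiles store locations only, one rough way to recover elevations is to project each point onto its profile line and interpolate from the profile's elevations. The sketch below illustrates the idea with shapely and numpy; the coordinate values, array names and the use of shapely are assumptions, not the repository's actual method.

```python
# Minimal sketch, not the repository's actual code.
# Assumes a profile is available as plan-view coordinates plus elevations, and
# that the dune crest point is in the same projected coordinate system
# (all names and numbers below are hypothetical).
import numpy as np
from shapely.geometry import LineString, Point

# Hypothetical profile: eastings/northings (m) and elevation (m) at each vertex.
profile_xy = np.array([[342100.0, 6265400.0],
                       [342150.0, 6265410.0],
                       [342200.0, 6265420.0]])
profile_z = np.array([8.2, 4.5, 0.3])

# Hypothetical dune crest location taken from the profile_features shapefile.
crest_point = Point(342130.0, 6265406.0)

line = LineString(profile_xy)

# Chainage of each profile vertex along the line, and of the crest point
# projected onto the line.
vertex_chainage = np.array([line.project(Point(*xy)) for xy in profile_xy])
crest_chainage = line.project(crest_point)

# Linear interpolation of elevation at the crest's chainage.
crest_elevation = np.interp(crest_chainage, vertex_chainage, profile_z)
print(f"Dune crest elevation: {crest_elevation:.2f} m")
```
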
## Notebooks

- `/notebooks/01_exploration.ipynb`: Shows how to import processed shorelines, waves and tides. An interactive widget plots the location and cross sections.
- `/notebooks/qgis.qgz`: A QGIS file which is used to explore the aerial LIDAR data in `/data/raw/raw_lidar`. By examining the pre-storm LIDAR, dune crest and dune toe lines are manually extracted. These are stored in `/data/profile_features/`.
## TODO

- [ ] Set up a pre-commit hook for automatic code formatting using [black](https://ljvmiranda921.github.io/notebook/2018/06/21/precommits-using-black-and-flake8/).
- [ ] Raw tide water levels are interpolated based on location from tide gauges. This probably isn't the most accurate method, but it should have a small effect since the surge elevation was low during this event. Need to assess the effect of this method.
- [ ] Estimate max TWL from the elevation where the pre-storm and post-storm profiles are the same.
- [ ] Mitch updated the raw profiles.mat to include more information about the survey time. Our data scripts should be updated to parse this new information and include it in our dataframes.
- [ ] Implement a [Bayesian change detection algorithm](https://github.com/hildensia/bayesian_changepoint_detection) to help detect dune crests and toes from profiles.
- [ ] Implement dune impact calculations (i.e. the Sallenger, 2000 regimes; a hedged sketch is given below).
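
For the dune impact item, the intent (per Sallenger, 2000, cited in the introduction) is to compare total water levels against the dune toe and crest elevations to assign an impact regime. The sketch below is a minimal illustration of that classification, assuming the runup levels and dune elevations are already available; the thresholds follow the standard Sallenger regimes, and all names are placeholders rather than the repository's API.

```python
# Minimal sketch of the Sallenger (2000) storm impact regimes, assuming
# runup levels and dune elevations (m, same datum) are already computed.
# Not the repository's implementation; names are placeholders.

def storm_impact_regime(r_high: float, r_low: float,
                        d_low: float, d_high: float) -> str:
    """Classify the storm impact regime for one profile.

    r_high: elevation of the runup limit (e.g. tide + surge + R2%)
    r_low:  elevation below which the beach is continuously subaqueous
    d_low:  dune toe elevation
    d_high: dune crest elevation
    """
    if r_low > d_high:
        return "inundation"   # dune continuously submerged
    if r_high > d_high:
        return "overwash"     # runup overtops the dune crest
    if r_high > d_low:
        return "collision"    # runup reaches the dune face
    return "swash"            # runup confined to the beach face

# Example usage with made-up numbers:
print(storm_impact_regime(r_high=3.2, r_low=1.1, d_low=2.5, d_high=6.0))
# -> "collision"
```
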