# 2016 Narrabeen Storm EWS Performance

This repository investigates whether the storm impacts (i.e. Sallenger, 2000) of the June 2016 Narrabeen Storm could have been forecasted in advance.

## Repository and analysis format

This repository follows the [Cookiecutter Data Science](https://drivendata.github.io/cookiecutter-data-science/) structure where possible. The analysis is done in Python (see the `/src/` folder), with some interactive, exploratory notebooks located at `/notebooks`.

Development is conducted using a [gitflow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow) approach. The `master` branch stores the official release history and the `develop` branch serves as an integration branch for features. Other `hotfix` and `feature` branches should be created and merged as necessary.

## How to start?

#### Getting software requirements

The following requirements are needed to run various bits:

- [Anaconda](https://www.anaconda.com/download/): Used for processing and analysing data. The Anaconda distribution is used for managing environments and is available for Windows, Mac and Linux. Jupyter notebooks are used for exploratory analysis and communication.
- [QGIS](https://www.qgis.org/en/site/forusers/download): Used for looking at raw LIDAR pre/post storm surveys and extracting dune crests/toes.
- [rclone](https://rclone.org/downloads/): Data is not tracked by this repository, but is backed up to a remote Chris Leaman working directory located on the WRL coastal drive. Rclone is used to sync local and remote copies. Ensure `rclone.exe` is located on your `PATH` environment variable.
- [gnuMake](http://gnuwin32.sourceforge.net/packages/make.htm): A list of commands for processing data is provided in the `./Makefile`. Use gnuMake to launch these commands. Ensure `make.exe` is located on your `PATH` environment variable.
- git

#### Getting the repository

Clone the repository:

```
git clone http://git.wrl.unsw.edu.au:3000/chrisl/nsw-2016-storm-impact.git
cd nsw-2016-storm-impact
```

#### Getting the python environment set up

Commands for setting up the python environment are provided in the `Makefile`. Simply run the following commands in the repo root directory:

```
make venv-init
make venv-activate
make venv-requirements-install
```

You can see what these commands are actually running by inspecting the `Makefile`.

#### Pull data

Run `make pull-data` to sync data from the remote working directory using rclone (see the [Available data](#available-data) section below).

#### View notebooks

Interactive, exploratory notebooks are located in the `/notebooks` folder (see the [Notebooks](#notebooks) section below).

## Available data

Raw, interim and processed data used in this analysis are kept in the `/data/` folder. Data is not tracked in the repository due to size constraints, but stored locally. A mirror of the coastal folder on the J drive is kept, which you can push to and pull from using rclone. In order to get the data, run `make pull-data`.

List of data:

- `/data/raw/processed_shorelines`: This data was received from Tom Beuzen in October 2018. It consists of pre/post-storm profiles at every 100 m section along beaches ranging from Dee Why to Nambucca. Profiles are based on raw aerial LIDAR and were processed by Mitch Harley. Tides and waves (10 m contour and reverse shoaled deepwater) for each individual 100 m section are also provided.
- `/data/raw/raw_lidar`: This is the raw pre/post storm aerial LIDAR which was taken for the June 2016 storm. `.las` files are the raw files which have been processed into `.tiff` files using `PDAL` (a sketch of such a pipeline is shown after this list). Note that these files have not been corrected for systematic errors, so actual elevations should be taken from the `processed_shorelines` folder. Obtained November 2018 from Mitch Harley from the black external HDD labeled "UNSW LIDAR".
- `/data/raw/profile_features`: Dune toe and crest locations based on pre-storm LIDAR. Refer to `/notebooks/qgis.qgz`, which shows how they were manually extracted. Note that the shapefiles only show the location (lat/lon) of the dune crest and toe. For actual elevations, these locations need to be related to the processed shorelines.
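The exact PDAL pipeline used for the `.las` to `.tiff` conversion is not tracked in this repository; a minimal sketch using PDAL's Python bindings might look like the following (the filenames, `idw` gridding method and 1 m resolution are illustrative assumptions):

```python
import json

import pdal  # PDAL's Python bindings

# Hypothetical tile name -- the actual .las files live in /data/raw/raw_lidar.
pipeline_def = {
    "pipeline": [
        "data/raw/raw_lidar/example_tile.las",  # reader inferred from extension
        {
            "type": "writers.gdal",
            "filename": "data/raw/raw_lidar/example_tile.tiff",
            "output_type": "idw",  # grid elevations by inverse distance weighting
            "resolution": 1.0,     # cell size in CRS units (assumed metres)
        },
    ]
}

pipeline = pdal.Pipeline(json.dumps(pipeline_def))
pipeline.execute()
```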
## Notebooks

- `/notebooks/01_exploration.ipynb`: Shows how to import processed shorelines, waves and tides. An interactive widget plots the location and cross sections.
- `/notebooks/qgis.qgz`: A QGIS file which is used to explore the aerial LIDAR data in `/data/raw/raw_lidar`. By examining the pre-storm LIDAR, dune crest and dune toe lines are manually extracted. These are stored in `/data/profile_features/`.

## TODO

- [ ] Mitch updated the raw profiles.mat to include more information about the survey time. Our data scripts should be updated to parse this new information and include it in our dataframes.
- [ ] Set up a pre-commit hook for automatic code formatting using [black](https://ljvmiranda921.github.io/notebook/2018/06/21/precommits-using-black-and-flake8/). Low priority as black can be run using the command `make format`.
- [ ] Raw tide WLs are interpolated based on location from tide gauges. This probably isn't the most accurate method, but should have a small effect since surge elevation was low during this event. Need to assess the effect of this method.
- [ ] Estimate max TWL from the elevation where pre-storm and post-storm profiles are the same. Need to think more about this as runup impacting the dune toe will move the dune face back, incorrectly raising the observed TWL. Perhaps this estimation of max TWL is only useful for the swash regime.
- [ ] Implement a [Bayesian changepoint detection algorithm](https://github.com/hildensia/bayesian_changepoint_detection) to help detect dune crests and toes from profiles. Probably low priority at the moment since we are doing manual detection.
- [ ] Implement dune impact calculations as per Palmsten & Holman. Calculation should be done in a new dataframe.
- [ ] Implement `data/interim/*.csv` file checking using py.test (see the sketch below). Check for correct columns, number of NaNs etc. Testing of code is probably a lower priority than just checking the interim data files at the moment.
- [ ] Investigate using [modin](https://github.com/modin-project/modin) to help speed up analysis.
- [ ] Need to think about how relative imports are handled, see [here](https://chrisyeh96.github.io/2017/08/08/definitive-guide-python-imports.html). Maybe the click CLI interface should be moved to the `./src/` folder and it can import all the other packages?
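For the py.test item above, a minimal sketch of such a data check could look like this (the `data/interim` glob and the specific assertions are assumptions; the real checks would encode the expected columns of each interim file):

```python
# tests/test_interim_data.py -- hypothetical starting point, not an agreed schema.
from pathlib import Path

import pandas as pd
import pytest

INTERIM_CSVS = sorted(Path("data/interim").glob("*.csv"))


@pytest.mark.parametrize("csv_path", INTERIM_CSVS, ids=lambda p: p.name)
def test_csv_is_readable_and_non_empty(csv_path):
    df = pd.read_csv(csv_path)
    assert not df.empty


@pytest.mark.parametrize("csv_path", INTERIM_CSVS, ids=lambda p: p.name)
def test_no_all_nan_columns(csv_path):
    # Catch columns that were silently dropped or misparsed upstream.
    df = pd.read_csv(csv_path)
    all_nan = [col for col in df.columns if df[col].isna().all()]
    assert not all_nan, f"columns entirely NaN: {all_nan}"
```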