Compare commits
No commits in common. 'develop' and 'master' have entirely different histories.
@ -1,18 +0,0 @@
# Environment variables go here; these will be automatically read when using "pipenv run python 'file.py'"

# Location where data is backed up to. Should be able to copy a set of the data from here.
DATA_BACKUP_DIR="J:/Coastal/Temp/CKL/nsw_2016_storm_impact/data"

# Location of the matlab interpreter. Required for a couple of data processing scripts.
MATLAB_PATH="C:/Program Files/MATLAB/R2016b/bin/win64/MATLAB.exe"

# Number of threads to use for multi-core processing. Used when calculating time-varying beach slope when estimating
# total water level.
MULTIPROCESS_THREADS=2

# The settings below should be left as is unless you know what you're doing.

# Need to set PYTHONPATH so that relative imports can be properly used with pipenv.
# Refer to https://stackoverflow.com/q/52986500 and https://stackoverflow.com/a/49797761
# PYTHONPATH=${PWD}
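When running under `pipenv run`, the variables above land in the process environment; a minimal sketch of consuming them from Python (the fallback values here are assumptions for illustration, not project defaults):

```python
import os

# MULTIPROCESS_THREADS sizes the worker pool for the time-varying
# beach-slope calculation; fall back to 2 if the variable is unset.
n_threads = int(os.environ.get("MULTIPROCESS_THREADS", "2"))

# Paths come through as plain strings.
backup_dir = os.environ.get("DATA_BACKUP_DIR", "")
matlab_path = os.environ.get("MATLAB_PATH", "")
```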
@ -1,5 +0,0 @@
*.ipynb filter=nbstripout

*.ipynb diff=ipynb
@ -1,90 +1,56 @@
# 2016 Narrabeen Storm EWS Performance

This repository investigates whether the storm impacts ([Sallenger, 2000](https://www.jstor.org/stable/4300099#metadata_info_tab_contents)) of the June 2016 Narrabeen Storm could have been forecast in advance. At 100 m intervals along each beach, we hindcast the storm impact as one of the four regimes defined by Sallenger (2000): swash, collision, overwash or inundation.

![image](https://i.imgur.com/urMx8Yx.jpg)

## Repository and analysis format
This repository follows the [Cookiecutter Data Science](https://drivendata.github.io/cookiecutter-data-science/) structure where possible. The analysis is done in Python (see the `/src/` folder), with some interactive, exploratory notebooks located in `/notebooks`.

Development is conducted using a [gitflow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow) approach: the `master` branch stores the official release history and the `develop` branch serves as an integration branch for features. Other `hotfix` and `feature` branches should be created and merged as necessary.
## Where to start?
1. Clone this repository.
2. Pull data from the WRL coastal J drive with `make pull-data`.
3. Check out the Jupyter notebook `./notebooks/01_exploration.ipynb`, which has an example of how to import the data and some interactive widgets.
|
||||
|
||||
## Requirements
The following requirements are needed to run various bits:

- [Anaconda](https://www.anaconda.com/download/): Used for processing and analysing data. The Anaconda distribution is used for managing environments and is available for Windows, Mac and Linux. Jupyter notebooks are used for exploratory analysis and communication.
- [QGIS](https://www.qgis.org/en/site/forusers/download): Used for looking at raw LIDAR pre/post-storm surveys and extracting dune crests/toes.
- [rclone](https://rclone.org/downloads/): Data is not tracked by this repository, but is backed up to a remote Chris Leaman working directory located on the WRL coastal drive. Rclone is used to sync local and remote copies. Ensure rclone.exe is located on your `PATH`.
- [gnuMake](http://gnuwin32.sourceforge.net/packages/make.htm): A list of commands for processing data is provided in `./Makefile`. Use gnuMake to launch these commands. Ensure make.exe is located on your `PATH`.
- [git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git): You'll need git installed to push and pull from this repo. If you're not familiar with command-line git, [Git Extensions](http://gitextensions.github.io/) is a Windows-based GUI which makes it easier to work with git. There are plenty of other [git clients](https://github.com/dictcp/awesome-git#client) available as well.
|
||||
|
||||
#### Getting the repository
Clone the repository into your local environment:
```sh
git clone http://git.wrl.unsw.edu.au:3000/chrisl/nsw-2016-storm-impact.git
cd nsw-2016-storm-impact
```
|
||||
|
||||
#### Getting the python environment set up
Commands for setting up the python environment are provided in the `Makefile`. Simply run the following commands in the repo root directory, ensuring `make` is located on your `PATH`:
```sh
make venv-init
make venv-activate
make venv-requirements-install
```
You can see what these commands actually run by inspecting the `Makefile`.
|
||||
|
||||
#### Pull data
The raw, interim and processed data are not tracked by the repository, in line with good git practice. A copy of the raw data is stored on the WRL Coastal J:\ drive and can be copied using the following command:
```sh
make pull-data
```
If you have updated the data and want to copy it back to the J:\ drive, use the following command. Note that it is probably not a good idea to modify data stored in `./data/raw/`.
```sh
make push-data
```
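Under the hood, `pull-data`/`push-data` are presumably thin wrappers around `rclone` syncing `./data` with the J:\ drive backup directory; a hypothetical sketch of what such targets could look like (the exact commands and flags here are assumptions — inspect the actual `./Makefile` for the real targets):

```makefile
# Hypothetical sketch only - see ./Makefile for the real targets
pull-data:
	rclone copy "J:/Coastal/Temp/CKL/nsw_2016_storm_impact/data" ./data

push-data:
	rclone copy ./data "J:/Coastal/Temp/CKL/nsw_2016_storm_impact/data"
```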
|
||||
|
||||
#### View notebooks
Jupyter notebooks have been set up to help explore the data and do preliminary analysis. Once you have set up your environment and pulled the data, this is probably a good place to start. To run the notebooks, use the following command and navigate to the `./notebooks/` folder once the Jupyter interface opens in your web browser:
```sh
make notebook
```
To allow notebooks to be version controlled, [nbstripout](https://github.com/kynan/nbstripout) has been installed as a git filter. It runs automatically when committing any changes to a notebook and strips out the outputs.
## Available data
Raw, interim and processed data used in this analysis are kept in the `/data/` folder. Data is not tracked in the repository due to size constraints, but is stored locally. A mirror is kept of the coastal J:\ drive folder, which you can push/pull to using rclone. To get the data, run `make pull-data`.

List of data:
|
||||
- `./data/raw/grain_size/`: The `sites_grain_size.csv` file contains the D50 grain size of each beach as well as the references for where these values were taken from. Grain size is needed to estimate wave runup using the Power et al. (2018) runup model.
- `./data/raw/land_lims/`: Not used (?) CKL to check
- `./data/raw/near_maps/`: This folder contains aerial imagery of some of the beaches taken from Nearmap. It can be loaded into QGIS and examined to determine storm impacts by comparing pre- and post-storm images.
- `./data/raw/processed_shorelines/`: This data was received from Tom Beuzen in October 2018. It consists of pre/post-storm profiles at 100 m sections along beaches ranging from Dee Why to Nambucca. Profiles are based on raw aerial LIDAR and were processed by Mitch Harley. Tides and waves (10 m contour and reverse-shoaled deepwater) for each individual 100 m section are also provided.
- `./data/raw/profile_features/`: Dune toe and crest locations based on pre-storm LIDAR. Refer to `/notebooks/qgis.qgz`, which shows how they were manually extracted. Note that the shapefiles only show the location (lat/lon) of the dune crest and toe. For actual elevations, these locations need to be related to the processed shorelines.
- `./data/raw/profile_features_chris_leaman/`: An Excel file containing manually selected dune toes, crests, berms and impacts by Chris Leaman. The values in this file should take precedence over values picked by Tom Beuzen.
- `./data/raw/profile_features_tom_beuzen/`: This .mat file contains dune toes and crests that Tom Beuzen picked for each profile. This is used as a basis for the toe/crest locations, but is overridden by data contained in `/data/raw/profile_features_chris_leaman`.
- `./data/raw/raw_lidar/`: This is the raw pre/post-storm aerial LIDAR taken for the June 2016 storm. `.las` files are the raw files, which have been processed into `.tiff` files using `PDAL`. Note that these files have not been corrected for systematic errors, so actual elevations should be taken from the `processed_shorelines` folder. Obtained November 2018 from Mitch Harley from the black external HDD labelled "UNSW LIDAR".
- `./data/raw/vol_change_kml/`: This data was obtained from Mitch Harley in Feb 2019 and is a `.kml` showing the change in subaerial volume during the storm. It is included for reference only and is not used in the analysis.
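To illustrate that last point about relating toe locations to the processed shorelines, picked dune toe x-coordinates can be matched against profile points to recover their elevations; a rough pandas sketch with made-up column names and values (not the project's actual schema):

```python
import pandas as pd

# Hypothetical profile points for one site: cross-shore distance x -> elevation z
df_profiles = pd.DataFrame({
    "site_id": ["NARRA0018"] * 3,
    "x": [0.0, 10.0, 20.0],
    "z": [5.2, 3.1, 0.4],
})

# Hypothetical dune toe location picked in QGIS (position only, no elevation)
df_toes = pd.DataFrame({"site_id": ["NARRA0018"], "dune_toe_x": [10.0]})

# Relate each toe to the nearest profile point to obtain its elevation
toes_with_z = pd.merge_asof(
    df_toes.sort_values("dune_toe_x"),
    df_profiles.sort_values("x"),
    left_on="dune_toe_x",
    right_on="x",
    by="site_id",
    direction="nearest",
)
```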
## Notebooks
- `./notebooks/01_exploration.ipynb`: Shows how to import processed shorelines, waves and tides. An interactive widget plots the location and cross sections.
- `./notebooks/02_collision_protection_volume.ipynb`:
- `./notebooks/03_dune_toe_vs_runup.ipynb`:
- `./notebooks/04a_profile_picker_superseded.ipynb`:
- `./notebooks/04b_profile_picker.ipynb`:
- `./notebooks/04c_profile_picker_manual.ipynb`:
- `./notebooks/05_twl_exceedence.ipynb`:
- `./notebooks/06_change_in_slope.ipynb`:
- `./notebooks/07_evaluate_model_performance.ipynb`:
- `./notebooks/08_narr_topo_bathy_slope_test.ipynb`:
- `./notebooks/09_superseded_run_comparison.ipynb`:
- `./notebooks/10_profile_clustering.ipynb`:
- `/notebooks/qgis.qgz`: A QGIS file which is used to explore the aerial LIDAR data in `/data/raw/raw_lidar`. By examining the pre-storm LIDAR, dune crest and dune toe lines are manually extracted. These are stored in `/data/profile_features/`.

## TODO
- [ ] Raw tide WLs are interpolated based on location from tide gauges. This probably isn't the most accurate method, but should have a small effect since surge elevation was low during this event. Need to assess the effect of this method.
- [ ] Estimate max TWL from the elevation where pre-storm and post-storm profiles are the same. Need to think more about this, as runup impacting the dune toe will move the dune face back, incorrectly raising the observed TWL. Perhaps this estimation of max TWL is only useful for the swash regime.
- [ ] Implement a [bayesian change detection algorithm](https://github.com/hildensia/bayesian_changepoint_detection) to help detect dune crests and toes from profiles. Probably low priority at the moment since we are doing manual detection.
- [ ] Implement dune impact calculations as per Palmsten & Holman. The calculation should be done in a new dataframe.
- [ ] Implement `data/interim/*.csv` file checking using py.test. Check for correct columns, number of NaNs, etc. Testing of code is probably a lower priority than just checking the interim data files at the moment. Some functions which should be tested are the slope functions in `forecast_twl.py`, as these can be tricky with different profiles.
- [ ] Convert runup model functions to use numpy arrays instead of pandas dataframes. This should give a bit of a speedup.
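On that last point, the `*_sto06` file names used throughout this repository refer to the Stockdon et al. (2006) runup parameterisation; a numpy-vectorised sketch of the published 2% runup exceedance formula might look like this (an illustration only, not the repository's actual implementation — see `/src/` for that):

```python
import numpy as np

def sto06_r2(hs0, tp, beta):
    """Stockdon et al. (2006) 2% runup exceedance, vectorised over numpy arrays.

    hs0: deep-water significant wave height (m)
    tp: peak wave period (s)
    beta: foreshore slope (-)
    """
    hs0, tp, beta = np.asarray(hs0), np.asarray(tp), np.asarray(beta)
    lp = 9.81 * tp ** 2 / (2 * np.pi)           # deep-water wavelength
    setup = 0.35 * beta * np.sqrt(hs0 * lp)     # wave setup term
    swash = np.sqrt(hs0 * lp * (0.563 * beta ** 2 + 0.004)) / 2
    return 1.1 * (setup + swash)

# Example: Hs0 = 3 m, Tp = 12 s, foreshore slope = 0.1
r2 = sto06_r2([3.0], [12.0], [0.1])
```

Operating on plain arrays like this avoids the per-row pandas overhead the TODO item refers to.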
@ -1,52 +0,0 @@
channels:
  - conda-forge
  - plotly
dependencies:
  - python=3.6
  - attrs
  - pre_commit
  - beautifulsoup4
  - autopep8
  - black
  - cartopy
  - colorcet
  - click
  - click-plugins
  - colorlover
  - fiona
  - ipykernel
  - ipython
  - ipywidgets
  - matplotlib
  - line_profiler
  - nbformat
  - nbstripout
  - notebook
  - numpy
  - pandas
  - geopandas
  - pandoc
  - pip
  - plotly
  - plotly-orca
  - proj4
  - pyproj
  - python-dateutil
  - pytz
  - pyyaml
  - psutil
  - requests
  - scikit-learn
  - scipy
  - setuptools
  - seaborn
  - shapely
  - statsmodels
  - tqdm
  - yaml
  - yapf
  - jupyterlab
  - pip:
    - blackcellmagic
    - mat4py
    - overpass
@ -1,3 +0,0 @@
*.ipynb filter=nbstripout

*.ipynb diff=ipynb
File diff suppressed because one or more lines are too long
@ -1,627 +0,0 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Investigate \"collision protection volume\" concept"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T02:45:14.908283Z",
     "start_time": "2018-12-05T02:45:14.556163Z"
    }
   },
   "outputs": [],
   "source": [
    "%matplotlib inline\n",
    "%reload_ext autoreload\n",
    "%autoreload"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T02:45:34.323928Z",
     "start_time": "2018-12-05T02:45:14.911088Z"
    }
   },
   "outputs": [],
   "source": [
    "from IPython.core.debugger import set_trace\n",
    "\n",
    "import pandas as pd\n",
    "import numpy as np\n",
    "import os\n",
    "\n",
    "import plotly\n",
    "import plotly.graph_objs as go\n",
    "import plotly.plotly as py\n",
    "import plotly.tools as tls\n",
    "import plotly.figure_factory as ff\n",
    "import plotly.io as pio"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Load data\n",
    "Load data from the `./data/interim/` folder and parse into `pandas` dataframes."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T02:45:53.010702Z",
     "start_time": "2018-12-05T02:45:34.324930Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Importing profiles.csv\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "C:\\Users\\z5189959\\Desktop\\nsw-2016-storm-impact\\.venv\\lib\\site-packages\\numpy\\lib\\arraysetops.py:522: FutureWarning:\n",
      "\n",
      "elementwise comparison failed; returning scalar instead, but in the future will perform elementwise comparison\n",
      "\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Importing profile_features.csv\n",
      "Importing impacts_forecasted_foreshore_slope_sto06.csv\n",
      "Importing impacts_forecasted_mean_slope_sto06.csv\n",
      "Importing impacts_observed.csv\n",
      "Importing twl_foreshore_slope_sto06.csv\n",
      "Importing twl_mean_slope_sto06.csv\n",
      "Done!\n"
     ]
    }
   ],
   "source": [
    "def df_from_csv(csv, index_col, data_folder='../data/interim'):\n",
    "    print('Importing {}'.format(csv))\n",
    "    return pd.read_csv(os.path.join(data_folder, csv), index_col=index_col)\n",
    "\n",
    "df_profiles = df_from_csv('profiles.csv', index_col=[0, 1, 2])\n",
    "df_profile_features = df_from_csv('profile_features.csv', index_col=[0])\n",
    "\n",
    "impacts = {\n",
    "    'forecasted': {\n",
    "        'foreshore_slope_sto06': df_from_csv('impacts_forecasted_foreshore_slope_sto06.csv', index_col=[0]),\n",
    "        'mean_slope_sto06': df_from_csv('impacts_forecasted_mean_slope_sto06.csv', index_col=[0]),\n",
    "    },\n",
    "    'observed': df_from_csv('impacts_observed.csv', index_col=[0])\n",
    "}\n",
    "\n",
    "twls = {\n",
    "    'forecasted': {\n",
    "        'foreshore_slope_sto06': df_from_csv('twl_foreshore_slope_sto06.csv', index_col=[0, 1]),\n",
    "        'mean_slope_sto06': df_from_csv('twl_mean_slope_sto06.csv', index_col=[0, 1]),\n",
    "    }\n",
    "}\n",
    "\n",
    "print('Done!')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let's define a function to calculate the \"collision protection volume\" based on prestorm profiles."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Get berm feature functions\n",
    "Define a couple of functions which are going to help us get features of our berms."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T03:01:56.646213Z",
     "start_time": "2018-12-05T03:01:56.366466Z"
    },
    "code_folding": []
   },
   "outputs": [],
   "source": [
    "from shapely.geometry import Point, LineString, Polygon\n",
    "\n",
    "\n",
    "def collision_protection_vol(x, z, d_low_x, d_low_z, lower_z, angle):\n",
    "    # First, get the bounding line strings of our protection volume\n",
    "    lower_line = LineString([Point(min(x), lower_z), Point(max(x), lower_z)])\n",
    "    profile_line = LineString([Point(x_coord, z_coord) for x_coord, z_coord in zip(x, z)\n",
    "                               if all([not np.isnan(x_coord), not np.isnan(z_coord)])])\n",
    "    slope_line = LineString([Point(d_low_x, d_low_z),\n",
    "                             Point(max(x), d_low_z - max(x) * np.sin(np.deg2rad(angle)))])\n",
    "\n",
    "    # Work out where our lower line and slope line intersect\n",
    "    lower_profile_intersection = lower_line.intersection(profile_line)\n",
    "    d_protected_intersection = lower_line.intersection(slope_line)\n",
    "\n",
    "    # Define the perimeter of the protection area\n",
    "    profile_protected = LineString([Point(x_coord, z_coord) for x_coord, z_coord\n",
    "                                    in zip(profile_line.xy[0], profile_line.xy[1])\n",
    "                                    if d_low_x < x_coord < lower_profile_intersection.xy[0][0]]\n",
    "                                   + [lower_profile_intersection]\n",
    "                                   + [d_protected_intersection]\n",
    "                                   + [Point(d_low_x, d_low_z)])\n",
    "\n",
    "    # Convert to polygon and return the area (m3/m)\n",
    "    protection_area_poly = Polygon(profile_protected)\n",
    "    protection_area_vol = protection_area_poly.area\n",
    "    return protection_area_vol\n",
    "\n",
    "\n",
    "def get_berm_width(z, d_low_x):\n",
    "    \"\"\"\n",
    "    Returns the width of the berm, defined as the distance between the dune toe\n",
    "    and the most seaward surveyed point of the profile.\n",
    "    \"\"\"\n",
    "    x_seaward_limit = z.dropna().tail(1).reset_index().x[0]\n",
    "    return x_seaward_limit - d_low_x\n",
    "\n",
    "\n",
    "site_id = 'NARRA0018'\n",
    "profile_type = 'prestorm'\n",
    "query = \"site_id == '{}' and profile_type == '{}'\".format(\n",
    "    site_id, profile_type)\n",
    "prestorm_profile = df_profiles.query(query)\n",
    "profile_features = df_profile_features.query(query)\n",
    "\n",
    "x = prestorm_profile.index.get_level_values('x')\n",
    "z = prestorm_profile.z\n",
    "d_low_x = profile_features.dune_toe_x.tolist()[0]\n",
    "d_low_z = profile_features.dune_toe_z.tolist()[0]\n",
    "angle = 60  # degrees from the horizontal\n",
    "lower_z = 0.5  # from mhw\n",
    "\n",
    "vol = collision_protection_vol(x, z, d_low_x, d_low_z, lower_z, angle)\n",
    "berm_width = get_berm_width(z, d_low_x)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T02:45:54.224110Z",
     "start_time": "2018-12-05T02:45:54.030142Z"
    }
   },
   "outputs": [],
   "source": [
    "from datetime import timedelta\n",
    "\n",
    "def wl_time(t, wl, z_lower, z_upper):\n",
    "    \"\"\"\n",
    "    Returns the amount of time the water level is between two elevations.\n",
    "    \"\"\"\n",
    "    # Use the wl argument (not the global R2) so the function is self-contained\n",
    "    df_wl = pd.DataFrame.from_records([(t_val, wl_val) for t_val, wl_val in zip(t, wl)], columns=['datetime', 'wl'])\n",
    "    df_wl.set_index(pd.DatetimeIndex(df_wl['datetime']), inplace=True)\n",
    "    df_wl.drop(columns=['datetime'], inplace=True)\n",
    "\n",
    "    # Assumes that each record is one hour... probably need to check this\n",
    "    hours = len(df_wl.query('{} < wl < {}'.format(z_lower, z_upper)))\n",
    "    return timedelta(hours=hours)\n",
    "\n",
    "def wave_power(t, wl, z_lower, z_upper, Hs0, Tp):\n",
    "    \"\"\"\n",
    "    Returns the cumulative wave power when the water level is between two elevations.\n",
    "    \"\"\"\n",
    "    df_wl = pd.DataFrame.from_records([(t_val, wl_val, Hs0_val, Tp_val) for t_val, wl_val, Hs0_val, Tp_val in zip(t, wl, Hs0, Tp)], columns=['datetime', 'wl', 'Hs0', 'Tp'])\n",
    "    df_wl.set_index(pd.DatetimeIndex(df_wl['datetime']), inplace=True)\n",
    "    df_wl.drop(columns=['datetime'], inplace=True)\n",
    "\n",
    "    # Assumes that each record is one hour... probably need to check this\n",
    "    rho = 1025  # kg/m3\n",
    "    g = 9.8  # m/s2\n",
    "    df_wl_times = df_wl.query('{} < wl < {}'.format(z_lower, z_upper))\n",
    "    power = rho * g ** 2 / 64 / np.pi * df_wl_times.Hs0 ** 2 * df_wl_times.Tp\n",
    "    return power.sum()\n",
    "\n",
    "df_twl = twls['forecasted']['mean_slope_sto06']\n",
    "df_twl_site = df_twl.query(\"site_id == '{}'\".format(site_id))\n",
    "\n",
    "R2 = df_twl_site.R2.tolist()\n",
    "t = df_twl_site.index.get_level_values('datetime')\n",
    "z_lower = 0.5\n",
    "z_upper = d_low_z\n",
    "\n",
    "exposed_time = wl_time(t, R2, z_lower, z_upper)\n",
    "\n",
    "# Use the site-specific TWL series so the lengths match the datetime index\n",
    "Hs0 = df_twl_site.Hs0.tolist()\n",
    "Tp = df_twl_site.Tp.tolist()\n",
    "wave_p = wave_power(t, R2, z_lower, z_upper, Hs0, Tp)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T02:45:54.231129Z",
     "start_time": "2018-12-05T02:45:54.225660Z"
    }
   },
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "code",
   "execution_count": 57,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T03:37:45.472885Z",
     "start_time": "2018-12-05T03:37:45.462857Z"
    }
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "0.96"
      ]
     },
     "execution_count": 57,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "def dune_toe_elevation_change(site_id, df_profile_features):\n",
    "    query = \"site_id == '{}'\".format(site_id)\n",
    "    profile_features = df_profile_features.query(query)\n",
    "    prestorm_dune_toe_z = profile_features.query(\"profile_type=='prestorm'\").dune_toe_z.tolist()[0]\n",
    "    poststorm_dune_toe_z = profile_features.query(\"profile_type=='poststorm'\").dune_toe_z.tolist()[0]\n",
    "    return prestorm_dune_toe_z - poststorm_dune_toe_z\n",
    "\n",
    "toe_ele_change = dune_toe_elevation_change(\"MANNING0081\", df_profile_features)\n",
    "toe_ele_change"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 62,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T03:45:45.203827Z",
     "start_time": "2018-12-05T03:45:13.608478Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0 of 816\n",
      "20 of 816\n",
      "40 of 816\n",
      "60 of 816\n",
      "80 of 816\n",
      "100 of 816\n"
     ]
    }
   ],
   "source": [
    "vols = []\n",
    "exposed_times = []\n",
    "toe_ele_changes = []\n",
    "wave_powers = []\n",
    "berm_widths = []\n",
    "swash_vol_changes = []\n",
    "dune_face_vol_changes = []\n",
    "site_ids_to_plot = []\n",
    "\n",
    "# Get site ids where we observed collision\n",
    "observed_site_ids = impacts['observed'].query(\"storm_regime=='collision'\").index.get_level_values('site_id').unique()\n",
    "\n",
    "# # Get site ids where we forecast swash\n",
    "# forecasted_site_ids = impacts['forecasted']['mean_slope_sto06'].query(\"storm_regime=='swash'\").index.get_level_values('site_id').unique()\n",
    "# site_ids = set(observed_site_ids).intersection(set(forecasted_site_ids))\n",
    "\n",
    "site_ids = observed_site_ids\n",
    "\n",
    "# Calculate for each site\n",
    "for n, site_id in enumerate(site_ids):\n",
    "\n",
    "    if n % 20 == 0:\n",
    "        print('{} of {}'.format(n, len(site_ids)))\n",
    "\n",
    "    try:\n",
    "        query = \"site_id == '{}' and profile_type == '{}'\".format(site_id, 'prestorm')\n",
    "        prestorm_profile = df_profiles.query(query)\n",
    "        profile_features = df_profile_features.query(query)\n",
    "\n",
    "        vol = collision_protection_vol(x=prestorm_profile.index.get_level_values('x'),\n",
    "                                       z=prestorm_profile.z,\n",
    "                                       d_low_x=profile_features.dune_toe_x.tolist()[0],\n",
    "                                       d_low_z=profile_features.dune_toe_z.tolist()[0],\n",
    "                                       lower_z=profile_features.dune_toe_z.tolist()[0] - 2,  # from mhw\n",
    "                                       angle=60,  # degrees from the horizontal\n",
    "                                       )\n",
    "\n",
    "        df_twl = twls['forecasted']['mean_slope_sto06']\n",
    "        df_twl_site = df_twl.query(\"site_id == '{}'\".format(site_id))\n",
    "\n",
    "        berm_width = get_berm_width(z=prestorm_profile.z,\n",
    "                                    d_low_x=profile_features.dune_toe_x.tolist()[0])\n",
    "\n",
    "        exposed_time = wl_time(t=df_twl_site.index.get_level_values('datetime'),\n",
    "                               wl=df_twl_site.R2.tolist(),\n",
    "                               z_lower=profile_features.dune_toe_z.tolist()[0] - 2,\n",
    "                               z_upper=profile_features.dune_toe_z.tolist()[0],\n",
    "                               )\n",
    "        swash_vol_change = impacts['observed'].query(\"site_id == '{}'\".format(site_id)).swash_vol_change.tolist()[0]\n",
    "        dune_face_vol_change = impacts['observed'].query(\"site_id == '{}'\".format(site_id)).dune_face_vol_change.tolist()[0]\n",
    "\n",
    "        power = wave_power(t=df_twl_site.index.get_level_values('datetime'),\n",
    "                           wl=df_twl_site.R2.tolist(),\n",
    "                           z_lower=profile_features.dune_toe_z.tolist()[0] - 2,\n",
    "                           z_upper=profile_features.dune_toe_z.tolist()[0],\n",
    "                           Hs0=df_twl_site.Hs0.tolist(),\n",
    "                           Tp=df_twl_site.Tp.tolist())\n",
    "\n",
    "        toe_ele_change = dune_toe_elevation_change(site_id, df_profile_features)\n",
    "    except Exception:\n",
    "        continue\n",
    "\n",
    "    # print(site_id, toe_ele_change)\n",
    "    vols.append(vol)\n",
    "    exposed_times.append(exposed_time)\n",
    "    toe_ele_changes.append(toe_ele_change)\n",
    "    wave_powers.append(power)\n",
    "    berm_widths.append(berm_width)\n",
    "    swash_vol_changes.append(swash_vol_change)\n",
    "    dune_face_vol_changes.append(dune_face_vol_change)\n",
    "    site_ids_to_plot.append(site_id)\n",
    "\n",
    "    if n > 100:\n",
    "        break"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-03T03:12:11.598150Z",
     "start_time": "2018-12-03T03:12:11.590128Z"
    }
   },
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "code",
   "execution_count": 72,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T05:03:39.147413Z",
     "start_time": "2018-12-05T05:03:39.070207Z"
    }
   },
   "outputs": [
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "225855bac0d0464d9be74917812c19ac",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "FigureWidget({\n",
       "    'data': [{'marker': {'size': 4},\n",
       "              'mode': 'markers',\n",
       "              'text': [-0…"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "trace1 = go.Scatter(\n",
    "    x=berm_widths,\n",
    "    y=dune_face_vol_changes,\n",
    "    text=['{}<br>{}'.format(ele, site_id) for ele, site_id in zip(toe_ele_changes, site_ids_to_plot)],\n",
    "    mode='markers',\n",
    "    marker=dict(\n",
    "        size=4,\n",
    "        # color = [-1 if x<0 else 1 for x in toe_ele_changes],\n",
    "        # color = toe_ele_changes,\n",
    "        # color = dune_face_vol_changes,\n",
    "        # color = [x.total_seconds() / 60 / 60 for x in exposed_times],\n",
    "        # colorscale='Viridis',\n",
    "        # showscale=True\n",
    "    ))\n",
    "\n",
    "layout = go.Layout(\n",
    "    title='Dune Collision Protection',\n",
    "    # height=300,\n",
    "    # legend=dict(font={'size': 10}),\n",
    "    # margin=dict(t=50, b=50, l=50, r=20),\n",
    "    xaxis=dict(\n",
    "        title='Berm width',\n",
    "        autorange=True,\n",
    "        showgrid=True,\n",
    "        zeroline=True,\n",
    "        showline=True,\n",
    "    ),\n",
    "    yaxis=dict(\n",
    "        title='Dune face vol change',\n",
    "        autorange=True,\n",
    "        showgrid=True,\n",
    "        zeroline=True,\n",
    "        showline=True,\n",
    "    ))\n",
    "\n",
    "g_plot = go.FigureWidget(data=[trace1], layout=layout)\n",
    "g_plot"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 51,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-05T03:15:46.517975Z",
     "start_time": "2018-12-05T03:15:46.512936
|
||||
}
|
||||
},
|
||||
"outputs": [
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"[64.5799,\n",
|
||||
" 21.0163,\n",
|
||||
" 38.106,\n",
|
||||
" 28.101,\n",
|
||||
" 58.7247,\n",
|
||||
" 33.5534,\n",
|
||||
" 71.1675,\n",
|
||||
" 52.6043,\n",
|
||||
" 50.5765,\n",
|
||||
" 39.9074,\n",
|
||||
" 67.8385,\n",
|
||||
" 43.9043,\n",
|
||||
" 39.8181,\n",
|
||||
" 37.7153,\n",
|
||||
" 20.4454,\n",
|
||||
" 39.7757,\n",
|
||||
" 42.1843,\n",
|
||||
" 33.6152,\n",
|
||||
" 42.9587,\n",
|
||||
" 39.9773,\n",
|
||||
" 35.7835,\n",
|
||||
" 31.2884,\n",
|
||||
" -0.4618,\n",
|
||||
" 31.0094,\n",
|
||||
" 33.3479,\n",
|
||||
" 47.8394,\n",
|
||||
" 32.3566,\n",
|
||||
" 36.5205,\n",
|
||||
" 45.7109,\n",
|
||||
" 16.0687,\n",
|
||||
" 35.4375,\n",
|
||||
" 43.327,\n",
|
||||
" 53.5016,\n",
|
||||
" 31.0357,\n",
|
||||
" 47.6528,\n",
|
||||
" 25.5658,\n",
|
||||
" 41.0514,\n",
|
||||
" 28.1645,\n",
|
||||
" 44.5443,\n",
|
||||
" 42.925,\n",
|
||||
" 33.9535,\n",
|
||||
" 36.2626,\n",
|
||||
" 35.2536]"
|
||||
]
|
||||
},
|
||||
"execution_count": 51,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"# impacts['observed']\n",
|
||||
"swash_vol_changes"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.7"
|
||||
},
|
||||
"toc": {
|
||||
"base_numbering": 1,
|
||||
"nav_menu": {},
|
||||
"number_sections": true,
|
||||
"sideBar": true,
|
||||
"skip_h1_title": false,
|
||||
"title_cell": "Table of Contents",
|
||||
"title_sidebar": "Contents",
|
||||
"toc_cell": false,
|
||||
"toc_position": {},
|
||||
"toc_section_display": true,
|
||||
"toc_window_display": true
|
||||
},
|
||||
"varInspector": {
|
||||
"cols": {
|
||||
"lenName": 16,
|
||||
"lenType": 16,
|
||||
"lenVar": 40
|
||||
},
|
||||
"kernels_config": {
|
||||
"python": {
|
||||
"delete_cmd_postfix": "",
|
||||
"delete_cmd_prefix": "del ",
|
||||
"library": "var_list.py",
|
||||
"varRefreshCmd": "print(var_dic_list())"
|
||||
},
|
||||
"r": {
|
||||
"delete_cmd_postfix": ") ",
|
||||
"delete_cmd_prefix": "rm(",
|
||||
"library": "var_list.r",
|
||||
"varRefreshCmd": "cat(var_dic_list()) "
|
||||
}
|
||||
},
|
||||
"types_to_exclude": [
|
||||
"module",
|
||||
"function",
|
||||
"builtin_function_or_method",
|
||||
"instance",
|
||||
"_Feature"
|
||||
],
|
||||
"window_display": false
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
File diff suppressed because it is too large
@ -1,407 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import pandas as pd\n",
"import os\n",
"import numpy.ma as ma\n",
"\n",
"import numpy as np\n",
"from pyearth import Earth\n",
"from matplotlib import pyplot\n",
"\n",
"np.random.seed(2017)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def df_from_csv(csv, index_col, data_folder='../data/interim'):\n",
|
||||
" print('Importing {}'.format(csv))\n",
|
||||
" return pd.read_csv(os.path.join(data_folder,csv), index_col=index_col)\n",
|
||||
"\n",
|
||||
"df_profiles = df_from_csv('profiles.csv', index_col=[0, 1, 2])"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Try using pyearth"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"code_folding": [
|
||||
5,
|
||||
20,
|
||||
31,
|
||||
40
|
||||
]
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from scipy.signal import savgol_filter\n",
|
||||
"import re\n",
|
||||
"from scipy.stats import linregress\n",
|
||||
"import warnings\n",
|
||||
"warnings.simplefilter(action='ignore', category=FutureWarning)\n",
|
||||
"\n",
|
||||
"def get_breakpoints(model, min_distance=20):\n",
|
||||
" # Get breakpoints\n",
|
||||
" breakpoints = []\n",
|
||||
" for line in model.summary().split('\\n'):\n",
|
||||
" # Get unpruned lines\n",
|
||||
" if 'No' in line and 'None' not in line:\n",
|
||||
" # Get break points\n",
|
||||
" m = re.search(\"h\\(x0-(\\d*\\.?\\d+)\\)\", line)\n",
|
||||
" if m:\n",
|
||||
" breakpoints.append(float(m.groups()[0]))\n",
|
||||
" m = re.search(\"h\\((\\d*\\.?\\d+)-x0\\)\", line)\n",
|
||||
" if m:\n",
|
||||
" breakpoints.append(float(m.groups()[0]))\n",
|
||||
" return sorted(list(set(breakpoints)))\n",
|
||||
" \n",
|
||||
"def get_segments(breakpoints, x_min, x_max):\n",
|
||||
" segments = []\n",
|
||||
" breakpoints = [x_min] + breakpoints + [x_max]\n",
|
||||
"\n",
|
||||
" for x1, x2 in zip(breakpoints, breakpoints[1:]):\n",
|
||||
" segments.append({\n",
|
||||
" 'x_start': x1,\n",
|
||||
" 'x_end': x2\n",
|
||||
" })\n",
|
||||
" return segments \n",
|
||||
"\n",
|
||||
"def get_segment_slopes(segments, x, z):\n",
|
||||
" for segment in segments:\n",
|
||||
" mask = ma.masked_where((segment['x_start'] < x) & (x < segment['x_end']),x ).mask\n",
|
||||
" segment['z_mean'] = np.mean(z[mask])\n",
|
||||
"        segment['z_start'] = z[mask][0]\n",
|
||||
"        segment['z_end'] = z[mask][-1]\n",
|
||||
" segment['slope'] = -linregress(x[mask], z[mask]).slope\n",
|
||||
" return segments\n",
|
||||
" \n",
|
||||
"def classify_segments(segments, x,z):\n",
|
||||
" \n",
|
||||
" # Most seaward slope must be foreshore\n",
|
||||
" segments[-1]['type'] = 'foreshore'\n",
|
||||
" \n",
|
||||
" # Most landward slope must be land\n",
|
||||
" segments[0]['type'] = 'land'\n",
|
||||
" \n",
|
||||
" # Segments with really high slopes must be structures\n",
|
||||
" for seg in segments:\n",
|
||||
" if seg['slope'] > 2.0:\n",
|
||||
" seg['type'] = 'structure'\n",
|
||||
" \n",
|
||||
" # Segments with large change of slope and \n",
|
||||
" # Segment with max slope should be dune face\n",
|
||||
"# dune_face_idx = [n for n, seg in enumerate(segments) if seg['slope']==max(x['slope'] for x in segments)][0]\n",
|
||||
"# segments[dune_face_idx]['type'] = 'dune_face'\n",
|
||||
" \n",
|
||||
" # Pick out berms \n",
|
||||
" for seg in segments:\n",
|
||||
" if (-0.03 < seg['slope'] < 0.03 # berms should be relatively flat\n",
|
||||
" and 0 < seg['z_mean'] < 4 # berms should be located between 0-4 m AHD\n",
|
||||
" ): # berms should be seaward of dune face\n",
|
||||
" seg['type'] = 'berm'\n",
|
||||
" \n",
|
||||
"# slope = None\n",
|
||||
"# for seg in reversed(segments):\n",
|
||||
"# if slope is None:\n",
|
||||
"# continue\n",
|
||||
"# elif slope - 0.03 < seg['slope'] < slope + 0.03:\n",
|
||||
"# seg['type'] = 'foreshore'\n",
|
||||
"# else:\n",
|
||||
"# break\n",
|
||||
" \n",
|
||||
" return segments\n",
|
||||
"\n",
|
||||
"def get_piecewise_linear_model(x,z):\n",
|
||||
" #Fit an Earth model\n",
|
||||
" model = Earth(penalty=3,thresh=0.0005)\n",
|
||||
" model.fit(x,z)\n",
|
||||
" return model\n",
|
||||
"\n",
|
||||
"def plot_profile_classification(site_id, profile_type):\n",
|
||||
" df_profile = df_profiles.query(\"site_id == '{}' and profile_type == '{}'\".format(site_id, profile_type))\n",
|
||||
" x = np.array(df_profile.index.get_level_values('x').tolist())\n",
|
||||
" z = np.array(df_profile.z.tolist()) \n",
|
||||
" \n",
|
||||
" nan_mask = ma.masked_invalid(z).mask\n",
|
||||
" x = x[~nan_mask]\n",
|
||||
" z_unfiltered = z[~nan_mask]\n",
|
||||
" z = savgol_filter(z_unfiltered, 51, 3)\n",
|
||||
" \n",
|
||||
" model = get_piecewise_linear_model(x,z)\n",
|
||||
" breakpoints = get_breakpoints(model)\n",
|
||||
" segments = get_segments(breakpoints, x_min=x.min(), x_max=x.max())\n",
|
||||
" segments = get_segment_slopes(segments, x=x, z=z)\n",
|
||||
"# segments = merge_similar_segments(segments)\n",
|
||||
" segments = classify_segments(segments, x=x, z=z)\n",
|
||||
" \n",
|
||||
" pyplot.figure()\n",
|
||||
" pyplot.plot(x,z_unfiltered, color='0.5',marker='.', alpha=.2, ms=10,linestyle=\"None\")\n",
|
||||
"\n",
|
||||
" # Plot different segments\n",
|
||||
" foreshore_segments = [x for x in segments if x.get('type') == 'foreshore']\n",
|
||||
" for seg in foreshore_segments:\n",
|
||||
" pyplot.plot([seg['x_start'], seg['x_end']],\n",
|
||||
" [seg['z_start'], seg['z_end']],\n",
|
||||
" linewidth=4, \n",
|
||||
" color='b')\n",
|
||||
"\n",
|
||||
" land_segments = [x for x in segments if x.get('type') == 'land']\n",
|
||||
" for seg in land_segments:\n",
|
||||
" pyplot.plot([seg['x_start'], seg['x_end']],\n",
|
||||
" [seg['z_start'], seg['z_end']],\n",
|
||||
" linewidth=4, \n",
|
||||
" color='g')\n",
|
||||
"\n",
|
||||
" berm_segments = [x for x in segments if x.get('type') == 'berm']\n",
|
||||
" for seg in berm_segments:\n",
|
||||
" pyplot.plot([seg['x_start'], seg['x_end']],\n",
|
||||
" [seg['z_start'], seg['z_end']],\n",
|
||||
" linewidth=4, \n",
|
||||
" color='y')\n",
|
||||
"\n",
|
||||
" dune_face_segments = [x for x in segments if x.get('type') == 'dune_face']\n",
|
||||
" for seg in dune_face_segments:\n",
|
||||
" pyplot.plot([seg['x_start'], seg['x_end']],\n",
|
||||
" [seg['z_start'], seg['z_end']],\n",
|
||||
" linewidth=4, \n",
|
||||
" color='r')\n",
|
||||
" \n",
|
||||
" structure_segments = [x for x in segments if x.get('type') == 'structure']\n",
|
||||
" for seg in structure_segments:\n",
|
||||
" pyplot.plot([seg['x_start'], seg['x_end']],\n",
|
||||
" [seg['z_start'], seg['z_end']],\n",
|
||||
" linewidth=4, \n",
|
||||
" color='m')\n",
|
||||
" \n",
|
||||
" unclassified_segments = [x for x in segments if x.get('type') is None]\n",
|
||||
" for seg in unclassified_segments:\n",
|
||||
" pyplot.plot([seg['x_start'], seg['x_end']],\n",
|
||||
" [seg['z_start'], seg['z_end']],\n",
|
||||
" linewidth=4, \n",
|
||||
" color='0.4')\n",
|
||||
"\n",
|
||||
" pyplot.xlabel('x (m)')\n",
|
||||
" pyplot.ylabel('z (m AHD)')\n",
|
||||
" pyplot.title('{} profile at {}'.format(profile_type, site_id))\n",
|
||||
" pyplot.show()\n",
|
||||
"\n",
|
||||
" import pprint\n",
|
||||
" pp = pprint.PrettyPrinter(indent=4)\n",
|
||||
" pp.pprint(segments)\n",
|
||||
"\n",
|
||||
"plot_profile_classification('NARRA0018', 'prestorm')\n",
|
||||
"plot_profile_classification('NARRA0019', 'prestorm')\n",
|
||||
"plot_profile_classification('CRESn0017', 'poststorm')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {
|
||||
"heading_collapsed": true
|
||||
},
|
||||
"source": [
|
||||
"## Try lmfit"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"hidden": true
|
||||
},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"code_folding": [
|
||||
0
|
||||
],
|
||||
"hidden": true
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from lmfit import Model, Parameters\n",
|
||||
"\n",
|
||||
"def get_data():\n",
|
||||
" site_id='NARRA0018'\n",
|
||||
" profile_type='prestorm'\n",
|
||||
" df_profile = df_profiles.query(\"site_id == '{}' and profile_type == '{}'\".format(site_id, profile_type))\n",
|
||||
" x = np.array(df_profile.index.get_level_values('x').tolist())\n",
|
||||
" z = np.array(df_profile.z.tolist()) \n",
|
||||
"\n",
|
||||
" nan_mask = ma.masked_invalid(z).mask\n",
|
||||
" x = x[~nan_mask]\n",
|
||||
" z = z[~nan_mask]\n",
|
||||
" return x,z\n",
|
||||
"\n",
|
||||
"# def piecewise_linear(x, x0, x1, b, k1, k2, k3):\n",
|
||||
"# condlist = [x < x0, (x >= x0) & (x < x1), x >= x1]\n",
|
||||
"# funclist = [lambda x: k1*x + b, lambda x: k1*x + b + k2*(x-x0), lambda x: k1*x + b + k2*(x-x0) + k3*(x - x1)]\n",
|
||||
"# return np.piecewise(x, condlist, funclist)\n",
|
||||
"\n",
|
||||
"# x,z = get_data()\n",
|
||||
"\n",
|
||||
"# fmodel = Model(piecewise_linear)\n",
|
||||
"# params = Parameters()\n",
|
||||
"# params.add('x0', value=0, vary=True, min=min(x), max=max(x))\n",
|
||||
"# params.add('x1', value=0, vary=True, min=min(x), max=max(x))\n",
|
||||
"# params.add('b', value=0, vary=True)\n",
|
||||
"# params.add('k1', value=0, vary=True, min=-0.01, max=0.01)\n",
|
||||
"# params.add('k2', value=0, vary=True, min=-0.1, max=-0.5)\n",
|
||||
"# params.add('k3', value=0, vary=True, min=0.1, max=0.5)\n",
|
||||
"\n",
|
||||
"def piecewise_linear(x, x0, x1, x2, b, k1, k2, k3,k4):\n",
|
||||
" condlist = [x < x0, (x >= x0) & (x < x1), (x >= x1) & (x < x2), x >= x2]\n",
|
||||
" funclist = [lambda x: k1*x + b, lambda x: k1*x + b + k2*(x-x0), lambda x: k1*x + b + k2*(x-x0) + k3*(x - x1), lambda x: k1*x + b + k2*(x-x0) + k3*(x - x1) +k4*(x-x2)]\n",
|
||||
" return np.piecewise(x, condlist, funclist)\n",
|
||||
"\n",
|
||||
"x,z = get_data()\n",
|
||||
"\n",
|
||||
"fmodel = Model(piecewise_linear)\n",
|
||||
"params = Parameters()\n",
|
||||
"params.add('x0', value=0, vary=True, min=min(x), max=max(x))\n",
|
||||
"params.add('x1', value=0, vary=True, min=min(x), max=max(x))\n",
|
||||
"params.add('x2', value=0, vary=True, min=min(x), max=max(x))\n",
|
||||
"params.add('b', value=0, vary=True)\n",
|
||||
"params.add('k1', value=0, vary=True, min=-0.5, max=0.5)\n",
|
||||
"params.add('k2', value=0, vary=True, min=-0.5, max=0.5)\n",
|
||||
"params.add('k3', value=0, vary=True, min=-0.5, max=0.5)\n",
|
||||
"params.add('k4', value=0, vary=True, min=-0.5, max=0.5)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"result = fmodel.fit(z, params, x=x,method='ampgo')\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"pyplot.figure()\n",
|
||||
"pyplot.plot(x,z, color='0.5',marker='.', alpha=.2, ms=10,linestyle=\"None\")\n",
|
||||
"pyplot.plot(x,result.best_fit, color='r')\n",
|
||||
"pyplot.show()\n",
|
||||
"print(result.fit_report())"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Try spline"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"code_folding": [
|
||||
2
|
||||
]
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from scipy.signal import savgol_filter\n",
|
||||
"\n",
|
||||
"def get_data():\n",
|
||||
" site_id='NARRA0018'\n",
|
||||
" profile_type='prestorm'\n",
|
||||
" df_profile = df_profiles.query(\"site_id == '{}' and profile_type == '{}'\".format(site_id, profile_type))\n",
|
||||
" x = np.array(df_profile.index.get_level_values('x').tolist())\n",
|
||||
" z = np.array(df_profile.z.tolist()) \n",
|
||||
"\n",
|
||||
" nan_mask = ma.masked_invalid(z).mask\n",
|
||||
" x = x[~nan_mask]\n",
|
||||
" z = z[~nan_mask]\n",
|
||||
" return x,z\n",
|
||||
"\n",
|
||||
"x,z = get_data()\n",
|
||||
"\n",
|
||||
"z_filtered = savgol_filter(z, 31, 3)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"pyplot.figure()\n",
|
||||
"pyplot.plot(x,z, color='0.5',marker='.', alpha=.2, ms=10,linestyle=\"None\")\n",
|
||||
"pyplot.plot(x,z_filtered, color='r')\n",
|
||||
"pyplot.show()\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.7"
|
||||
},
|
||||
"toc": {
|
||||
"base_numbering": 1,
|
||||
"nav_menu": {},
|
||||
"number_sections": true,
|
||||
"sideBar": true,
|
||||
"skip_h1_title": false,
|
||||
"title_cell": "Table of Contents",
|
||||
"title_sidebar": "Contents",
|
||||
"toc_cell": false,
|
||||
"toc_position": {},
|
||||
"toc_section_display": true,
|
||||
"toc_window_display": false
|
||||
},
|
||||
"varInspector": {
|
||||
"cols": {
|
||||
"lenName": 16,
|
||||
"lenType": 16,
|
||||
"lenVar": 40
|
||||
},
|
||||
"kernels_config": {
|
||||
"python": {
|
||||
"delete_cmd_postfix": "",
|
||||
"delete_cmd_prefix": "del ",
|
||||
"library": "var_list.py",
|
||||
"varRefreshCmd": "print(var_dic_list())"
|
||||
},
|
||||
"r": {
|
||||
"delete_cmd_postfix": ") ",
|
||||
"delete_cmd_prefix": "rm(",
|
||||
"library": "var_list.r",
|
||||
"varRefreshCmd": "cat(var_dic_list()) "
|
||||
}
|
||||
},
|
||||
"types_to_exclude": [
|
||||
"module",
|
||||
"function",
|
||||
"builtin_function_or_method",
|
||||
"instance",
|
||||
"_Feature"
|
||||
],
|
||||
"window_display": false
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
File diff suppressed because it is too large
@ -1,242 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Profile picker"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {
|
||||
"heading_collapsed": true
|
||||
},
|
||||
"source": [
|
||||
"## Setup notebook"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"hidden": true
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Enable autoreloading of our modules. \n",
|
||||
"# Most of the code will be located in the /src/ folder, \n",
|
||||
"# and then called from the notebook.\n",
|
||||
"%matplotlib inline\n",
|
||||
"%reload_ext autoreload\n",
|
||||
"%autoreload"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"hidden": true
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from IPython.core.debugger import set_trace\n",
|
||||
"\n",
|
||||
"import pandas as pd\n",
|
||||
"import numpy as np\n",
|
||||
"import os\n",
|
||||
"import decimal\n",
|
||||
"import plotly\n",
|
||||
"import plotly.graph_objs as go\n",
|
||||
"import plotly.plotly as py\n",
|
||||
"import plotly.tools as tls\n",
|
||||
"import plotly.figure_factory as ff\n",
|
||||
"from plotly import tools\n",
|
||||
"import plotly.io as pio\n",
|
||||
"from scipy import stats\n",
|
||||
"import math\n",
|
||||
"import matplotlib\n",
|
||||
"from matplotlib import cm\n",
|
||||
"import colorlover as cl\n",
|
||||
"import numpy.ma as ma\n",
|
||||
"\n",
|
||||
"from ipywidgets import widgets, Output\n",
|
||||
"from IPython.display import display, clear_output, Image, HTML\n",
|
||||
"\n",
|
||||
"from sklearn.metrics import confusion_matrix\n",
|
||||
"\n",
|
||||
|
||||
"from matplotlib import pyplot as plt\n",
|
||||
"\n",
|
||||
"from sklearn import linear_model, datasets\n",
|
||||
"\n",
|
||||
"from scipy.interpolate import UnivariateSpline\n",
|
||||
"from scipy.interpolate import interp1d\n",
|
||||
"from scipy.interpolate import splrep, splev\n",
|
||||
"from scipy.integrate import simps\n",
|
||||
"from scipy.stats import linregress\n",
|
||||
"from scipy.signal import find_peaks\n",
|
||||
"import json"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"hidden": true
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Matplot lib default settings\n",
|
||||
"plt.rcParams[\"figure.figsize\"] = (10,6)\n",
|
||||
"plt.rcParams['axes.grid']=True\n",
|
||||
"plt.rcParams['grid.alpha'] = 0.5\n",
|
||||
"plt.rcParams['grid.color'] = \"grey\"\n",
|
||||
"plt.rcParams['grid.linestyle'] = \"--\"\n",
|
||||
"plt.rcParams['axes.grid']=True\n",
|
||||
"\n",
|
||||
"# https://stackoverflow.com/a/20709149\n",
|
||||
"matplotlib.rcParams['text.usetex'] = True\n",
|
||||
"\n",
|
||||
"matplotlib.rcParams['text.latex.preamble'] = [\n",
|
||||
" r'\\usepackage{siunitx}', # i need upright \\micro symbols, but you need...\n",
|
||||
" r'\\sisetup{detect-all}', # ...this to force siunitx to actually use your fonts\n",
|
||||
" r'\\usepackage{helvet}', # set the normal font here\n",
|
||||
" r'\\usepackage{amsmath}',\n",
|
||||
" r'\\usepackage{sansmath}', # load up the sansmath so that math -> helvet\n",
|
||||
" r'\\sansmath', # <- tricky! -- gotta actually tell tex to use!\n",
|
||||
"] "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Import data\n",
|
||||
"Let's first import data from our pre-processed interim data folder."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def df_from_csv(csv, index_col, data_folder='../data/interim'):\n",
|
||||
" print('Importing {}'.format(csv))\n",
|
||||
" return pd.read_csv(os.path.join(data_folder,csv), index_col=index_col)\n",
|
||||
"\n",
|
||||
"df_profiles = df_from_csv('profiles.csv', index_col=[0, 1, 2])\n",
|
||||
"df_profile_features_crest_toes = df_from_csv('profile_features_crest_toes.csv', index_col=[0,1])\n",
|
||||
"\n",
|
||||
"print('Done!')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Manually pick features"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"%matplotlib notebook\n",
|
||||
"\n",
|
||||
"sites = df_profiles.index.get_level_values('site_id').unique()\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"fig =plt.figure(figsize=(10, 3))\n",
|
||||
"\n",
|
||||
"df_prestorm = df_profiles.xs((sites[0],'prestorm'),level=('site_id','profile_type'))\n",
"df_poststorm = df_profiles.xs((sites[0],'poststorm'),level=('site_id','profile_type'))\n",
"line_prestorm, = plt.plot(df_prestorm.index, df_prestorm.z, label='prestorm')\n",
"line_poststorm, = plt.plot(df_poststorm.index, df_poststorm.z, label='poststorm')\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# df_profiles.xs((sites[0],'prestorm'),level=('site_id','profile_type'))"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"hide_input": false,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.6"
|
||||
},
|
||||
"toc": {
|
||||
"base_numbering": 1,
|
||||
"nav_menu": {},
|
||||
"number_sections": true,
|
||||
"sideBar": true,
|
||||
"skip_h1_title": false,
|
||||
"title_cell": "Table of Contents",
|
||||
"title_sidebar": "Contents",
|
||||
"toc_cell": false,
|
||||
"toc_position": {},
|
||||
"toc_section_display": true,
|
||||
"toc_window_display": false
|
||||
},
|
||||
"varInspector": {
|
||||
"cols": {
|
||||
"lenName": 16,
|
||||
"lenType": 16,
|
||||
"lenVar": 40
|
||||
},
|
||||
"kernels_config": {
|
||||
"python": {
|
||||
"delete_cmd_postfix": "",
|
||||
"delete_cmd_prefix": "del ",
|
||||
"library": "var_list.py",
|
||||
"varRefreshCmd": "print(var_dic_list())"
|
||||
},
|
||||
"r": {
|
||||
"delete_cmd_postfix": ") ",
|
||||
"delete_cmd_prefix": "rm(",
|
||||
"library": "var_list.r",
|
||||
"varRefreshCmd": "cat(var_dic_list()) "
|
||||
}
|
||||
},
|
||||
"types_to_exclude": [
|
||||
"module",
|
||||
"function",
|
||||
"builtin_function_or_method",
|
||||
"instance",
|
||||
"_Feature"
|
||||
],
|
||||
"window_display": false
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
File diff suppressed because it is too large
@ -1,614 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Check change in mean slope\n",
|
||||
"- Check the effect of changes in prestorm and poststorm mean slope.\n",
|
||||
"- If there is a large berm, the prestorm mean slope (between dune toe and MHW) could be too small, and underpredict wave runup and TWL.\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Setup notebook"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Enable autoreloading of our modules. \n",
|
||||
"# Most of the code will be located in the /src/ folder, \n",
|
||||
"# and then called from the notebook.\n",
|
||||
"%matplotlib inline\n",
|
||||
"%reload_ext autoreload\n",
|
||||
"%autoreload"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from IPython.core.debugger import set_trace\n",
|
||||
"\n",
|
||||
"import pandas as pd\n",
|
||||
"import numpy as np\n",
|
||||
"import os\n",
|
||||
"\n",
|
||||
"import plotly\n",
|
||||
"import plotly.graph_objs as go\n",
|
||||
"import plotly.plotly as py\n",
|
||||
"import plotly.tools as tools\n",
|
||||
"import plotly.figure_factory as ff\n",
|
||||
"import plotly.io as pio\n",
|
||||
"\n",
|
||||
"import itertools\n",
|
||||
"\n",
|
||||
"import matplotlib\n",
|
||||
"from matplotlib import cm\n",
|
||||
"import colorlover as cl\n",
|
||||
"\n",
|
||||
"from ipywidgets import widgets, Output\n",
|
||||
"from IPython.display import display, clear_output, Image, HTML\n",
|
||||
"import matplotlib.pyplot as plt\n",
|
||||
"from sklearn.metrics import confusion_matrix"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Matplot lib default settings\n",
|
||||
"plt.rcParams[\"figure.figsize\"] = (10,6)\n",
|
||||
"plt.rcParams['axes.grid']=True\n",
|
||||
"plt.rcParams['grid.alpha'] = 0.5\n",
|
||||
"plt.rcParams['grid.color'] = \"grey\"\n",
|
||||
"plt.rcParams['grid.linestyle'] = \"--\"\n",
|
||||
"plt.rcParams['axes.grid']=True\n",
|
||||
"\n",
|
||||
"# https://stackoverflow.com/a/20709149\n",
|
||||
"matplotlib.rcParams['text.usetex'] = True\n",
|
||||
"\n",
|
||||
"matplotlib.rcParams['text.latex.preamble'] = [\n",
|
||||
" r'\\usepackage{siunitx}', # i need upright \\micro symbols, but you need...\n",
|
||||
" r'\\sisetup{detect-all}', # ...this to force siunitx to actually use your fonts\n",
|
||||
" r'\\usepackage{helvet}', # set the normal font here\n",
|
||||
" r'\\usepackage{amsmath}',\n",
|
||||
" r'\\usepackage{sansmath}', # load up the sansmath so that math -> helvet\n",
|
||||
" r'\\sansmath', # <- tricky! -- gotta actually tell tex to use!\n",
|
||||
"] "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Import data\n",
|
||||
"Import our data into pandas Dataframes for the analysis. Data files are `.csv` files which are stored in the `./data/interim/` folder."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def df_from_csv(csv, index_col, data_folder='../data/interim'):\n",
|
||||
" print('Importing {}'.format(csv))\n",
|
||||
" return pd.read_csv(os.path.join(data_folder,csv), index_col=index_col)\n",
|
||||
"\n",
|
||||
"df_waves = df_from_csv('waves.csv', index_col=[0, 1])\n",
|
||||
"df_tides = df_from_csv('tides.csv', index_col=[0, 1])\n",
|
||||
"df_profiles = df_from_csv('profiles.csv', index_col=[0, 1, 2])\n",
|
||||
"df_sites = df_from_csv('sites.csv', index_col=[0])\n",
|
||||
"df_profile_features_crest_toes = df_from_csv('profile_features_crest_toes.csv', index_col=[0,1])\n",
|
||||
"\n",
|
||||
"# Note that the forecasted data sets should be in the same order for impacts and twls\n",
|
||||
"impacts = {\n",
|
||||
" 'forecasted': {\n",
|
||||
" 'foreshore_slope_sto06': df_from_csv('impacts_forecasted_foreshore_slope_sto06.csv', index_col=[0]),\n",
|
||||
" 'mean_slope_sto06': df_from_csv('impacts_forecasted_mean_slope_sto06.csv', index_col=[0]),\n",
|
||||
" },\n",
|
||||
" 'observed': df_from_csv('impacts_observed.csv', index_col=[0])\n",
|
||||
" }\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"twls = {\n",
|
||||
" 'forecasted': {\n",
|
||||
" 'foreshore_slope_sto06': df_from_csv('twl_foreshore_slope_sto06.csv', index_col=[0, 1]),\n",
|
||||
" 'mean_slope_sto06':df_from_csv('twl_mean_slope_sto06.csv', index_col=[0, 1]),\n",
|
||||
" }\n",
|
||||
"}\n",
|
||||
"print('Done!')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Plot prestorm vs poststorm mean slopes\n",
|
||||
"Prestorm slopes have already been calculated as part of the TWL forecasting, however we'll need to extract the poststorm mean slopes from our profiles at each site."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Prestorm slopes are easy as we have already calculated these as part of the TWL forecasting\n",
|
||||
"df_slopes_prestorm = twls['forecasted']['mean_slope_sto06'].groupby('site_id').head(1).reset_index().set_index(['site_id']).beta.to_frame()\n",
|
||||
"\n",
|
||||
"# Get x and z at mhw (z=0.7m) for each site\n",
|
||||
"z_mhw = 0.7\n",
|
||||
"mhw_poststorm = []\n",
|
||||
"for site, df in df_profiles.xs('poststorm', level='profile_type').groupby('site_id'):\n",
|
||||
" df = df.dropna(subset=['z'])\n",
|
||||
" df = df.iloc[(df['z']-z_mhw).abs().argsort().head(1)].reset_index()\n",
|
||||
" df = df.iloc[0]\n",
|
||||
" mhw_poststorm.append({\n",
|
||||
" 'site_id': df.site_id,\n",
|
||||
" 'x_mhw': df.x,\n",
|
||||
" 'z_mhw': df.z\n",
|
||||
" })\n",
|
||||
"df_mhw_poststorm = pd.DataFrame(mhw_poststorm)\n",
|
||||
"df_mhw_poststorm = df_mhw_poststorm.set_index('site_id')\n",
|
||||
"\n",
|
||||
"# Get x and z at poststorm dune toe for each site\n",
|
||||
"df_dune_toe_poststorm = df_profile_features_crest_toes.xs('poststorm', level='profile_type')[['dune_toe_x','dune_toe_z']].copy()\n",
|
||||
"\n",
|
||||
"# If there is no poststorm dune toe defined, use the dune crest\n",
|
||||
"df_dune_crest_poststorm = df_profile_features_crest_toes.xs('poststorm', level='profile_type')[['dune_crest_x','dune_crest_z']]\n",
|
||||
"df_dune_toe_poststorm['dune_toe_x'] = df_dune_toe_poststorm.dune_toe_x.fillna(df_dune_crest_poststorm.dune_crest_x)\n",
"df_dune_toe_poststorm['dune_toe_z'] = df_dune_toe_poststorm.dune_toe_z.fillna(df_dune_crest_poststorm.dune_crest_z)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"# Join df for mhw and dune toe\n",
|
||||
"df = df_mhw_poststorm.join(df_dune_toe_poststorm)\n",
|
||||
"df['beta'] = -(df['dune_toe_z'] - df['z_mhw']) / (df['dune_toe_x'] -df['x_mhw'])\n",
|
||||
"df_slopes_poststorm = df['beta'].to_frame()\n",
|
||||
"\n",
|
||||
"# Count how many nans\n",
|
||||
"print('Number of nans: {}'.format(df_slopes_poststorm.beta.isna().sum()))\n",
|
||||
"\n",
|
||||
"# Display dataframe\n",
|
||||
"print('df_slopes_poststorm:')\n",
|
||||
"df_slopes_poststorm"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Now, let's join our poststorm slopes, prestorm slopes, and observed and forecasted impacts into one dataframe to make plotting easier."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"dfs = [df_slopes_poststorm.rename(columns={'beta':'poststorm_beta'}),\n",
|
||||
" df_slopes_prestorm.rename(columns={'beta':'prestorm_beta'}),\n",
|
||||
" impacts['observed']['storm_regime'].to_frame().rename(columns={'storm_regime': 'observed_regime'}),\n",
|
||||
" impacts['forecasted']['mean_slope_sto06']['storm_regime'].to_frame().rename(columns={'storm_regime': 'forecasted_regime'})\n",
|
||||
" ]\n",
|
||||
"\n",
|
||||
"df = pd.concat(dfs, axis='columns')\n",
|
||||
"df"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"We should also add the change in beach width between the prestorm and poststorm profiles."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"ele = 0.7\n",
|
||||
"data = []\n",
|
||||
"for site_id, df_site in df_profiles.groupby('site_id'):\n",
|
||||
" \n",
|
||||
"    # Beach width should be measured from the dune toe (or the dune crest, if no toe exists) to MHW\n",
|
||||
" \n",
|
||||
" dune_toe_x = np.nanmax([\n",
|
||||
" df_profile_features_crest_toes.loc[(site_id,'prestorm')].dune_crest_x,\n",
|
||||
" df_profile_features_crest_toes.loc[(site_id,'prestorm')].dune_toe_x\n",
|
||||
" ])\n",
|
||||
" \n",
|
||||
" \n",
|
||||
" # TODO This probably should take the closest value to ele starting from the seaward end of the profile\n",
|
||||
" temp = df_site.xs('prestorm',level='profile_type').dropna(subset=['z'])\n",
|
||||
" prestorm_width = temp.iloc[(temp.z - ele).abs().argsort()[0]].name[1] - dune_toe_x\n",
|
||||
" \n",
|
||||
" temp = df_site.xs('poststorm',level='profile_type').dropna(subset=['z'])\n",
|
||||
" poststorm_width = temp.iloc[(temp.z - ele).abs().argsort()[0]].name[1] - dune_toe_x\n",
|
||||
" \n",
|
||||
" width_change = prestorm_width - poststorm_width\n",
|
||||
" data.append(\n",
|
||||
" {\n",
|
||||
" 'site_id': site_id,\n",
|
||||
" 'width_change': width_change,\n",
|
||||
" 'prestorm_width': prestorm_width,\n",
|
||||
" 'poststorm_width': poststorm_width\n",
|
||||
" })\n",
|
||||
"    \n",
|
||||
"df_width_change = pd.DataFrame(data)\n",
|
||||
"df_width_change = df_width_change.set_index(['site_id'])\n",
|
||||
"\n",
|
||||
"# Join with the data\n",
|
||||
"df = df.merge(df_width_change, left_on=['site_id'], right_on=['site_id'])\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {
|
||||
"heading_collapsed": true
|
||||
},
|
||||
"source": [
|
||||
"## Plot our data in a confusion matrix\n",
|
||||
"Superseded by the matplotlib version below"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"hidden": true
|
||||
},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"hidden": true
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"fig = tools.make_subplots(\n",
|
||||
" rows=2,\n",
|
||||
" cols=2,\n",
|
||||
" specs=[[{}, {}], [{}, {}]],\n",
|
||||
" subplot_titles=('Swash/Swash', 'Swash/Collision', \n",
|
||||
" 'Collision/Swash', 'Collision/Collision'),\n",
|
||||
" shared_xaxes=True, shared_yaxes=True,)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"# Loop through combinations of observed/forecasted swash/collision\n",
|
||||
"data = []\n",
|
||||
"for forecasted_regime, observed_regime in itertools.product(['swash','collision'],repeat=2):\n",
|
||||
" \n",
|
||||
" # Get data for this combination \n",
|
||||
" query = 'forecasted_regime==\"{}\" & observed_regime==\"{}\"'.format(forecasted_regime, observed_regime)\n",
|
||||
" df_data = df.query(query)\n",
|
||||
" print(query)\n",
|
||||
" \n",
|
||||
" \n",
|
||||
" # Determine which subplot to plot results in\n",
|
||||
" if forecasted_regime == 'swash' and observed_regime == 'swash':\n",
|
||||
" x_col = 1\n",
|
||||
" y_col = 1\n",
|
||||
" elif forecasted_regime == 'collision' and observed_regime == 'collision':\n",
|
||||
" x_col = 2\n",
|
||||
" y_col = 2\n",
|
||||
" elif forecasted_regime == 'swash' and observed_regime == 'collision':\n",
|
||||
" x_col = 2\n",
|
||||
" y_col = 1\n",
|
||||
" elif forecasted_regime == 'collision' and observed_regime == 'swash':\n",
|
||||
" x_col = 1\n",
|
||||
" y_col = 2\n",
|
||||
" else:\n",
|
||||
" print('something went wrong')\n",
|
||||
" continue\n",
|
||||
"\n",
|
||||
" fig.append_trace(\n",
|
||||
" go.Scatter(\n",
|
||||
" x=df_data.prestorm_beta,\n",
|
||||
" y=df_data.poststorm_beta,\n",
|
||||
" text = df_data.index.tolist(),\n",
|
||||
" hoverinfo = 'text',\n",
|
||||
" mode = 'markers',\n",
|
||||
" line = dict(\n",
|
||||
" color = ('rgba(22, 22, 22, 0.2)'),\n",
|
||||
" width = 0.5,)),\n",
|
||||
" x_col,\n",
|
||||
" y_col)\n",
|
||||
"\n",
|
||||
"# layout = go.Layout(\n",
|
||||
"# xaxis=dict(domain=[0, 0.45]),\n",
|
||||
"# yaxis=dict(\n",
|
||||
"# domain=[0, 0.45],\n",
|
||||
"# type='log',\n",
|
||||
"# ),\n",
|
||||
"# xaxis2=dict(domain=[0.55, 1]),\n",
|
||||
"# xaxis4=dict(domain=[0.55, 1], anchor='y4'),\n",
|
||||
"# yaxis3=dict(\n",
|
||||
"# domain=[0.55, 1],\n",
|
||||
"# type='log',\n",
|
||||
"# ),\n",
|
||||
"# yaxis4=dict(\n",
|
||||
"# domain=[0.55, 1],\n",
|
||||
"# anchor='x4',\n",
|
||||
"# type='log',\n",
|
||||
"# ))\n",
|
||||
"\n",
|
||||
"fig['layout'].update(showlegend=False, title='Pre vs poststorm mean slopes by observed/forecasted regime', height=800,)\n",
|
||||
"\n",
|
||||
"for ax in ['yaxis','yaxis2']:\n",
|
||||
"# fig['layout'][ax]['type']='log'\n",
|
||||
" fig['layout'][ax]['range']= [0,0.2]\n",
|
||||
"\n",
|
||||
"for ax in ['xaxis', 'xaxis2']:\n",
|
||||
" fig['layout'][ax]['range']= [0,0.2]\n",
|
||||
"\n",
|
||||
"go.FigureWidget(fig)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {
|
||||
"hidden": true
|
||||
},
|
||||
"source": [
|
||||
"Looking at the above plot:\n",
|
||||
"- In general, we can see that the prestorm mean slope is flatter than the poststorm mean slope. This can be explained by the presence of prestorm berms, which flatten the prestorm mean slope; during the storm these berms are eroded, steepening the profile.\n",
|
||||
"- **Collision/Collision**: Where we observe and predict collision, we see steeper prestorm slopes. This is to be expected since larger slopes will generate more runup and higher TWLs.\n",
|
||||
"- **Swash/Collision**: Where we predict collision but observe swash, prestorm mean slopes are typically steeper than 0.1, which generates high forecast TWLs and leads to over-prediction of collision. \n",
|
||||
"\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Plot our data in a confusion matrix\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"df[cc_mask].loc[df[cc_mask].poststorm_beta+0.05< df[cc_mask].prestorm_beta]"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"f, ([ax1, ax2], [ax3, ax4],) = plt.subplots(\n",
|
||||
" 2,\n",
|
||||
" 2,\n",
|
||||
" sharey=True,\n",
|
||||
" sharex=True,\n",
|
||||
" figsize=(8, 7))\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"ss_mask = (df.observed_regime=='swash') & (df.forecasted_regime=='swash')\n",
|
||||
"sc_mask = (df.observed_regime=='swash') & (df.forecasted_regime=='collision')\n",
|
||||
"cs_mask = (df.observed_regime=='collision') & (df.forecasted_regime=='swash')\n",
|
||||
"cc_mask = (df.observed_regime=='collision') & (df.forecasted_regime=='collision')\n",
|
||||
"\n",
|
||||
"# Define colormap for our observations\n",
|
||||
"cm = plt.cm.get_cmap('plasma')\n",
|
||||
"\n",
|
||||
"params = {'edgecolors': '#999999',\n",
|
||||
" 's': 12,\n",
|
||||
" 'linewidth': 0.1, \n",
|
||||
" 'cmap':cm,\n",
|
||||
" 'vmin':0, \n",
|
||||
" 'vmax':60\n",
|
||||
" }\n",
|
||||
"\n",
|
||||
"sc=ax1.scatter(df[ss_mask].prestorm_beta, df[ss_mask].poststorm_beta, c=df[ss_mask].width_change,**params)\n",
|
||||
"ax1.set_title('Swash/Swash')\n",
|
||||
"ax1.set_ylabel('Observed swash')\n",
|
||||
"\n",
|
||||
"ax2.scatter(df[sc_mask].prestorm_beta, df[sc_mask].poststorm_beta, c=df[sc_mask].width_change,**params)\n",
|
||||
"ax2.set_title('Swash/Collision')\n",
|
||||
"\n",
|
||||
"ax3.scatter(df[cs_mask].prestorm_beta, df[cs_mask].poststorm_beta, c=df[cs_mask].width_change,**params)\n",
|
||||
"ax3.set_title('Collision/Swash')\n",
|
||||
"ax3.set_ylabel('Observed collision')\n",
|
||||
"ax3.set_xlabel('Predicted swash')\n",
|
||||
"\n",
|
||||
"ax4.scatter(df[cc_mask].prestorm_beta, df[cc_mask].poststorm_beta, c=df[cc_mask].width_change,**params)\n",
|
||||
"ax4.set_title('Collision/Collision')\n",
|
||||
"ax4.set_xlabel('Predicted collision')\n",
|
||||
"\n",
|
||||
"for ax in [ax1,ax2,ax3,ax4]:\n",
|
||||
" ax.plot([0,0.2],[0,0.2], 'k--')\n",
|
||||
" ax.set_xlim([0,0.2])\n",
|
||||
" ax.set_ylim([0,0.2])\n",
|
||||
"\n",
|
||||
" \n",
|
||||
"# Create a big ax so we can use common axis labels\n",
|
||||
"# https://stackoverflow.com/a/36542971\n",
|
||||
"f.add_subplot(111, frameon=False)\n",
|
||||
"plt.tick_params(labelcolor='none', top=False, bottom=False, left=False, right=False)\n",
|
||||
"plt.grid(False)\n",
|
||||
"plt.xlabel(\"Prestorm mean slope (-)\", labelpad=25)\n",
|
||||
"plt.ylabel(\"Poststorm mean slope (-)\", labelpad=25)\n",
|
||||
" \n",
|
||||
"# Layout adjustment\n",
|
||||
"plt.tight_layout()\n",
|
||||
"plt.subplots_adjust(hspace=0.25, bottom=0.1,right=0.9)\n",
|
||||
"\n",
|
||||
"# Add colorbar\n",
|
||||
"cbar_ax = f.add_axes([0.95, 0.15, 0.05, 0.7])\n",
|
||||
"cb = f.colorbar(sc, cax=cbar_ax)\n",
|
||||
"cb.set_label(r'$\\varDelta$ beach width at MHW (m)')\n",
|
||||
"\n",
|
||||
"# Save and show figure\n",
|
||||
"plt.savefig('06-confusion-change-in-slope.png', dpi=600, bbox_inches='tight') \n",
|
||||
"plt.show()\n",
|
||||
"plt.close()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Plot for single beach"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"beach = 'NARRA'\n",
|
||||
"\n",
|
||||
"df_beach = df.loc[df.index.str.contains(beach)]\n",
|
||||
"\n",
|
||||
"# Get index\n",
|
||||
"n = list(range(len(df_beach)))[::-1]\n",
"n_sites = df_beach.index.tolist()[::-1]\n",
|
||||
"\n",
|
||||
"f, (ax1,ax2,ax3,ax4) = plt.subplots(1,4, sharey=True,figsize=(10, 8))\n",
|
||||
"\n",
|
||||
"ax1.plot(df_beach.prestorm_beta,n,label='Prestorm slope',color='#4d9221')\n",
|
||||
"ax1.plot(df_beach.poststorm_beta,n,label='Poststorm slope',color='#c51b7d')\n",
|
||||
"ax1.set_title('Mean beach slope')\n",
|
||||
"ax1.legend(loc='center', bbox_to_anchor=(0.5, -0.15))\n",
|
||||
"\n",
|
||||
"# Replace yticks with site_ids\n",
|
||||
"yticks = ax1.get_yticks().tolist()\n",
|
||||
"yticks = [n_sites[int(y)] if 0 <= y < len(n_sites) else y for y in yticks]\n",
|
||||
"ax1.set_yticklabels(yticks)\n",
|
||||
"ax1.set_xlabel(r'Slope (-)')\n",
|
||||
"\n",
|
||||
"ax2.plot(df_beach.prestorm_width,n,label='Prestorm width',color='#4d9221')\n",
|
||||
"ax2.plot(df_beach.poststorm_width,n, label='Poststorm width',color='#c51b7d')\n",
|
||||
"# ax2.set_xlim([200,300])\n",
|
||||
"ax2.set_xlabel(r'Beach width (m)')\n",
|
||||
"ax2.set_title('Beach width\\nat MHW')\n",
|
||||
"ax2.legend(loc='center', bbox_to_anchor=(0.5, -0.15))\n",
|
||||
"\n",
|
||||
"ax3.plot(df_beach.width_change,n,color='#999999')\n",
|
||||
"ax3.set_xlim([0,75])\n",
|
||||
"ax3.set_title('Change in MHW\\nbeach width')\n",
|
||||
"ax3.set_xlabel(r'$\\varDelta$ Beach width (m)')\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"ax4.plot(df_beach.poststorm_beta / df_beach.prestorm_beta,n,color='#999999')\n",
|
||||
"ax4.set_title('Ratio between pre and\\npost storm mean slopes')\n",
|
||||
"\n",
|
||||
"plt.tight_layout()\n",
|
||||
"f.subplots_adjust(top=0.88)\n",
|
||||
"f.suptitle(beach)\n",
|
||||
"\n",
|
||||
"# Print to figure\n",
|
||||
"plt.savefig('06-change-in-slope-{}.png'.format(beach), dpi=600, bbox_inches='tight') \n",
|
||||
"plt.show()\n",
|
||||
"plt.close()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"df_beach"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"hide_input": false,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.6"
|
||||
},
|
||||
"toc": {
|
||||
"base_numbering": 1,
|
||||
"nav_menu": {},
|
||||
"number_sections": true,
|
||||
"sideBar": true,
|
||||
"skip_h1_title": false,
|
||||
"title_cell": "Table of Contents",
|
||||
"title_sidebar": "Contents",
|
||||
"toc_cell": false,
|
||||
"toc_position": {},
|
||||
"toc_section_display": true,
|
||||
"toc_window_display": false
|
||||
},
|
||||
"varInspector": {
|
||||
"cols": {
|
||||
"lenName": 16,
|
||||
"lenType": 16,
|
||||
"lenVar": 40
|
||||
},
|
||||
"kernels_config": {
|
||||
"python": {
|
||||
"delete_cmd_postfix": "",
|
||||
"delete_cmd_prefix": "del ",
|
||||
"library": "var_list.py",
|
||||
"varRefreshCmd": "print(var_dic_list())"
|
||||
},
|
||||
"r": {
|
||||
"delete_cmd_postfix": ") ",
|
||||
"delete_cmd_prefix": "rm(",
|
||||
"library": "var_list.r",
|
||||
"varRefreshCmd": "cat(var_dic_list()) "
|
||||
}
|
||||
},
|
||||
"types_to_exclude": [
|
||||
"module",
|
||||
"function",
|
||||
"builtin_function_or_method",
|
||||
"instance",
|
||||
"_Feature"
|
||||
],
|
||||
"window_display": false
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
@ -1,767 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Narrabeen Slope Test\n",
|
||||
"With full topo and bathy combined"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Setup notebook"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Enable autoreloading of our modules. \n",
|
||||
"# Most of the code will be located in the /src/ folder, \n",
|
||||
"# and then called from the notebook.\n",
|
||||
"%matplotlib inline\n",
|
||||
"%reload_ext autoreload\n",
|
||||
"%autoreload"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from IPython.core.debugger import set_trace\n",
|
||||
"\n",
|
||||
"import pandas as pd\n",
|
||||
"import numpy as np\n",
|
||||
"import os\n",
|
||||
"import decimal\n",
|
||||
"import plotly\n",
|
||||
"import plotly.graph_objs as go\n",
|
||||
"import plotly.plotly as py\n",
|
||||
"import plotly.tools as tls\n",
|
||||
"import plotly.figure_factory as ff\n",
|
||||
"from plotly import tools\n",
|
||||
"import plotly.io as pio\n",
|
||||
"from scipy import stats\n",
|
||||
"import math\n",
|
||||
"import matplotlib\n",
|
||||
"from matplotlib import cm\n",
|
||||
"import colorlover as cl\n",
|
||||
"from tqdm import tqdm_notebook\n",
|
||||
"from ipywidgets import widgets, Output\n",
|
||||
"from IPython.display import display, clear_output, Image, HTML\n",
|
||||
"from sklearn.metrics import confusion_matrix\n",
|
||||
"import matplotlib.pyplot as plt\n",
|
||||
"from scipy.interpolate import interp1d\n",
|
||||
"from pandas.api.types import CategoricalDtype\n",
|
||||
"from scipy.interpolate import UnivariateSpline\n",
|
||||
"from shapely.geometry import Point, LineString"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Matplot lib default settings\n",
|
||||
"plt.rcParams[\"figure.figsize\"] = (10,6)\n",
|
||||
"plt.rcParams['axes.grid']=True\n",
|
||||
"plt.rcParams['grid.alpha'] = 0.5\n",
|
||||
"plt.rcParams['grid.color'] = \"grey\"\n",
|
||||
"plt.rcParams['grid.linestyle'] = \"--\"\n",
|
||||
"plt.rcParams['axes.grid']=True\n",
|
||||
"\n",
|
||||
"# https://stackoverflow.com/a/20709149\n",
|
||||
"matplotlib.rcParams['text.usetex'] = True\n",
|
||||
"\n",
|
||||
"matplotlib.rcParams['text.latex.preamble'] = [\n",
|
||||
" r'\\usepackage{siunitx}', # i need upright \\micro symbols, but you need...\n",
|
||||
" r'\\sisetup{detect-all}', # ...this to force siunitx to actually use your fonts\n",
|
||||
" r'\\usepackage{helvet}', # set the normal font here\n",
|
||||
" r'\\usepackage{amsmath}',\n",
|
||||
" r'\\usepackage{sansmath}', # load up the sansmath so that math -> helvet\n",
|
||||
" r'\\sansmath', # <- tricky! -- gotta actually tell tex to use!\n",
|
||||
"] "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Import .csv data"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"data_filename = '08-narr-topo-bathy-slope-test-full-profiles.csv'\n",
|
||||
"\n",
|
||||
"df_profiles = pd.read_csv(data_filename).set_index(['site_id','x'])\n",
|
||||
"df_profiles = df_profiles[~df_profiles.index.duplicated(keep='first')]\n",
|
||||
"print('df_profiles:')\n",
|
||||
"df_profiles.head()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Manually cut off the prestorm and poststorm topo at the x limits defined below\n",
|
||||
"cuts = {'NARRA0004': {'prestorm_topo_max_x': 330,\n",
|
||||
" 'poststorm_topo_max_x': 250},\n",
|
||||
" 'NARRA0008': {'prestorm_topo_max_x': 290,\n",
|
||||
" 'poststorm_topo_max_x': 250},\n",
|
||||
" 'NARRA0012': {'prestorm_topo_max_x': 300,\n",
|
||||
" 'poststorm_topo_max_x': 250},\n",
|
||||
" 'NARRA0016': {'prestorm_topo_max_x': 300,\n",
|
||||
" 'poststorm_topo_max_x': 225},\n",
|
||||
" 'NARRA0021': {'prestorm_topo_max_x': 280,\n",
|
||||
" 'poststorm_topo_max_x': 225},\n",
|
||||
" 'NARRA0023': {'prestorm_topo_max_x': 275,\n",
|
||||
" 'poststorm_topo_max_x': 215},\n",
|
||||
" 'NARRA0027': {'prestorm_topo_max_x': 260,\n",
|
||||
" 'poststorm_topo_max_x': 225},\n",
|
||||
" 'NARRA0031': {'prestorm_topo_max_x': 260,\n",
|
||||
" 'poststorm_topo_max_x': 225},\n",
|
||||
" }\n",
|
||||
"\n",
|
||||
"for site_id in cuts:\n",
|
||||
" mask1 = df_profiles.index.get_level_values('site_id') == site_id\n",
|
||||
" mask2 = df_profiles.index.get_level_values('x') > cuts[site_id]['prestorm_topo_max_x']\n",
|
||||
" df_profiles.loc[(mask1)&(mask2), 'pre_topo'] = np.nan\n",
|
||||
" \n",
|
||||
" mask3 = df_profiles.index.get_level_values('x') > cuts[site_id]['poststorm_topo_max_x']\n",
|
||||
" df_profiles.loc[(mask1)&(mask3), 'post_topo'] = np.nan\n",
|
||||
"\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# for site_id,df_site in df_profiles.groupby('site_id'):\n",
|
||||
"# f, (ax1) = plt.subplots(1,1, figsize=(6, 3))\n",
|
||||
"# ax1.set_title(site_id)\n",
|
||||
" \n",
|
||||
"# ax1.plot(df_site.index.get_level_values('x'),\n",
|
||||
"# df_site.pre_topo,\n",
|
||||
"# label='Pre Topo',\n",
|
||||
"# color='#2c7bb6')\n",
|
||||
"# ax1.plot(df_site.index.get_level_values('x'),\n",
|
||||
"# df_site.pre_bathy,\n",
|
||||
"# label='Pre Bathy',\n",
|
||||
"# color='#abd9e9')\n",
|
||||
"\n",
|
||||
"# ax1.plot(df_site.index.get_level_values('x'),\n",
|
||||
"# df_site.post_topo,\n",
|
||||
"# label='Post Topo',\n",
|
||||
"# color='#d7191c')\n",
|
||||
"# ax1.plot(df_site.index.get_level_values('x'),\n",
|
||||
"# df_site.post_bathy,\n",
|
||||
"# label='Post Bathy',\n",
|
||||
"# color='#fdae61')\n",
|
||||
"\n",
|
||||
"# ax1.legend()\n",
|
||||
"# plt.show()\n",
|
||||
"# plt.close()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"df_profiles = df_profiles.dropna(\n",
|
||||
" subset=['post_topo', 'post_bathy', 'pre_bathy', 'pre_topo'], how='all')\n",
|
||||
"\n",
|
||||
"df_profiles = df_profiles.reset_index()\n",
|
||||
"df_profiles = df_profiles.melt(id_vars=['site_id','x','lat','lon'],\n",
|
||||
" value_vars=['post_topo','post_bathy','pre_bathy','pre_topo']).rename(columns={'variable':'profile_type', 'value':'z'})\n",
|
||||
"\n",
|
||||
"df_profiles = df_profiles.dropna(subset=['z'])\n",
|
||||
"\n",
|
||||
"df_profiles.loc[df_profiles.profile_type=='post_topo','profile_type']='poststorm'\n",
|
||||
"df_profiles.loc[df_profiles.profile_type=='post_bathy','profile_type']='poststorm'\n",
|
||||
"df_profiles.loc[df_profiles.profile_type=='pre_topo','profile_type']='prestorm'\n",
|
||||
"df_profiles.loc[df_profiles.profile_type=='pre_bathy','profile_type']='prestorm'\n",
|
||||
"\n",
|
||||
"df_profiles = df_profiles.set_index(['site_id', 'profile_type', 'x'])\n",
|
||||
"df_profiles = df_profiles[~df_profiles.index.duplicated(keep='first')]\n",
|
||||
"\n",
|
||||
"df_profiles = df_profiles.sort_index()\n",
|
||||
"\n",
|
||||
"print('df_profiles:')\n",
|
||||
"df_profiles.head()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Just plots each site's x and z values\n",
|
||||
"for site_id,df_site in df_profiles.groupby('site_id'):\n",
|
||||
" f, (ax1) = plt.subplots(1,1, figsize=(6, 3))\n",
|
||||
" ax1.set_title(site_id)\n",
|
||||
" \n",
|
||||
" prestorm=df_site.index.get_level_values('profile_type') == 'prestorm'\n",
|
||||
" ax1.plot(df_site[prestorm].index.get_level_values('x'),\n",
|
||||
" df_site[prestorm].z,\n",
|
||||
"             label='Prestorm',\n",
|
||||
" color='#2c7bb6')\n",
|
||||
"\n",
|
||||
" \n",
|
||||
" poststorm=df_site.index.get_level_values('profile_type') == 'poststorm'\n",
|
||||
" ax1.plot(df_site[poststorm].index.get_level_values('x'),\n",
|
||||
" df_site[poststorm].z,\n",
|
||||
"             label='Poststorm',\n",
|
||||
" color='#d7191c')\n",
|
||||
"\n",
|
||||
"\n",
|
||||
" ax1.legend()\n",
|
||||
" plt.show()\n",
|
||||
" plt.close()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Get dune faces"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"code_folding": []
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Manually define dune x coordinates and work out slope\n",
|
||||
"\n",
|
||||
"dune_data = [\n",
|
||||
" {\n",
|
||||
" 'site_id': 'NARRA0004',\n",
|
||||
" 'dune_crest_x': 180,\n",
|
||||
" 'dune_toe_x': 205\n",
|
||||
" },\n",
|
||||
" {\n",
|
||||
" 'site_id': 'NARRA0008',\n",
|
||||
" 'dune_crest_x': 180,\n",
|
||||
" 'dune_toe_x': 205\n",
|
||||
" },\n",
|
||||
" {\n",
|
||||
" 'site_id': 'NARRA0012',\n",
|
||||
" 'dune_crest_x': 195,\n",
|
||||
" 'dune_toe_x': 205\n",
|
||||
" },\n",
|
||||
" {\n",
|
||||
" 'site_id': 'NARRA0016',\n",
|
||||
" 'dune_crest_x': 190,\n",
|
||||
" 'dune_toe_x': 200\n",
|
||||
" },\n",
|
||||
" {\n",
|
||||
" 'site_id': 'NARRA0021',\n",
|
||||
" 'dune_crest_x': 205,\n",
|
||||
" 'dune_toe_x': 210\n",
|
||||
" },\n",
|
||||
" {\n",
|
||||
" 'site_id': 'NARRA0023',\n",
|
||||
" 'dune_crest_x': 205,\n",
|
||||
" 'dune_toe_x': 215\n",
|
||||
" },\n",
|
||||
" {\n",
|
||||
" 'site_id': 'NARRA0027',\n",
|
||||
" 'dune_crest_x': 210,\n",
|
||||
" 'dune_toe_x': 219\n",
|
||||
" },\n",
|
||||
" {\n",
|
||||
" 'site_id': 'NARRA0031',\n",
|
||||
" 'dune_crest_x': 210,\n",
|
||||
" 'dune_toe_x': 218\n",
|
||||
" },\n",
|
||||
"]\n",
|
||||
"\n",
|
||||
"for site_dune in dune_data:\n",
|
||||
" df_site = df_profiles.xs(site_dune['site_id'], level='site_id').xs('prestorm',level='profile_type')\n",
|
||||
" \n",
|
||||
" dune_crest_x = site_dune['dune_crest_x']\n",
|
||||
" dune_toe_x = site_dune['dune_toe_x']\n",
|
||||
" dune_crest_z = df_site.iloc[df_site.index.get_loc(site_dune['dune_crest_x'],method='nearest')].z\n",
|
||||
" dune_toe_z = df_site.iloc[df_site.index.get_loc(site_dune['dune_toe_x'],method='nearest')].z\n",
|
||||
"\n",
|
||||
" dune_slope = (dune_crest_z - dune_toe_z)/(dune_crest_x - dune_toe_x)\n",
|
||||
" \n",
|
||||
" site_dune['dune_crest_z'] = dune_crest_z\n",
|
||||
" site_dune['dune_toe_z'] = dune_toe_z\n",
|
||||
" site_dune['dune_slope'] = dune_slope\n",
|
||||
" \n",
|
||||
" \n",
|
||||
"# Join back into main data\n",
|
||||
"df_dunes = pd.DataFrame(dune_data).set_index('site_id')\n",
|
||||
"print('df_dunes:')\n",
|
||||
"df_dunes.head()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# # Just plots each site's x and z values\n",
|
||||
"# for site_id,df_site in df_profiles.xs('prestorm',level='profile_type').groupby('site_id'):\n",
|
||||
"# f, (ax1) = plt.subplots(1,1, figsize=(6, 3))\n",
|
||||
"# ax1.set_title(site_id)\n",
|
||||
"# ax1.plot(df_site.index.get_level_values('x'),\n",
|
||||
"# df_site.z)\n",
|
||||
"# ax1.plot([df_dunes.loc[site_id].dune_crest_x, df_dunes.loc[site_id].dune_toe_x],\n",
|
||||
"# [df_dunes.loc[site_id].dune_crest_z, df_dunes.loc[site_id].dune_toe_z],\n",
|
||||
"# 'r.-')\n",
|
||||
"# ax1.set_xlim([150,250])\n",
|
||||
"# ax1.set_ylim([0,15])\n",
|
||||
"# plt.show()\n",
|
||||
"# plt.close()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Get prestorm slope"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"z_ele = 0.7\n",
|
||||
"debug=False\n",
|
||||
"\n",
|
||||
"def find_nearest_idx(array, value):\n",
|
||||
" array = np.asarray(array)\n",
|
||||
" idx = (np.abs(array - value)).argmin()\n",
|
||||
" return idx\n",
|
||||
"\n",
|
||||
"prestorm_slope_data =[]\n",
|
||||
"for site_id, df_site in df_profiles.xs('prestorm',level='profile_type').groupby('site_id'):\n",
|
||||
" \n",
|
||||
" # Find index of our z_ele\n",
|
||||
" idx = np.where(df_site.z.values>=z_ele)[0][-1]\n",
|
||||
" \n",
|
||||
" prestorm_end_x = df_site.iloc[idx].name[1]\n",
|
||||
" prestorm_end_z = df_site.iloc[idx].z\n",
|
||||
" \n",
|
||||
" prestorm_start_x = df_dunes.loc[site_id].dune_toe_x\n",
|
||||
" prestorm_start_z = df_dunes.loc[site_id].dune_toe_z\n",
|
||||
" \n",
|
||||
" prestorm_slope = (prestorm_end_z-prestorm_start_z)/(prestorm_end_x-prestorm_start_x)\n",
|
||||
" \n",
|
||||
" prestorm_slope_data.append({\n",
|
||||
" 'site_id': site_id,\n",
|
||||
" 'prestorm_end_x': prestorm_end_x,\n",
|
||||
" 'prestorm_end_z': prestorm_end_z,\n",
|
||||
" 'prestorm_start_x': prestorm_start_x,\n",
|
||||
" 'prestorm_start_z': prestorm_start_z,\n",
|
||||
" 'prestorm_slope': prestorm_slope\n",
|
||||
" })\n",
|
||||
" \n",
|
||||
"df_prestorm_slope = pd.DataFrame(prestorm_slope_data).set_index(['site_id'])\n",
|
||||
"print('df_prestorm_slope:')\n",
|
||||
"df_prestorm_slope.head()\n",
|
||||
" "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Get shelf slope\n",
|
||||
"At the -9 m contour"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"code_folding": []
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Elevation to take shelf slope at\n",
|
||||
"z_ele = -9\n",
|
||||
"debug=False\n",
|
||||
"\n",
|
||||
"def find_nearest_idx(array, value):\n",
|
||||
" array = np.asarray(array)\n",
|
||||
" idx = (np.abs(array - value)).argmin()\n",
|
||||
" return idx\n",
|
||||
"\n",
|
||||
"def slope_at_point(x, z, z_ele,debug=False):\n",
|
||||
" # Smooth profile a bit\n",
|
||||
" # TODO the smoothing factor will change based on the number of data points\n",
|
||||
" # Need to fix\n",
|
||||
" s = UnivariateSpline(x, z, s=50)\n",
|
||||
" xs = np.linspace(min(x),max(x),1000)\n",
|
||||
" zs = s(xs)\n",
|
||||
"\n",
|
||||
" # Calculate derivates of spline\n",
|
||||
" dzdx = np.diff(zs)/np.diff(xs)\n",
|
||||
"\n",
|
||||
" # Find index of z_ele\n",
|
||||
" idx = find_nearest_idx(zs, z_ele)\n",
|
||||
" slope = dzdx[idx]\n",
|
||||
" shelf_x = xs[idx]\n",
|
||||
"\n",
|
||||
"\n",
|
||||
" \n",
|
||||
" # For checking how much smoothing is going on\n",
|
||||
" if debug:\n",
|
||||
" f, (ax1) = plt.subplots(1,1, figsize=(6, 3))\n",
|
||||
" ax1.plot(x,z)\n",
|
||||
" ax1.plot(xs,zs)\n",
|
||||
" plt.show()\n",
|
||||
" plt.close()\n",
|
||||
" \n",
|
||||
" return slope, shelf_x, z_ele\n",
|
||||
" \n",
|
||||
"shelf_data = []\n",
|
||||
"for site_id, df_site in df_profiles.xs('prestorm',level='profile_type').groupby('site_id'):\n",
|
||||
" shelf_slope, shelf_x, shelf_z = slope_at_point(df_site.index.get_level_values('x').values,\n",
|
||||
" df_site.z, \n",
|
||||
" z_ele, debug=debug)\n",
|
||||
" shelf_data.append({\n",
|
||||
" 'site_id': site_id,\n",
|
||||
" 'shelf_slope': shelf_slope,\n",
|
||||
" 'shelf_x': shelf_x,\n",
|
||||
" 'shelf_z': shelf_z\n",
|
||||
" })\n",
|
||||
" \n",
|
||||
"df_shelf = pd.DataFrame(shelf_data).set_index(['site_id'])\n",
|
||||
"\n",
|
||||
"# Manually override the shelf slope at NARRA0004\n",
"df_shelf.loc['NARRA0004','shelf_slope'] = -0.02\n",
|
||||
"\n",
|
||||
"print('df_shelf:')\n",
|
||||
"df_shelf.head()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Intersect projected dune face and shelf slope\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"df_site"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"for site_id, df_site in df_profiles.groupby('site_id'):\n",
|
||||
"\n",
|
||||
" # Project the dune face outwards\n",
|
||||
" dune_face_toe = Point(df_dunes.loc[site_id].dune_toe_x,\n",
|
||||
" df_dunes.loc[site_id].dune_toe_z)\n",
|
||||
" dune_face_sea = Point(\n",
|
||||
" df_dunes.loc[site_id].dune_toe_x + 1000,\n",
|
||||
" # df_dunes.loc[site_id].dune_toe_z +1000 * -1\n",
|
||||
" df_dunes.loc[site_id].dune_toe_z +\n",
|
||||
" 1000 * df_dunes.loc[site_id].dune_slope)\n",
|
||||
" dune_line = LineString([dune_face_toe, dune_face_sea])\n",
|
||||
"\n",
|
||||
" # Project the shelf slope landwards\n",
|
||||
" shelf_point = Point(df_shelf.loc[site_id].shelf_x,\n",
|
||||
" df_shelf.loc[site_id].shelf_z)\n",
|
||||
" shelf_land = Point(\n",
|
||||
" df_shelf.loc[site_id].shelf_x - 1000, df_shelf.loc[site_id].shelf_z -\n",
|
||||
" 1000 * df_shelf.loc[site_id].shelf_slope)\n",
|
||||
" shelf_sea = Point(\n",
|
||||
" df_shelf.loc[site_id].shelf_x + 1000, df_shelf.loc[site_id].shelf_z +\n",
|
||||
" 1000 * df_shelf.loc[site_id].shelf_slope)\n",
|
||||
" shelf_line = LineString([shelf_land, shelf_point, shelf_sea])\n",
|
||||
"\n",
|
||||
"    # Find intersection between the two lines\n",
|
||||
" dune_shelf_int = dune_line.intersection(shelf_line)\n",
|
||||
" dist_toe_to_int = dune_face_toe.distance(dune_shelf_int)\n",
|
||||
"\n",
|
||||
" # Plots\n",
|
||||
" f, (ax1) = plt.subplots(1, 1, figsize=(12, 4))\n",
|
||||
"\n",
|
||||
" # Raw profile prestorm\n",
|
||||
" ax1.plot(\n",
|
||||
" df_site.xs('prestorm',\n",
|
||||
" level='profile_type').index.get_level_values('x'),\n",
|
||||
" df_site.xs('prestorm', level='profile_type').z,\n",
|
||||
" label='Prestorm profile')\n",
|
||||
"\n",
|
||||
" # Raw profile poststorm\n",
|
||||
" ax1.plot(\n",
|
||||
" df_site.xs('poststorm',\n",
|
||||
" level='profile_type').index.get_level_values('x'),\n",
|
||||
" df_site.xs('poststorm', level='profile_type').z,\n",
|
||||
" label='Poststorm profile')\n",
|
||||
"\n",
|
||||
" # Dune face\n",
|
||||
" ax1.plot(\n",
|
||||
" [df_dunes.loc[site_id].dune_crest_x, df_dunes.loc[site_id].dune_toe_x],\n",
|
||||
" [df_dunes.loc[site_id].dune_crest_z, df_dunes.loc[site_id].dune_toe_z],\n",
|
||||
" linestyle=':',\n",
|
||||
" color='#999999',\n",
|
||||
" label='Dune face ({:.2f})'.format(-df_dunes.loc[site_id].dune_slope))\n",
|
||||
"\n",
|
||||
" # Projected dune face\n",
|
||||
" ax1.plot(\n",
|
||||
" dune_line.xy[0],\n",
|
||||
" dune_line.xy[1],\n",
|
||||
" linestyle='--',\n",
|
||||
" color='#999999',\n",
|
||||
" label='Dune face (projected)')\n",
|
||||
"\n",
|
||||
" # Projected shelf slope\n",
|
||||
" ax1.plot(\n",
|
||||
" shelf_line.xy[0],\n",
|
||||
" shelf_line.xy[1],\n",
|
||||
" linestyle='--',\n",
|
||||
" color='#999999',\n",
|
||||
" label='Shelf slope (projected)')\n",
|
||||
"\n",
|
||||
" # Intersection\n",
|
||||
" ax1.scatter(\n",
|
||||
" dune_shelf_int.xy[0],\n",
|
||||
" dune_shelf_int.xy[1],\n",
|
||||
" marker='x',\n",
|
||||
" color='#999999',\n",
|
||||
" label='Dune/shelf projected intersection')\n",
|
||||
"\n",
|
||||
" # Prestorm slope\n",
|
||||
" ax1.plot([\n",
|
||||
" df_prestorm_slope.loc[site_id].prestorm_start_x,\n",
|
||||
" df_prestorm_slope.loc[site_id].prestorm_end_x\n",
|
||||
" ], [\n",
|
||||
" df_prestorm_slope.loc[site_id].prestorm_start_z,\n",
|
||||
" df_prestorm_slope.loc[site_id].prestorm_end_z\n",
|
||||
" ],\n",
|
||||
" color='violet',\n",
|
||||
" label='Prestorm slope ({:.2f})'.format(\n",
|
||||
" -df_prestorm_slope.loc[site_id].prestorm_slope))\n",
|
||||
"\n",
|
||||
"    # # Find best slope based on distance from toe to intersection?\n",
|
||||
" # best_slope_toe = shelf_line.interpolate(\n",
|
||||
" # shelf_line.project(intersection) - 4 * dist_toe_to_int)\n",
|
||||
" # best_slope = (dune_face_toe.xy[1][0] - best_slope_toe.xy[1][0]) / (\n",
|
||||
" # dune_face_toe.xy[0][0] - best_slope_toe.xy[0][0])\n",
|
||||
"\n",
|
||||
" # # Best slope toe\n",
|
||||
" # ax1.scatter(\n",
|
||||
" # best_slope_toe.xy[0], best_slope_toe.xy[1], marker='o', color='g')\n",
|
||||
"\n",
|
||||
" # # Best slope\n",
|
||||
" # ax1.plot([dune_face_toe.xy[0], best_slope_toe.xy[0]],\n",
|
||||
" # [dune_face_toe.xy[1], best_slope_toe.xy[1]],\n",
|
||||
" # color='g',\n",
|
||||
" # label='Best slope ({:.3f})'.format(-best_slope))\n",
|
||||
"\n",
|
||||
" # Find best slope based on intersection of prestorm slope and surf zone slope\n",
|
||||
" prestorm_slope_line = LineString([\n",
|
||||
" Point(\n",
|
||||
" df_prestorm_slope.loc[site_id].prestorm_start_x,\n",
|
||||
" df_prestorm_slope.loc[site_id].prestorm_start_z,\n",
|
||||
" ),\n",
|
||||
" Point(\n",
|
||||
" df_prestorm_slope.loc[site_id].prestorm_start_x + 10000,\n",
|
||||
" df_prestorm_slope.loc[site_id].prestorm_start_z +\n",
|
||||
" 10000 * df_prestorm_slope.loc[site_id].prestorm_slope)\n",
|
||||
" ])\n",
|
||||
"\n",
|
||||
" # Where prestorm slope projection intersects shelf line\n",
|
||||
" prestorm_slope_shelf_int = prestorm_slope_line.intersection(shelf_line)\n",
|
||||
"\n",
|
||||
" # Distance between dune/shelf intersection and prestorm/shelf intersection\n",
|
||||
" dist_shelf_prestorm_ints = prestorm_slope_shelf_int.distance(\n",
|
||||
" dune_shelf_int)\n",
|
||||
"\n",
|
||||
" best_slope_pt = shelf_line.interpolate(\n",
|
||||
" shelf_line.project(dune_shelf_int) + 0.3 * (shelf_line.project(prestorm_slope_shelf_int) -\n",
|
||||
" shelf_line.project(dune_shelf_int)))\n",
|
||||
" \n",
|
||||
"    best_slope = ((df_prestorm_slope.loc[site_id].prestorm_start_z - best_slope_pt.xy[1][0]) /\n",
"                  (df_prestorm_slope.loc[site_id].prestorm_start_x - best_slope_pt.xy[0][0]))\n",
|
||||
" \n",
|
||||
" if not prestorm_slope_shelf_int.is_empty:\n",
|
||||
" ax1.plot(\n",
|
||||
" prestorm_slope_shelf_int.xy[0],\n",
|
||||
" prestorm_slope_shelf_int.xy[1],\n",
|
||||
" marker='x',\n",
|
||||
" color='#999999',\n",
|
||||
" label='Prestorm slope/shelf\\nprojected intersection')\n",
|
||||
" ax1.plot(\n",
|
||||
" prestorm_slope_line.xy[0],\n",
|
||||
" prestorm_slope_line.xy[1],\n",
|
||||
" color='#999999',\n",
|
||||
" linestyle='--',\n",
|
||||
" label='Prestorm slope projected line')\n",
|
||||
" ax1.plot(\n",
|
||||
" [df_prestorm_slope.loc[site_id].prestorm_start_x,\n",
|
||||
" best_slope_pt.xy[0][0]],\n",
|
||||
" [df_prestorm_slope.loc[site_id].prestorm_start_z,\n",
|
||||
" best_slope_pt.xy[1][0]],\n",
|
||||
" color='red',\n",
|
||||
" linestyle='--',\n",
|
||||
" label='Best slope ({:.3f})'.format(-best_slope))\n",
|
||||
" \n",
|
||||
" # TEMP Target slopes\n",
|
||||
" target_slopes = {\n",
|
||||
" 'NARRA0004': 0.076,\n",
|
||||
" 'NARRA0008': 0.093,\n",
|
||||
" 'NARRA0012': 0.060,\n",
|
||||
" 'NARRA0016': 0.11,\n",
|
||||
" 'NARRA0021': 0.063,\n",
|
||||
" 'NARRA0023': 0.061,\n",
|
||||
" 'NARRA0027': 0.060,\n",
|
||||
" 'NARRA0031': 0.057,\n",
|
||||
" }\n",
|
||||
"\n",
|
||||
" target_direction = {\n",
|
||||
" 'NARRA0004': \"flatter\",\n",
|
||||
" 'NARRA0008': \"steeper\",\n",
|
||||
" 'NARRA0012': \"flatter\",\n",
|
||||
" 'NARRA0016': \"flatter\",\n",
|
||||
" 'NARRA0021': \"steeper\",\n",
|
||||
" 'NARRA0023': \"steeper\",\n",
|
||||
" 'NARRA0027': \"steeper\",\n",
|
||||
" 'NARRA0031': \"steeper\",\n",
|
||||
" }\n",
|
||||
" ax1.plot([dune_face_toe.xy[0][0], dune_face_toe.xy[0][0] + 1000], [\n",
|
||||
" dune_face_toe.xy[1][0],\n",
|
||||
" dune_face_toe.xy[1][0] - 1000 * target_slopes[site_id]\n",
|
||||
" ],\n",
|
||||
" color='red',\n",
|
||||
" label='Target slope\\n({} than {:.3f})'.format(\n",
|
||||
" target_direction[site_id], target_slopes[site_id]))\n",
|
||||
"\n",
|
||||
" ax1.set_xlim([100, 800])\n",
|
||||
" ax1.set_ylim([-15, 12])\n",
|
||||
"# ax1.set_xlim([100, 600])\n",
|
||||
"# ax1.set_ylim([-10, 12])\n",
|
||||
"\n",
|
||||
" # ax1.set_xlim([df_dunes.loc[site_id].dune_crest_x - 50,\n",
|
||||
" # intersection.xy[0][0] + 50])\n",
|
||||
" # ax1.set_ylim([intersection.xy[1][0] -3,\n",
|
||||
" # df_dunes.loc[site_id].dune_crest_z + 3])\n",
|
||||
"\n",
|
||||
" ax1.set_title(site_id)\n",
|
||||
" ax1.legend(loc='upper right', prop={'size': 10})\n",
|
||||
" f.savefig('08-{}.png'.format(site_id), dpi=600)\n",
|
||||
" plt.show()\n",
|
||||
" plt.close()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"dune_shelf_int"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"hide_input": false,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.6"
|
||||
},
|
||||
"toc": {
|
||||
"base_numbering": 1,
|
||||
"nav_menu": {},
|
||||
"number_sections": true,
|
||||
"sideBar": true,
|
||||
"skip_h1_title": false,
|
||||
"title_cell": "Table of Contents",
|
||||
"title_sidebar": "Contents",
|
||||
"toc_cell": false,
|
||||
"toc_position": {},
|
||||
"toc_section_display": true,
|
||||
"toc_window_display": false
|
||||
},
|
||||
"varInspector": {
|
||||
"cols": {
|
||||
"lenName": 16,
|
||||
"lenType": 16,
|
||||
"lenVar": 40
|
||||
},
|
||||
"kernels_config": {
|
||||
"python": {
|
||||
"delete_cmd_postfix": "",
|
||||
"delete_cmd_prefix": "del ",
|
||||
"library": "var_list.py",
|
||||
"varRefreshCmd": "print(var_dic_list())"
|
||||
},
|
||||
"r": {
|
||||
"delete_cmd_postfix": ") ",
|
||||
"delete_cmd_prefix": "rm(",
|
||||
"library": "var_list.r",
|
||||
"varRefreshCmd": "cat(var_dic_list()) "
|
||||
}
|
||||
},
|
||||
"types_to_exclude": [
|
||||
"module",
|
||||
"function",
|
||||
"builtin_function_or_method",
|
||||
"instance",
|
||||
"_Feature"
|
||||
],
|
||||
"window_display": false
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
@ -1,337 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Run comparison\n",
|
||||
"Create a comparison between different runs by looking at the different R_high values and storm regimes."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Setup notebook"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Enable autoreloading of our modules. \n",
|
||||
"# Most of the code will be located in the /src/ folder, \n",
|
||||
"# and then called from the notebook.\n",
|
||||
"%matplotlib inline\n",
|
||||
"%reload_ext autoreload\n",
|
||||
"%autoreload"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from IPython.core.debugger import set_trace\n",
|
||||
"\n",
|
||||
"import pandas as pd\n",
|
||||
"import numpy as np\n",
|
||||
"import os\n",
"import itertools\n",
|
||||
"import decimal\n",
|
||||
"import plotly\n",
|
||||
"import plotly.graph_objs as go\n",
|
||||
"import plotly.plotly as py\n",
|
||||
"import plotly.tools as tls\n",
|
||||
"import plotly.figure_factory as ff\n",
|
||||
"from plotly import tools\n",
|
||||
"import plotly.io as pio\n",
|
||||
"from scipy import stats\n",
|
||||
"import math\n",
|
||||
"import matplotlib\n",
|
||||
"from matplotlib import cm\n",
|
||||
"import colorlover as cl\n",
|
||||
"from tqdm import tqdm_notebook\n",
|
||||
"from ipywidgets import widgets, Output\n",
|
||||
"from IPython.display import display, clear_output, Image, HTML\n",
|
||||
"from scipy import stats\n",
|
||||
"from sklearn.metrics import confusion_matrix\n",
|
||||
"import matplotlib.pyplot as plt\n",
|
||||
"from scipy.interpolate import interp1d\n",
|
||||
"from pandas.api.types import CategoricalDtype\n",
|
||||
"from scipy.interpolate import UnivariateSpline\n",
|
||||
"from shapely.geometry import Point, LineString"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Matplotlib default settings\n",
|
||||
"plt.rcParams[\"figure.figsize\"] = (10,6)\n",
|
||||
"plt.rcParams['axes.grid']=True\n",
|
||||
"plt.rcParams['grid.alpha'] = 0.5\n",
|
||||
"plt.rcParams['grid.color'] = \"grey\"\n",
|
||||
"plt.rcParams['grid.linestyle'] = \"--\"\n",
|
||||
"plt.rcParams['axes.grid']=True\n",
|
||||
"\n",
|
||||
"# https://stackoverflow.com/a/20709149\n",
|
||||
"matplotlib.rcParams['text.usetex'] = True\n",
|
||||
"\n",
|
||||
"matplotlib.rcParams['text.latex.preamble'] = [\n",
|
||||
" r'\\usepackage{siunitx}', # i need upright \\micro symbols, but you need...\n",
|
||||
" r'\\sisetup{detect-all}', # ...this to force siunitx to actually use your fonts\n",
|
||||
" r'\\usepackage{helvet}', # set the normal font here\n",
|
||||
" r'\\usepackage{amsmath}',\n",
|
||||
" r'\\usepackage{sansmath}', # load up the sansmath so that math -> helvet\n",
|
||||
" r'\\sansmath', # <- tricky! -- gotta actually tell tex to use!\n",
|
||||
"] "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Import data"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def df_from_csv(csv, index_col, data_folder='../data/interim'):\n",
|
||||
" print('Importing {}'.format(csv))\n",
|
||||
" return pd.read_csv(os.path.join(data_folder,csv), index_col=index_col)\n",
|
||||
"\n",
|
||||
"df_waves = df_from_csv('waves.csv', index_col=[0, 1])\n",
|
||||
"df_tides = df_from_csv('tides.csv', index_col=[0, 1])\n",
|
||||
"df_profiles = df_from_csv('profiles.csv', index_col=[0, 1, 2])\n",
|
||||
"df_sites = df_from_csv('sites.csv', index_col=[0])\n",
|
||||
"df_profile_features_crest_toes = df_from_csv('profile_features_crest_toes.csv', index_col=[0])\n",
|
||||
"\n",
|
||||
"# Note that the forecasted data sets should be in the same order for impacts and twls\n",
|
||||
"impacts = {\n",
|
||||
" 'forecasted': {\n",
|
||||
" 'postintertidal_slope_hol86': df_from_csv('impacts_forecasted_postintertidal_slope_hol86.csv', index_col=[0]),\n",
|
||||
" 'postintertidal_slope_nie91': df_from_csv('impacts_forecasted_postintertidal_slope_nie91.csv', index_col=[0]),\n",
|
||||
" 'postintertidal_slope_pow18': df_from_csv('impacts_forecasted_postintertidal_slope_pow18.csv', index_col=[0]),\n",
|
||||
" 'postintertidal_slope_sto06': df_from_csv('impacts_forecasted_postintertidal_slope_sto06.csv', index_col=[0]),\n",
|
||||
" 'postmean_slope_hol86': df_from_csv('impacts_forecasted_postmean_slope_hol86.csv', index_col=[0]),\n",
|
||||
" 'postmean_slope_nie91': df_from_csv('impacts_forecasted_postmean_slope_nie91.csv', index_col=[0]),\n",
|
||||
" 'postmean_slope_pow18': df_from_csv('impacts_forecasted_postmean_slope_pow18.csv', index_col=[0]),\n",
|
||||
" 'postmean_slope_sto06': df_from_csv('impacts_forecasted_postmean_slope_sto06.csv', index_col=[0]),\n",
|
||||
" 'preintertidal_slope_hol86': df_from_csv('impacts_forecasted_preintertidal_slope_hol86.csv', index_col=[0]),\n",
|
||||
" 'preintertidal_slope_nie91': df_from_csv('impacts_forecasted_preintertidal_slope_nie91.csv', index_col=[0]),\n",
|
||||
" 'preintertidal_slope_pow18': df_from_csv('impacts_forecasted_preintertidal_slope_pow18.csv', index_col=[0]),\n",
|
||||
" 'preintertidal_slope_sto06': df_from_csv('impacts_forecasted_preintertidal_slope_sto06.csv', index_col=[0]),\n",
|
||||
" 'premean_slope_hol86': df_from_csv('impacts_forecasted_premean_slope_hol86.csv', index_col=[0]),\n",
|
||||
" 'premean_slope_nie91': df_from_csv('impacts_forecasted_premean_slope_nie91.csv', index_col=[0]),\n",
|
||||
" 'premean_slope_pow18': df_from_csv('impacts_forecasted_premean_slope_pow18.csv', index_col=[0]),\n",
|
||||
" 'premean_slope_sto06': df_from_csv('impacts_forecasted_premean_slope_sto06.csv', index_col=[0]),\n",
|
||||
" },\n",
|
||||
" 'observed': df_from_csv('impacts_observed.csv', index_col=[0])\n",
|
||||
" }\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"twls = {\n",
|
||||
" 'forecasted': {\n",
|
||||
" 'postintertidal_slope_hol86.csv': df_from_csv('twl_postintertidal_slope_hol86.csv', index_col=[0,1]),\n",
|
||||
" 'postintertidal_slope_nie91.csv': df_from_csv('twl_postintertidal_slope_nie91.csv', index_col=[0,1]),\n",
|
||||
" 'postintertidal_slope_pow18.csv': df_from_csv('twl_postintertidal_slope_pow18.csv', index_col=[0,1]),\n",
|
||||
" 'postintertidal_slope_sto06.csv': df_from_csv('twl_postintertidal_slope_sto06.csv', index_col=[0,1]),\n",
|
||||
" 'postmean_slope_hol86.csv': df_from_csv('twl_postmean_slope_hol86.csv', index_col=[0,1]),\n",
|
||||
" 'postmean_slope_nie91.csv': df_from_csv('twl_postmean_slope_nie91.csv', index_col=[0,1]),\n",
|
||||
" 'postmean_slope_pow18.csv': df_from_csv('twl_postmean_slope_pow18.csv', index_col=[0,1]),\n",
|
||||
" 'postmean_slope_sto06.csv': df_from_csv('twl_postmean_slope_sto06.csv', index_col=[0,1]),\n",
|
||||
" 'preintertidal_slope_hol86.csv': df_from_csv('twl_preintertidal_slope_hol86.csv', index_col=[0,1]),\n",
|
||||
" 'preintertidal_slope_nie91.csv': df_from_csv('twl_preintertidal_slope_nie91.csv', index_col=[0,1]),\n",
|
||||
" 'preintertidal_slope_pow18.csv': df_from_csv('twl_preintertidal_slope_pow18.csv', index_col=[0,1]),\n",
|
||||
" 'preintertidal_slope_sto06.csv': df_from_csv('twl_preintertidal_slope_sto06.csv', index_col=[0,1]),\n",
|
||||
" 'premean_slope_hol86.csv': df_from_csv('twl_premean_slope_hol86.csv', index_col=[0,1]),\n",
|
||||
" 'premean_slope_nie91.csv': df_from_csv('twl_premean_slope_nie91.csv', index_col=[0,1]),\n",
|
||||
" 'premean_slope_pow18.csv': df_from_csv('twl_premean_slope_pow18.csv', index_col=[0,1]),\n",
|
||||
" 'premean_slope_sto06.csv': df_from_csv('twl_premean_slope_sto06.csv', index_col=[0,1]),\n",
|
||||
" }\n",
|
||||
"}\n",
|
||||
"print('Done!')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Get prediction accuracy\n",
|
||||
"Use [scikit-learn](https://scikit-learn.org/stable/modules/model_evaluation.html#classification-metrics) model evaluation metrics"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import pprint\n",
|
||||
"pp = pprint.PrettyPrinter(indent=2)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import sklearn.metrics\n",
|
||||
"\n",
|
||||
"# Encode the storm regime values as categorical integers so we can compare them\n",
|
||||
"cat_type = CategoricalDtype(\n",
|
||||
" categories=[\"swash\", \"collision\", \"overwash\", \"inundation\"], ordered=True)\n",
|
||||
"correct_regime = impacts['observed'].storm_regime.astype(\n",
|
||||
" cat_type).cat.codes.values\n",
|
||||
"\n",
|
||||
"# Define our forecast model names\n",
|
||||
"models = list(impacts['forecasted'])\n",
|
||||
"\n",
|
||||
"# Define the metrics we want to calculate for each forecast model\n",
|
||||
"metrics = [\n",
|
||||
" 'accuracy_score', 'balanced_accuracy_score', 'confusion_matrix',\n",
|
||||
" 'classification_report', 'f1_score', 'fbeta_score', 'precision_score', 'recall_score'\n",
|
||||
"]\n",
|
||||
"\n",
|
||||
"# Store results in a nested dictionary by metric\n",
|
||||
"performance = {metric: {} for metric in metrics}\n",
|
||||
"\n",
|
||||
"for model, metric in itertools.product(models, metrics):\n",
|
||||
"\n",
|
||||
"    # Get predicted storm regimes\n",
|
||||
" df_pred = impacts['forecasted'][model]\n",
|
||||
" predicted_regime = df_pred.storm_regime.astype(cat_type).cat.codes.values\n",
|
||||
"\n",
|
||||
" if metric == 'accuracy_score':\n",
|
||||
" m = sklearn.metrics.accuracy_score(correct_regime, predicted_regime)\n",
|
||||
"\n",
|
||||
" if metric == 'balanced_accuracy_score':\n",
|
||||
" m = sklearn.metrics.balanced_accuracy_score(correct_regime,\n",
|
||||
" predicted_regime)\n",
|
||||
"\n",
|
||||
" if metric == 'confusion_matrix':\n",
|
||||
" m = sklearn.metrics.confusion_matrix(\n",
|
||||
" correct_regime, predicted_regime, labels=[0, 1, 2, 3])\n",
|
||||
" \n",
|
||||
" if metric == 'f1_score':\n",
|
||||
" m = sklearn.metrics.f1_score(correct_regime, predicted_regime, average='weighted')\n",
|
||||
" \n",
|
||||
" if metric == 'fbeta_score':\n",
|
||||
" m = sklearn.metrics.fbeta_score(correct_regime, predicted_regime, average='weighted', beta=1)\n",
|
||||
" \n",
|
||||
" if metric == 'precision_score':\n",
|
||||
" m = sklearn.metrics.precision_score(correct_regime, predicted_regime, average='weighted')\n",
|
||||
" \n",
|
||||
" if metric == 'recall_score':\n",
|
||||
" m = sklearn.metrics.recall_score(correct_regime, predicted_regime, average='weighted')\n",
|
||||
|
||||
" \n",
|
||||
" if metric == 'classification_report':\n",
|
||||
"# m = sklearn.metrics.classification_report(\n",
|
||||
"# correct_regime,\n",
|
||||
"# predicted_regime,\n",
|
||||
"# labels=[0, 1, 2, 3],\n",
|
||||
"# target_names=['swash', 'collision', 'overwash', 'inundation'])\n",
|
||||
"# print(m)\n",
|
||||
" continue\n",
|
||||
"\n",
|
||||
" # Store metric in results dictionary\n",
|
||||
" performance[metric][model] = m\n",
|
||||
"\n",
|
||||
"pp.pprint(performance)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"predicted_regime"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Scatter plot matrix\n",
|
||||
" - Use [Altair](https://altair-viz.github.io/getting_started/installation.html) for interactivity?\n",
|
||||
" - Or maybe [Holoviews](https://towardsdatascience.com/pyviz-simplifying-the-data-visualisation-process-in-python-1b6d2cb728f1)?"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"hide_input": false,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.6"
|
||||
},
|
||||
"toc": {
|
||||
"base_numbering": 1,
|
||||
"nav_menu": {},
|
||||
"number_sections": true,
|
||||
"sideBar": true,
|
||||
"skip_h1_title": false,
|
||||
"title_cell": "Table of Contents",
|
||||
"title_sidebar": "Contents",
|
||||
"toc_cell": false,
|
||||
"toc_position": {},
|
||||
"toc_section_display": true,
|
||||
"toc_window_display": false
|
||||
},
|
||||
"varInspector": {
|
||||
"cols": {
|
||||
"lenName": 16,
|
||||
"lenType": 16,
|
||||
"lenVar": 40
|
||||
},
|
||||
"kernels_config": {
|
||||
"python": {
|
||||
"delete_cmd_postfix": "",
|
||||
"delete_cmd_prefix": "del ",
|
||||
"library": "var_list.py",
|
||||
"varRefreshCmd": "print(var_dic_list())"
|
||||
},
|
||||
"r": {
|
||||
"delete_cmd_postfix": ") ",
|
||||
"delete_cmd_prefix": "rm(",
|
||||
"library": "var_list.r",
|
||||
"varRefreshCmd": "cat(var_dic_list()) "
|
||||
}
|
||||
},
|
||||
"types_to_exclude": [
|
||||
"module",
|
||||
"function",
|
||||
"builtin_function_or_method",
|
||||
"instance",
|
||||
"_Feature"
|
||||
],
|
||||
"window_display": false
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
@ -1,186 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import pandas as pd\n",
|
||||
"import matplotlib.pyplot as plt\n",
|
||||
"import os\n",
|
||||
"\n",
|
||||
"from dtaidistance import dtw\n",
|
||||
"from dtaidistance import dtw_visualisation as dtwvis"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Import data\n",
|
||||
"def df_from_csv(csv, index_col, data_folder='../data/interim'):\n",
|
||||
" print('Importing {}'.format(csv))\n",
|
||||
" return pd.read_csv(os.path.join(data_folder,csv), index_col=index_col)\n",
|
||||
"\n",
|
||||
"df_profiles = df_from_csv('profiles.csv', index_col=[0, 1, 2])\n",
|
||||
"\n",
|
||||
"print('Done!')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Use dtaidistance package"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"p1 = df_profiles.dropna(subset=['z']).xs(['AVOCAn0003','prestorm'],level=['site_id','profile_type']).z.values\n",
|
||||
"p2 = df_profiles.dropna(subset=['z']).xs(['AVOCAn0004','prestorm'],level=['site_id','profile_type']).z.values\n",
|
||||
"path = dtw.warping_path(p1,p2)\n",
|
||||
"dtwvis.plot_warping(p1,p2,path)\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Use kshape package"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"profiles = df_profiles.dropna(subset=['z'])\\\n",
|
||||
" .xs(['prestorm'],level=['profile_type'])\\\n",
|
||||
" .groupby('site_id').z\\\n",
|
||||
" .apply(list).tolist()\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"# profiles = [x[-50:] for x in profiles]\n",
|
||||
"# print(min(len(x) for x in profiles))"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from kshape.core import kshape, zscore\n",
|
||||
"\n",
|
||||
"time_series = [[1,2,3,4], [0,1,2,3], [0,1,2,3], [1,2,2,3]]\n",
|
||||
"cluster_num = 4\n",
|
||||
"# NOTE: kshape and zscore expect equal-length series; profiles of differing\n",
"# lengths need to be truncated or resampled first (see commented lines above)\n",
"clusters = kshape(zscore(profiles, axis=1), cluster_num)\n",
|
||||
"# print(clusters)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"cluster_no = 0\n",
|
||||
"\n",
|
||||
"# Plot shape of all clusters\n",
|
||||
"plt.figure(0)\n",
|
||||
"for n,cluster in enumerate(clusters):\n",
|
||||
" plt.plot(cluster[0],label=n)\n",
|
||||
"plt.legend()\n",
|
||||
"\n",
|
||||
"plt.figure(1)\n",
|
||||
"# Plot all profiles in a particular cluster\n",
|
||||
"for profile_no in clusters[cluster_no][1]:\n",
|
||||
" plt.plot(profiles[profile_no])\n",
|
||||
"\n",
|
||||
"\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"a = [1,2,3,4,5,6]\n",
|
||||
"a[-1:]"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"hide_input": false,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.6"
|
||||
},
|
||||
"toc": {
|
||||
"base_numbering": 1,
|
||||
"nav_menu": {},
|
||||
"number_sections": true,
|
||||
"sideBar": true,
|
||||
"skip_h1_title": false,
|
||||
"title_cell": "Table of Contents",
|
||||
"title_sidebar": "Contents",
|
||||
"toc_cell": false,
|
||||
"toc_position": {},
|
||||
"toc_section_display": true,
|
||||
"toc_window_display": false
|
||||
},
|
||||
"varInspector": {
|
||||
"cols": {
|
||||
"lenName": 16,
|
||||
"lenType": 16,
|
||||
"lenVar": 40
|
||||
},
|
||||
"kernels_config": {
|
||||
"python": {
|
||||
"delete_cmd_postfix": "",
|
||||
"delete_cmd_prefix": "del ",
|
||||
"library": "var_list.py",
|
||||
"varRefreshCmd": "print(var_dic_list())"
|
||||
},
|
||||
"r": {
|
||||
"delete_cmd_postfix": ") ",
|
||||
"delete_cmd_prefix": "rm(",
|
||||
"library": "var_list.r",
|
||||
"varRefreshCmd": "cat(var_dic_list()) "
|
||||
}
|
||||
},
|
||||
"types_to_exclude": [
|
||||
"module",
|
||||
"function",
|
||||
"builtin_function_or_method",
|
||||
"instance",
|
||||
"_Feature"
|
||||
],
|
||||
"window_display": false
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
@ -1,919 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Investigate "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Setup notebook\n",
|
||||
"Import our required packages and set default plotting options."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Enable autoreloading of our modules. \n",
|
||||
"# Most of the code will be located in the /src/ folder, \n",
|
||||
"# and then called from the notebook.\n",
|
||||
"%matplotlib inline\n",
|
||||
"%reload_ext autoreload\n",
|
||||
"%autoreload"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from IPython.core.debugger import set_trace\n",
|
||||
"\n",
|
||||
"import pandas as pd\n",
|
||||
"import numpy as np\n",
|
||||
"import os\n",
|
||||
"import decimal\n",
|
||||
"import plotly\n",
|
||||
"import plotly.graph_objs as go\n",
|
||||
"import plotly.plotly as py\n",
|
||||
"import plotly.tools as tls\n",
|
||||
"import plotly.figure_factory as ff\n",
|
||||
"from plotly import tools\n",
|
||||
"import plotly.io as pio\n",
|
||||
"from scipy import stats\n",
|
||||
"import math\n",
|
||||
"import matplotlib\n",
|
||||
"from matplotlib import cm\n",
|
||||
"import colorlover as cl\n",
|
||||
"from tqdm import tqdm_notebook\n",
|
||||
"from ipywidgets import widgets, Output\n",
|
||||
"from IPython.display import display, clear_output, Image, HTML\n",
|
||||
"from scipy import stats\n",
|
||||
"from sklearn.metrics import confusion_matrix\n",
|
||||
"import matplotlib.pyplot as plt\n",
|
||||
"from matplotlib.ticker import MultipleLocator\n",
|
||||
"from matplotlib.lines import Line2D\n",
|
||||
"from cycler import cycler\n",
|
||||
"from scipy.interpolate import interp1d\n",
|
||||
"from pandas.api.types import CategoricalDtype\n",
|
||||
"import seaborn as sns\n",
|
||||
"sns.set(style=\"white\")\n",
|
||||
"from scipy import interpolate\n",
|
||||
"from tqdm import tqdm"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Matplotlib default settings\n",
|
||||
"plt.rcParams[\"figure.figsize\"] = (10,6)\n",
|
||||
"plt.rcParams['axes.grid']=True\n",
|
||||
"plt.rcParams['grid.alpha'] = 0.5\n",
|
||||
"plt.rcParams['grid.color'] = \"grey\"\n",
|
||||
"plt.rcParams['grid.linestyle'] = \"--\"\n",
|
||||
"\n",
|
||||
"# https://stackoverflow.com/a/20709149\n",
|
||||
"# matplotlib.rcParams['text.usetex'] = True\n",
|
||||
"\n",
|
||||
"matplotlib.rcParams['text.latex.preamble'] = [\n",
|
||||
" r'\\usepackage{siunitx}', # i need upright \\micro symbols, but you need...\n",
|
||||
" r'\\sisetup{detect-all}', # ...this to force siunitx to actually use your fonts\n",
|
||||
" r'\\usepackage{helvet}', # set the normal font here\n",
|
||||
" r'\\usepackage{amsmath}',\n",
|
||||
" r'\\usepackage{sansmath}', # load up the sansmath so that math -> helvet\n",
|
||||
" r'\\sansmath', # <- tricky! -- gotta actually tell tex to use!\n",
|
||||
"] "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Import data\n",
|
||||
"Import our data from the `./data/interim/` folder and load it into pandas dataframes. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def df_from_csv(csv, index_col, data_folder='../data/interim'):\n",
|
||||
" print('Importing {}'.format(csv))\n",
|
||||
" return pd.read_csv(os.path.join(data_folder,csv), index_col=index_col)\n",
|
||||
"\n",
|
||||
"df_waves = df_from_csv('waves.csv', index_col=[0, 1])\n",
|
||||
"df_tides = df_from_csv('tides.csv', index_col=[0, 1])\n",
|
||||
"df_profiles = df_from_csv('profiles.csv', index_col=[0, 1, 2])\n",
|
||||
"df_sites = df_from_csv('sites.csv', index_col=[0])\n",
|
||||
"df_sites_waves = df_from_csv('sites_waves.csv', index_col=[0])\n",
|
||||
"df_profile_features_crest_toes = df_from_csv('profile_features_crest_toes.csv', index_col=[0,1])\n",
|
||||
"\n",
|
||||
"# Note that the forecasted data sets should be in the same order for impacts and twls\n",
|
||||
"impacts = {\n",
|
||||
" 'forecasted': {\n",
|
||||
" 'postintertidal_slope_sto06': df_from_csv('impacts_forecasted_postintertidal_slope_sto06.csv', index_col=[0]),\n",
|
||||
" 'postmean_slope_sto06': df_from_csv('impacts_forecasted_postmean_slope_sto06.csv', index_col=[0]),\n",
|
||||
" 'preintertidal_slope_sto06': df_from_csv('impacts_forecasted_preintertidal_slope_sto06.csv', index_col=[0]),\n",
|
||||
" 'premean_slope_sto06': df_from_csv('impacts_forecasted_premean_slope_sto06.csv', index_col=[0]),\n",
|
||||
" },\n",
|
||||
" 'observed': df_from_csv('impacts_observed.csv', index_col=[0])\n",
|
||||
" }\n",
|
||||
"\n",
|
||||
"twls = {\n",
|
||||
" 'forecasted': {\n",
|
||||
" 'postintertidal_slope_sto06': df_from_csv('twl_postintertidal_slope_sto06.csv', index_col=[0,1]),\n",
|
||||
" 'postmean_slope_sto06': df_from_csv('twl_postmean_slope_sto06.csv', index_col=[0,1]),\n",
|
||||
" 'preintertidal_slope_sto06': df_from_csv('twl_preintertidal_slope_sto06.csv', index_col=[0,1]),\n",
|
||||
" 'premean_slope_sto06': df_from_csv('twl_premean_slope_sto06.csv', index_col=[0,1]),\n",
|
||||
" }\n",
|
||||
"}\n",
|
||||
"print('Done!')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Gather data into one dataframe\n",
|
||||
"For plotting, gather all our data into one dataframe."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Which forecasted impacts dataframe should we use to assess prediction performance?\n",
|
||||
"df_selected_forecast = impacts['forecasted']['postintertidal_slope_sto06']\n",
|
||||
"\n",
|
||||
"# Create df with all our data\n",
|
||||
"df = impacts['observed'].merge(\n",
|
||||
" df_sites_waves, left_index=True, right_index=True)\n",
|
||||
"\n",
|
||||
"# Join observed/forecasted regimes\n",
|
||||
"df_forecasted = df_selected_forecast.rename(\n",
|
||||
" {'storm_regime': 'forecasted_regime'\n",
|
||||
" }, axis='columns').forecasted_regime\n",
|
||||
"df = pd.concat([df, df_forecasted], axis=1)\n",
|
||||
"\n",
|
||||
"# Create new accuracy column which categorises each prediction\n",
|
||||
"df.loc[(df.storm_regime == 'swash') & (df.forecasted_regime == 'swash'), 'accuracy'] = 'correct swash'\n",
|
||||
"df.loc[(df.storm_regime == 'collision') & (df.forecasted_regime == 'collision'), 'accuracy'] = 'correct collision'\n",
|
||||
"df.loc[(df.storm_regime == 'swash') & (df.forecasted_regime == 'collision'), 'accuracy'] = 'overpredicted swash'\n",
|
||||
"df.loc[(df.storm_regime == 'collision') & (df.forecasted_regime == 'swash'), 'accuracy'] = 'underpredicted collision'\n",
|
||||
"\n",
|
||||
"print('df columns:\\n===')\n",
|
||||
"for col in sorted(df.columns):\n",
|
||||
" print(col)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Create plots"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Variable pairplot, by observed storm impact\n",
|
||||
"Create a pairplot of selected variables and look for relationships between each pair. Colors represent the different observed storm impact regimes."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"g = sns.pairplot(\n",
|
||||
" data=df,\n",
|
||||
" hue='storm_regime',\n",
|
||||
" dropna=True,\n",
|
||||
" palette={\n",
|
||||
" 'swash': 'blue',\n",
|
||||
" 'collision': 'orange',\n",
|
||||
" 'overwash': 'red'\n",
|
||||
" },\n",
|
||||
" plot_kws=dict(s=20, edgecolor=\"white\", linewidth=0.1, alpha=0.1),\n",
|
||||
" vars=['beta_prestorm_mean',\n",
|
||||
" 'beta_poststorm_mean',\n",
|
||||
" 'beta_diff_mean',\n",
|
||||
" 'swash_pct_change',\n",
|
||||
" 'width_msl_change_m',\n",
|
||||
" 'width_msl_change_pct',\n",
|
||||
" 'Exscum'])\n",
|
||||
"g.savefig('11_pairplot_observed_impacts.png')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Variable pairplot, by observed/prediction class\n",
|
||||
"Create a pairplot of selected variables and look for relationships between each pair. Colors represent the different observed/prediction classes."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"g = sns.pairplot(\n",
|
||||
" data=df,\n",
|
||||
" hue='accuracy',\n",
|
||||
" dropna=True,\n",
|
||||
" palette={\n",
|
||||
" 'correct swash': 'blue',\n",
|
||||
" 'correct collision': 'green',\n",
|
||||
" 'overpredicted swash': 'orange',\n",
|
||||
" 'underpredicted collision': 'red',\n",
|
||||
" },\n",
|
||||
" plot_kws=dict(s=20, edgecolor=\"white\", linewidth=0.1, alpha=0.1),\n",
|
||||
" vars=['beta_prestorm_mean',\n",
|
||||
" 'beta_poststorm_mean',\n",
|
||||
" 'beta_diff_mean',\n",
|
||||
" 'swash_pct_change',\n",
|
||||
" 'width_msl_change_m',\n",
|
||||
" 'width_msl_change_pct',\n",
|
||||
" 'Exscum'])\n",
|
||||
"g.savefig('11_pairplot_accuracy_classes.png')\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Pre/post storm slope by observed/predicted class"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# First create a melted dataframe since our columns aren't quite in the right shape for plotting\n",
|
||||
"df_temp = df.copy()\n",
|
||||
"df_temp = df_temp.reset_index()\n",
|
||||
"\n",
|
||||
"df_melt = pd.melt(\n",
|
||||
" df_temp,\n",
|
||||
" id_vars=['site_id', 'accuracy'],\n",
|
||||
" value_vars=['beta_prestorm_mean', 'beta_poststorm_mean'],\n",
|
||||
" var_name='profile_type',\n",
|
||||
" value_name='beta_mean')\n",
|
||||
"\n",
|
||||
"df_melt.loc[df_melt.profile_type == 'beta_prestorm_mean','profile_type'] = 'prestorm'\n",
|
||||
"df_melt.loc[df_melt.profile_type == 'beta_poststorm_mean','profile_type'] = 'poststorm'\n",
|
||||
"df_melt.head()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"f, ax = plt.subplots(figsize=(6,5))\n",
|
||||
"\n",
|
||||
"cats = ['correct swash', 'overpredicted swash','underpredicted collision','correct collision']\n",
|
||||
"\n",
|
||||
"# Plot pre/post storm mean slopes as grouped boxplots\n",
|
||||
"sns.boxplot(\n",
|
||||
" data=df_melt,\n",
|
||||
" x=\"accuracy\",\n",
|
||||
" y=\"beta_mean\",\n",
|
||||
" hue=\"profile_type\",\n",
|
||||
" order=cats\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"group_labels = [x.replace(' ','\\n') for x in cats]\n",
|
||||
"ax.set_xticklabels(group_labels)\n",
|
||||
"\n",
|
||||
"# Setup ticks and grid\n",
|
||||
"ax.xaxis.grid(True)\n",
|
||||
"major_ticks = np.arange(-1, 1, 0.05)\n",
|
||||
"minor_ticks = np.arange(-1, 1, 0.01)\n",
|
||||
"ax.set_yticks(major_ticks)\n",
|
||||
"ax.set_yticks(minor_ticks, minor=True)\n",
|
||||
"ax.grid(which='both')\n",
|
||||
"ax.grid(which='minor', alpha=0.3,linestyle='--')\n",
|
||||
"ax.grid(which='major', alpha=0.8,linestyle='-')\n",
|
||||
"\n",
|
||||
"ax.set_ylim([-0.02,0.3])\n",
|
||||
"\n",
|
||||
"f.savefig('11_prepost_slopes_accuracy_classes.png',dpi=600)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Change in slope by observed/predicted class"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"f, ax = plt.subplots(figsize=(6,5))\n",
|
||||
"\n",
|
||||
"cats = ['correct swash', 'overpredicted swash','underpredicted collision','correct collision']\n",
|
||||
"\n",
|
||||
"# Plot the change in mean slope as boxplots\n",
|
||||
"sns.boxplot(\n",
|
||||
" data=df,\n",
|
||||
" x=\"accuracy\",\n",
|
||||
" y=\"beta_diff_mean\",\n",
|
||||
" order=cats\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"group_labels = [x.replace(' ','\\n') for x in cats]\n",
|
||||
"ax.set_xticklabels(group_labels)\n",
|
||||
"\n",
|
||||
"# Setup ticks and grid\n",
|
||||
"ax.xaxis.grid(True)\n",
|
||||
"major_ticks = np.arange(-1, 1, 0.05)\n",
|
||||
"minor_ticks = np.arange(-1, 1, 0.01)\n",
|
||||
"ax.set_yticks(major_ticks)\n",
|
||||
"ax.set_yticks(minor_ticks, minor=True)\n",
|
||||
"ax.grid(which='both')\n",
|
||||
"ax.grid(which='minor', alpha=0.3,linestyle='--')\n",
|
||||
"ax.grid(which='major', alpha=0.8,linestyle='-')\n",
|
||||
"\n",
|
||||
"ax.set_ylim([-0.2,0.2])\n",
|
||||
"\n",
|
||||
"f.savefig('11_change_in_slopes_accuracy_classes.png',dpi=600)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Swash regime beach width change histogram"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"How much variation in beach width change can we expect in the swash regime?"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"f, ax = plt.subplots(figsize=(5,4))\n",
|
||||
"\n",
|
||||
"sns.distplot(df.loc[df.storm_regime=='swash'].width_msl_change_pct.dropna(), \n",
|
||||
" kde=False);\n",
|
||||
"\n",
|
||||
"ax.set_title('Distribution of beach width change for swash regime')\n",
|
||||
"ax.set_xlabel('$\\Delta$ beach width (%)')\n",
|
||||
"ax.set_ylabel('Count')\n",
|
||||
"\n",
|
||||
"f.savefig('11_change_in_beach_width.png',dpi=600)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Check prestorm and poststorm width"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"ax.get_xaxis().get_major_ticks()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"x_col = \"width_msl_prestorm\"\n",
|
||||
"y_col = \"width_msl_poststorm\"\n",
|
||||
"\n",
|
||||
"with sns.axes_style(\"white\"):\n",
|
||||
" g = sns.jointplot(x=x_col,\n",
|
||||
" y=y_col,\n",
|
||||
" data=df.dropna(subset=[x_col, y_col]),\n",
|
||||
" kind=\"hex\",\n",
|
||||
" ylim=(0, 150),\n",
|
||||
" xlim=(0, 150))\n",
|
||||
"\n",
|
||||
" x0, x1 = g.ax_joint.get_xlim()\n",
|
||||
" y0, y1 = g.ax_joint.get_ylim()\n",
|
||||
" lims = [max(x0, y0), min(x1, y1)]\n",
|
||||
" g.ax_joint.plot(lims, lims, ':k') \n",
|
||||
" "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Find correlations between variables"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Create correlogram"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"code_folding": []
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from matplotlib.patches import Ellipse\n",
|
||||
"def corrplot(data, pvalues, labels):\n",
|
||||
" \"\"\"Creates a correlation plot of the passed data.\n",
|
||||
" The function returns the plot which can then be shown with\n",
|
||||
" plot.show(), saved to a file with plot.savefig(), or manipulated\n",
|
||||
" in any other standard matplotlib way.\n",
|
||||
" data is the correlation matrix, a 2-D numpy array containing\n",
|
||||
" the pairwise correlations between variables;\n",
|
||||
" pvalues is a matrix containing the pvalue for each corresponding\n",
|
||||
" correlation value; if none it is assumed to be the zero matrix\n",
|
||||
" labels is an array containing the variable names\n",
|
||||
" https://github.com/louridas/corrplot/blob/master/corrplot.py\n",
|
||||
" \"\"\"\n",
|
||||
"\n",
|
||||
" column_labels = labels\n",
|
||||
" row_labels = labels\n",
|
||||
" \n",
|
||||
" f = plt.figure(figsize=(8,8))\n",
|
||||
" ax = plt.subplot(1, 1, 1, aspect='equal')\n",
|
||||
"\n",
|
||||
" width, height = data.shape\n",
|
||||
" num_cols, num_rows = width, height\n",
|
||||
"\n",
|
||||
" if pvalues is None:\n",
|
||||
" pvalues = np.zeros([num_rows, num_cols])\n",
|
||||
" \n",
|
||||
" shrink = 0.9\n",
|
||||
"\n",
|
||||
" poscm = cm.get_cmap('Blues')\n",
|
||||
" negcm = cm.get_cmap('Oranges')\n",
|
||||
"\n",
|
||||
" for x in range(width):\n",
|
||||
" for y in range(height):\n",
|
||||
" d = data[x, y]\n",
|
||||
" c = pvalues[x, y]\n",
|
||||
" rotate = -45 if d > 0 else +45\n",
|
||||
" clrmap = poscm if d >= 0 else negcm\n",
|
||||
" d_abs = np.abs(d)\n",
|
||||
" ellipse = Ellipse((x, y),\n",
|
||||
" width=1 * shrink,\n",
|
||||
" height=(shrink - d_abs*shrink),\n",
|
||||
" angle=rotate)\n",
|
||||
" ellipse.set_edgecolor('black')\n",
|
||||
" ellipse.set_facecolor(clrmap(d_abs))\n",
|
||||
" if c > 0.05:\n",
|
||||
" ellipse.set_linestyle('dotted')\n",
|
||||
" ellipse.set_alpha(0.5)\n",
|
||||
" ax.add_artist(ellipse)\n",
|
||||
"\n",
|
||||
" ax.set_xlim(-1, num_cols)\n",
|
||||
" ax.set_ylim(-1, num_rows)\n",
|
||||
" \n",
|
||||
" ax.xaxis.tick_top()\n",
|
||||
" xtickslocs = np.arange(len(row_labels))\n",
|
||||
" ax.set_xticks(xtickslocs)\n",
|
||||
" ax.set_xticklabels(row_labels, rotation=30, fontsize='small', ha='left')\n",
|
||||
"\n",
|
||||
" ax.invert_yaxis()\n",
|
||||
" ytickslocs = np.arange(len(row_labels))\n",
|
||||
" ax.set_yticks(ytickslocs)\n",
|
||||
" ax.set_yticklabels(column_labels, fontsize='small')\n",
|
||||
"\n",
|
||||
" return plt"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Calculate correlation coefficient and p-values\n",
|
||||
"# https://stackoverflow.com/a/24469099\n",
|
||||
"corr = df.corr(method ='pearson') \n",
|
||||
"n=len(corr.columns)\n",
|
||||
"t=corr*np.sqrt((n-2)/(1-corr*corr))\n",
|
||||
"pvals = stats.t.cdf(t, n-2)\n",
|
||||
"\n",
|
||||
"plot = corrplot(corr.values, pvals, corr.columns.tolist())\n",
|
||||
"plot.show()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Create regression plot between two variables"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from scipy import stats\n",
|
||||
"\n",
|
||||
"# x_col = 'beta_prestorm_intertidal'\n",
|
||||
"# y_col = \"beta_diff_intertidal\"\n",
|
||||
"# data = df.loc[df.storm_regime=='swash']\n",
|
||||
"\n",
|
||||
"# y_col = 'total_vol_change'\n",
|
||||
"# x_col = \"Pxscum\"\n",
|
||||
"# data = df\n",
|
||||
"\n",
|
||||
"y_col = 'prestorm_cum_exposed_vol'\n",
|
||||
"x_col = \"Exscum\"\n",
|
||||
"c_col = 'total_vol_change'\n",
|
||||
"data = df\n",
|
||||
"\n",
|
||||
"slope, intercept, r_value, p_value, std_err = stats.linregress(\n",
|
||||
" data.dropna()[x_col].values,\n",
|
||||
" data.dropna()[y_col].values)\n",
|
||||
"\n",
|
||||
"fig = plt.figure(\n",
|
||||
" figsize=(6, 4), dpi=150, facecolor='w', edgecolor='k')\n",
|
||||
"ax = fig.add_subplot(111)\n",
|
||||
"\n",
|
||||
"scatter = ax.scatter(\n",
|
||||
" x=data.dropna()[x_col].values,\n",
|
||||
" y=data.dropna()[y_col].values,\n",
|
||||
" c=data.dropna()[c_col].values,\n",
|
||||
" s=1, \n",
|
||||
" vmin=-150, vmax=0,\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"ax.set_xlabel(x_col)\n",
|
||||
"ax.set_ylabel(y_col)\n",
|
||||
"ax.set_ylim(0,20000)\n",
|
||||
"\n",
|
||||
"cbar = plt.colorbar(scatter)\n",
|
||||
"cbar.set_label(c_col)\n",
|
||||
"\n",
|
||||
"ax.grid(True, linestyle=\"--\", alpha=0.2, color='grey', linewidth=1)\n",
|
||||
"\n",
|
||||
"plt.show()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Calculate berm shape index"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"berm_shape = []\n",
|
||||
"grouped = df_profiles.dropna(subset=['z']).xs('prestorm',level='profile_type').groupby('site_id')\n",
|
||||
"for site_id, df_site in tqdm(grouped):\n",
|
||||
" features = df_profile_features_crest_toes.loc[(site_id,'prestorm')]\n",
|
||||
" \n",
|
||||
" # Get x-coordinate at z=0\n",
|
||||
" x_last = df_site.iloc[-1].name[1]\n",
|
||||
" z_last = 0\n",
|
||||
" \n",
|
||||
" # Get coordinates of dune toe\n",
|
||||
" x_first = features.dune_toe_x\n",
|
||||
" z_first = features.dune_toe_z\n",
|
||||
" \n",
|
||||
" # If there is no dune toe, get dune crest\n",
|
||||
" if np.isnan(x_first):\n",
|
||||
" x_first = features.dune_crest_x\n",
|
||||
" z_first = features.dune_crest_z\n",
|
||||
" \n",
|
||||
" # If no dune crest, use nan\n",
|
||||
" if np.isnan(x_first):\n",
|
||||
" berm_shape.append({'site_id': site_id,\n",
|
||||
" 'prestorm_berm_curvature': np.nan})\n",
|
||||
" continue\n",
|
||||
"\n",
|
||||
" # Fit straight line between start and end points\n",
|
||||
" segment = (df_site.loc[(df_site.index.get_level_values('x')>=x_first)&\n",
|
||||
" (df_site.index.get_level_values('x')<=x_last)])\n",
|
||||
" x_segment = segment.index.get_level_values('x')\n",
|
||||
" z_segment = segment.z\n",
|
||||
" f = interpolate.interp1d([x_first,x_last],[z_first,z_last])\n",
|
||||
" z_straight = f(x_segment)\n",
|
||||
"\n",
|
||||
" area = np.trapz(y=z_straight-z_segment, x=x_segment)\n",
|
||||
" length = x_last-x_first\n",
|
||||
" \n",
|
||||
" normalized_curvature = area\n",
|
||||
"# normalized_curvature = area / length\n",
|
||||
" berm_shape.append({'site_id': site_id,\n",
|
||||
" 'prestorm_berm_curvature': normalized_curvature})\n",
|
||||
"\n",
|
||||
"# Convert to dataframe \n",
|
||||
"df_berm_shape = pd.DataFrame(berm_shape)\n",
|
||||
"df_berm_shape = df_berm_shape.set_index('site_id')\n",
|
||||
"\n",
|
||||
"# Join onto our big dataframe\n",
|
||||
"df = df.drop(columns=['prestorm_berm_curvature'], errors='ignore')\n",
|
||||
"df = pd.concat([df, df_berm_shape], axis=1)\n",
|
||||
"\n",
|
||||
"df_berm_shape.head()\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Check wave timeseries\n",
|
||||
"How much does wave height vary alongshore between sites?"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from dateutil.parser import parse\n",
|
||||
"sites = ['NARRA0001', 'NARRA0012', 'NARRA0024']\n",
|
||||
"\n",
|
||||
"fig = plt.figure(\n",
|
||||
" figsize=(6, 4), dpi=150, facecolor='w', edgecolor='k')\n",
|
||||
"ax = fig.add_subplot(111)\n",
|
||||
"\n",
|
||||
"for site_id in sites:\n",
|
||||
" print(site_id)\n",
|
||||
" x = [parse(t) for t in df_waves.xs(site_id,level='site_id').index]\n",
|
||||
" y = df_waves.xs(site_id,level='site_id').Hs\n",
|
||||
" ax.plot(x,y)\n",
|
||||
" \n",
|
||||
"plt.show()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Cumulative sum of available prestorm volume?"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# At each site, determine relationship between height and available volume\n",
|
||||
"data = []\n",
|
||||
"site_ids = df_sites.index.values\n",
|
||||
"for site_id in site_ids:\n",
|
||||
" df_profile = df_profiles.xs([site_id, 'prestorm'],\n",
|
||||
" level=['site_id',\n",
|
||||
" 'profile_type']).dropna(subset=['z'])\n",
|
||||
" x_profile = df_profile.index.get_level_values('x').values\n",
|
||||
" z_profile = df_profile.z.values\n",
|
||||
" \n",
|
||||
" z_vals = np.arange(min(df_profile.z),max(df_profile.z),0.01)\n",
|
||||
" \n",
|
||||
" for z in z_vals:\n",
|
||||
" i_start = np.where((z_profile > z))[0][-1]\n",
|
||||
" x_start = x_profile[i_start]\n",
|
||||
" x_end = x_profile[-1]\n",
|
||||
" mask = (x_start <= x_profile) & (x_profile <= x_end)\n",
|
||||
" vol = np.trapz(z_profile[mask], x=x_profile[mask])\n",
|
||||
" data.append({'site_id': site_id,'z':z,'prestorm_vol':vol})\n",
|
||||
" \n",
|
||||
"df_prestorm_vols_by_z = pd.DataFrame(data)\n",
|
||||
"df_prestorm_vols_by_z = df_prestorm_vols_by_z.set_index(['site_id','z'])\n",
|
||||
"df_prestorm_vols_by_z.head()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"df_twl = twls['forecasted']['preintertidal_slope_sto06']\n",
|
||||
"df_twl['z'] = df_twl.R_high.round(2)\n",
|
||||
"\n",
|
||||
"df_twl = df_twl.join(df_prestorm_vols_by_z, on=['site_id','z'])\n",
|
||||
"df_twl = df_twl.drop(columns=['z'])\n",
|
||||
"\n",
|
||||
"df_site_cum_exposed_vols = df_twl.groupby('site_id').prestorm_vol.sum().to_frame()\n",
|
||||
"df_site_cum_exposed_vols = df_site_cum_exposed_vols.rename({'prestorm_vol':'prestorm_cum_exposed_vol'},axis=1)\n",
|
||||
"\n",
|
||||
"# Join onto main dataframe\n",
|
||||
"df = df.drop(columns=['prestorm_cum_exposed_vol'], errors='ignore')\n",
|
||||
"df = pd.concat([df, df_site_cum_exposed_vols], axis=1)\n",
|
||||
"\n",
|
||||
"df_site_cum_exposed_vols.head()\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# PCA?"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"X[0]"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from sklearn import decomposition\n",
|
||||
"from sklearn.preprocessing import StandardScaler\n",
|
||||
"\n",
|
||||
"target_col = 'swash_pct_change'\n",
|
||||
"training_cols = ['beta_prestorm_mean','beta_prestorm_intertidal','prestorm_dune_vol','prestorm_swash_vol','width_msl_prestorm','Pxscum','prestorm_berm_curvature','prestorm_cum_exposed_vol']\n",
|
||||
"\n",
|
||||
"df_pca = df[training_cols+[target_col]].dropna()\n",
|
||||
"df_pca_data_only = df_pca.drop(target_col,axis=1)\n",
|
||||
"\n",
|
||||
"# input data\n",
|
||||
"X = df_pca_data_only.values\n",
|
||||
"X = StandardScaler().fit_transform(X)\n",
|
||||
"\n",
|
||||
"# target\n",
|
||||
"y = df_pca[target_col]\n",
|
||||
"\n",
|
||||
"# pca\n",
|
||||
"pca = decomposition.PCA(n_components=2)\n",
|
||||
"pca.fit(X)\n",
|
||||
"\n",
|
||||
"X = pca.transform(X)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"fig = plt.figure(\n",
|
||||
" figsize=(6, 4), dpi=150, facecolor='w', edgecolor='k')\n",
|
||||
"ax = fig.add_subplot(111)\n",
|
||||
"\n",
|
||||
"scatter = ax.scatter(\n",
|
||||
" x=X[:,0],\n",
|
||||
" y=X[:,1],\n",
|
||||
" c=y,\n",
|
||||
" s=0.5, \n",
|
||||
" vmin=-1, vmax=0,\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"# ax.set_xlabel(x_col)\n",
|
||||
"# ax.set_ylabel(y_col)\n",
|
||||
"# ax.set_ylim(0,20000)\n",
|
||||
"\n",
|
||||
"cbar = plt.colorbar(scatter)\n",
|
||||
"# cbar.set_label(c_col)\n",
|
||||
"\n",
|
||||
"# ax.grid(True, linestyle=\"--\", alpha=0.2, color='grey', linewidth=1)\n",
|
||||
"\n",
|
||||
"plt.show()\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"df_pca_dims = pd.DataFrame(pca.components_, columns=list(df_pca_data_only.columns))\n",
|
||||
"\n",
|
||||
"df_pca_dims.iloc[0]\n",
|
||||
"# pca.explained_variance_ratio_"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"hide_input": false,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"language": "python",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.7"
|
||||
},
|
||||
"toc": {
|
||||
"base_numbering": 1,
|
||||
"nav_menu": {},
|
||||
"number_sections": true,
|
||||
"sideBar": true,
|
||||
"skip_h1_title": false,
|
||||
"title_cell": "Table of Contents",
|
||||
"title_sidebar": "Contents",
|
||||
"toc_cell": false,
|
||||
"toc_position": {
|
||||
"height": "calc(100% - 180px)",
|
||||
"left": "10px",
|
||||
"top": "150px",
|
||||
"width": "223.594px"
|
||||
},
|
||||
"toc_section_display": true,
|
||||
"toc_window_display": true
|
||||
},
|
||||
"varInspector": {
|
||||
"cols": {
|
||||
"lenName": 16,
|
||||
"lenType": 16,
|
||||
"lenVar": 40
|
||||
},
|
||||
"kernels_config": {
|
||||
"python": {
|
||||
"delete_cmd_postfix": "",
|
||||
"delete_cmd_prefix": "del ",
|
||||
"library": "var_list.py",
|
||||
"varRefreshCmd": "print(var_dic_list())"
|
||||
},
|
||||
"r": {
|
||||
"delete_cmd_postfix": ") ",
|
||||
"delete_cmd_prefix": "rm(",
|
||||
"library": "var_list.r",
|
||||
"varRefreshCmd": "cat(var_dic_list()) "
|
||||
}
|
||||
},
|
||||
"types_to_exclude": [
|
||||
"module",
|
||||
"function",
|
||||
"builtin_function_or_method",
|
||||
"instance",
|
||||
"_Feature"
|
||||
],
|
||||
"window_display": false
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
File diff suppressed because it is too large
@ -1,589 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import matplotlib\n",
|
||||
"import matplotlib.pyplot as plt\n",
|
||||
"import shapely.geometry as sgeom\n",
|
||||
"from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER\n",
|
||||
"import cartopy.feature \n",
|
||||
"import cartopy.crs as ccrs\n",
|
||||
"\n",
|
||||
"import matplotlib.lines as mlines"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Matplotlib default settings\n",
|
||||
"plt.rcParams[\"figure.figsize\"] = (10, 6)\n",
|
||||
"plt.rcParams['axes.grid'] = True\n",
|
||||
"plt.rcParams['grid.alpha'] = 0.3\n",
|
||||
"plt.rcParams['grid.color'] = \"grey\"\n",
|
||||
"plt.rcParams['grid.linestyle'] = \"--\"\n",
|
||||
"plt.rcParams['grid.linewidth'] = 0.5\n",
|
||||
"\n",
|
||||
"# https://stackoverflow.com/a/20709149\n",
|
||||
"matplotlib.rcParams['text.usetex'] = True\n",
|
||||
"matplotlib.rcParams['font.family'] = 'sans-serif'\n",
|
||||
"\n",
|
||||
"matplotlib.rcParams['text.latex.preamble'] = [\n",
|
||||
" r'\\usepackage{siunitx}', # i need upright \\micro symbols, but you need...\n",
|
||||
" r'\\sisetup{detect-all}', # ...this to force siunitx to actually use your fonts\n",
|
||||
" r'\\usepackage[default]{sourcesanspro}',\n",
|
||||
" r'\\usepackage{amsmath}',\n",
|
||||
" r'\\usepackage{sansmath}', # load up the sansmath so that math -> helvet\n",
|
||||
" r'\\sansmath', # <- tricky! -- gotta actually tell tex to use!\n",
|
||||
"]"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def blank_axes(ax):\n",
|
||||
" \"\"\"\n",
|
||||
" blank_axes: blank the extraneous spines and tick marks for an axes\n",
|
||||
"\n",
|
||||
" Input:\n",
|
||||
" ax: a matplotlib Axes object\n",
|
||||
"\n",
|
||||
" Output: None\n",
|
||||
" \"\"\"\n",
|
||||
"\n",
|
||||
"\n",
|
||||
" ax.spines['right'].set_visible(False)\n",
|
||||
" ax.spines['top'].set_visible(False)\n",
|
||||
" ax.spines['bottom'].set_visible(False)\n",
|
||||
" ax.spines['left'].set_visible(False)\n",
|
||||
" ax.yaxis.set_ticks_position('none')\n",
|
||||
" ax.xaxis.set_ticks_position('none')\n",
|
||||
"    ax.tick_params(labelbottom=False, labeltop=False, labelleft=False, labelright=False,\n",
|
||||
"                   bottom=False, top=False, left=False, right=False)\n",
|
||||
"#end blank_axes"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"code_folding": [
|
||||
66,
|
||||
268,
|
||||
296
|
||||
]
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Define figure and axes\n",
"fig, ax1 = plt.subplots(figsize=(5, 5),\n",
" subplot_kw=dict(projection=ccrs.PlateCarree()))\n",
"\n",
"# Inset axes showing overall Australia plot\n",
"ax2 = fig.add_axes([0.14, 0.58, 0.20, 0.15], projection=ccrs.PlateCarree())\n",
"\n",
"# Define extents to our axes\n",
"ax1_extent = [146, 154, -35, -30]\n",
"ax2_extent = [110, 157, -42, -7]\n",
"ax1.set_extent(ax1_extent)\n",
"ax2.set_extent(ax2_extent)\n",
"\n",
"# Add gridlines to ax1\n",
"gl = ax1.gridlines(draw_labels=True, linestyle='--', zorder=2, alpha=0.5)\n",
"gl.xlabels_top = gl.ylabels_right = False\n",
"gl.xformatter = LONGITUDE_FORMATTER\n",
"gl.yformatter = LATITUDE_FORMATTER\n",
"\n",
"# Define features we want to plot\n",
"feat_rivers = cartopy.feature.NaturalEarthFeature(\n",
" 'physical',\n",
" 'rivers_lake_centerlines',\n",
" '10m',\n",
" edgecolor=cartopy.feature.COLORS['water'],\n",
" facecolor='none')\n",
"\n",
"feat_oceans = cartopy.feature.NaturalEarthFeature(\n",
" 'physical', 'ocean', '10m', facecolor=cartopy.feature.COLORS['water'])\n",
"\n",
"feat_borders = cartopy.feature.NaturalEarthFeature(\n",
" 'cultural',\n",
" 'admin_1_states_provinces',\n",
" '10m',\n",
" edgecolor='black',\n",
" facecolor=cartopy.feature.COLORS['land'],\n",
" linewidth=0.5)\n",
"\n",
"# Add features to our plots\n",
"ax1.add_feature(feat_rivers)\n",
"ax1.add_feature(feat_oceans)\n",
"ax1.add_feature(feat_borders)\n",
"ax2.add_feature(feat_oceans)\n",
"ax2.add_feature(feat_borders)\n",
"\n",
"# Plot location box on ax2\n",
"ax1_extent_box = sgeom.box(ax1_extent[0], ax1_extent[2], ax1_extent[1],\n",
" ax1_extent[3])\n",
"ax2.add_geometries([ax1_extent_box],\n",
" ccrs.PlateCarree(),\n",
" color='none',\n",
" edgecolor='r',\n",
" linewidth=2)\n",
"\n",
"# Define marker properties\n",
"marker_edge_width = 1.3\n",
"marker_edge_color = '#ffffff'\n",
"wave_buoy_color = 'red'\n",
"wave_buoy_marker = 'o'\n",
"tide_gauge_color = 'blue'\n",
"tide_gauge_marker = 's'\n",
"beach_color = 'green'\n",
"beach_marker = '^'\n",
"\n",
"# Plot beaches\n",
"# df_sites.groupby('beach').mean()[['lat','lon']].to_dict('index')\n",
"beaches = {\n",
" 'AVOCAn': {\n",
" 'lat': -33.460695367777774,\n",
" 'lon': 151.43853769000003\n",
" },\n",
" 'AVOCAs': {\n",
" 'lat': -33.467647595,\n",
" 'lon': 151.43574445875\n",
" },\n",
" 'BILG': {\n",
" 'lat': -33.645234478,\n",
" 'lon': 151.328779182\n",
" },\n",
" 'BLUEYS': {\n",
" 'lat': -32.35377103,\n",
" 'lon': 152.53584677666666\n",
" },\n",
" 'BOAT': {\n",
" 'lat': -32.43502469599999,\n",
" 'lon': 152.530818656\n",
" },\n",
" 'BOOM': {\n",
" 'lat': -32.34039573142857,\n",
" 'lon': 152.54337415\n",
" },\n",
" 'CATHIE': {\n",
" 'lat': -31.57630510275862,\n",
" 'lon': 152.8433463127586\n",
" },\n",
" 'CRESn': {\n",
" 'lat': -31.12568202392001,\n",
" 'lon': 153.00734157120007\n",
" },\n",
" 'CRESs': {\n",
" 'lat': -31.180938470000008,\n",
" 'lon': 152.97574073\n",
" },\n",
" 'DEEWHYn': {\n",
" 'lat': -33.745759471666666,\n",
" 'lon': 151.3055993875\n",
" },\n",
" 'DEEWHYs': {\n",
" 'lat': -33.751954194999996,\n",
" 'lon': 151.29818175499997\n",
" },\n",
" 'DIAMONDn': {\n",
" 'lat': -32.026216662195125,\n",
" 'lon': 152.55036803634147\n",
" },\n",
" 'DIAMONDs': {\n",
" 'lat': -32.046040624285716,\n",
" 'lon': 152.54134085\n",
" },\n",
" 'DUNBn': {\n",
" 'lat': -31.674815349864858,\n",
" 'lon': 152.81198585391894\n",
" },\n",
" 'DUNBs': {\n",
" 'lat': -31.710181410909083,\n",
" 'lon': 152.79323301090912\n",
" },\n",
" 'ELIZA': {\n",
" 'lat': -32.3298006057143,\n",
" 'lon': 152.53714101142856\n",
" },\n",
" 'ENTRA': {\n",
" 'lat': -33.31609181329114,\n",
" 'lon': 151.5278903848101\n",
" },\n",
" 'FOST': {\n",
" 'lat': -32.17670982666667,\n",
" 'lon': 152.51195243333333\n",
" },\n",
" 'GRANTSn': {\n",
" 'lat': -31.613473751666664,\n",
" 'lon': 152.8381070795833\n",
" },\n",
" 'GRANTSs': {\n",
" 'lat': -31.63005646785714,\n",
" 'lon': 152.83392283714286\n",
" },\n",
" 'HARGn': {\n",
" 'lat': -33.25858048428571,\n",
" 'lon': 151.56334493285718\n",
" },\n",
" 'HARGs': {\n",
" 'lat': -33.26487224142857,\n",
" 'lon': 151.5624840085714\n",
" },\n",
" 'HARR': {\n",
" 'lat': -31.859077996607144,\n",
" 'lon': 152.72314068214285\n",
" },\n",
" 'LHOUSE': {\n",
" 'lat': -32.443838815384616,\n",
" 'lon': 152.52969125769232\n",
" },\n",
" 'LHOUSEn': {\n",
" 'lat': -31.506830332043016,\n",
" 'lon': 152.900197138172\n",
" },\n",
" 'LHOUSEs': {\n",
" 'lat': -31.55095255875001,\n",
" 'lon': 152.85847451375002\n",
" },\n",
" 'MACM': {\n",
" 'lat': -33.494884234375,\n",
" 'lon': 151.42840894187498\n",
" },\n",
" 'MANNING': {\n",
" 'lat': -31.922794031338576,\n",
" 'lon': 152.63626414188988\n",
" },\n",
" 'MONA': {\n",
" 'lat': -33.68342594,\n",
" 'lon': 151.31180166238096\n",
" },\n",
" 'NAMB': {\n",
" 'lat': -30.702570222054792,\n",
" 'lon': 152.99174024657532\n",
" },\n",
" 'NARRA': {\n",
" 'lat': -33.71824857833333,\n",
" 'lon': 151.30161430805555\n",
" },\n",
" 'NINEMn': {\n",
" 'lat': -32.098527227407416,\n",
" 'lon': 152.5245430024074\n",
" },\n",
" 'NINEMs': {\n",
" 'lat': -32.146616644,\n",
" 'lon': 152.50721414266667\n",
" },\n",
" 'NSHORE_n': {\n",
" 'lat': -31.35297012609755,\n",
" 'lon': 152.94414099536587\n",
" },\n",
" 'NSHORE_s': {\n",
" 'lat': -31.4042148925,\n",
" 'lon': 152.91674769522717\n",
" },\n",
" 'OLDBAR': {\n",
" 'lat': -31.981825014722215,\n",
" 'lon': 152.58157028555553\n",
" },\n",
" 'ONEMILE': {\n",
" 'lat': -32.19014868,\n",
" 'lon': 152.53698099153846\n",
" },\n",
" 'PEARLn': {\n",
" 'lat': -33.5394179,\n",
" 'lon': 151.310494964\n",
" },\n",
" 'PEARLs': {\n",
" 'lat': -33.543258066,\n",
" 'lon': 151.30794061\n",
" },\n",
" 'SCOT': {\n",
" 'lat': -30.740275808333333,\n",
" 'lon': 152.99018976333335\n",
" },\n",
" 'STOCNn': {\n",
" 'lat': -32.78820750815384,\n",
" 'lon': 152.0395944421538\n",
" },\n",
" 'STOCNs': {\n",
" 'lat': -32.833099094162684,\n",
" 'lon': 151.9039352245933\n",
" },\n",
" 'STOCS': {\n",
" 'lat': -32.8965449047826,\n",
" 'lon': 151.79411199869566\n",
" },\n",
" 'STUART': {\n",
" 'lat': -30.835545341910105,\n",
" 'lon': 153.00643798999994\n",
" },\n",
" 'SWRO': {\n",
" 'lat': -30.885526112307694,\n",
" 'lon': 153.05837861230768\n",
" },\n",
" 'TREACH': {\n",
" 'lat': -32.454167825000006,\n",
" 'lon': 152.508508009375\n",
" },\n",
" 'WAMBE': {\n",
" 'lat': -33.43660858444444,\n",
" 'lon': 151.445516972963\n",
" }\n",
"}\n",
"\n",
"for beach in beaches:\n",
" ax1.plot(beaches[beach]['lon'],\n",
" beaches[beach]['lat'],\n",
" color=beach_color,\n",
" marker=beach_marker,\n",
" markeredgewidth=marker_edge_width-0.5,\n",
" markeredgecolor='#000000',\n",
" transform=ccrs.Geodetic())\n",
"\n",
"\n",
"# Add wave buoys\n",
"wave_buoys = [\n",
" {\n",
" 'name': 'Sydney',\n",
" 'lat': -33.77166667,\n",
" 'lon': 151.40861111\n",
" },\n",
" {\n",
" 'name': 'Crowdy Head',\n",
" 'lat': -31.81388889,\n",
" 'lon': 152.85611111\n",
" },\n",
" {\n",
" 'name': 'Coffs Harbour',\n",
" 'lat': -30.36250000,\n",
" 'lon': 153.26916667\n",
" },\n",
"]\n",
"\n",
"for wave_buoy in wave_buoys:\n",
" ax1.plot(wave_buoy['lon'],\n",
" wave_buoy['lat'],\n",
" color=wave_buoy_color,\n",
" marker=wave_buoy_marker,\n",
" markeredgewidth=marker_edge_width,\n",
" markeredgecolor=marker_edge_color,\n",
" transform=ccrs.Geodetic())\n",
"\n",
"# Add tide gauges\n",
"tide_gauges = [\n",
" {\n",
" 'name': 'HMAS Penguin',\n",
" 'lat': -33.82546,\n",
" 'lon': 151.25853\n",
" },\n",
" {\n",
" 'name': 'Patonga',\n",
" 'lat': -33.55098,\n",
" 'lon': 151.27461\n",
" },\n",
" {\n",
" 'name': 'Shoal Bay',\n",
" 'lat': -32.71967,\n",
" 'lon': 152.17565\n",
" },\n",
" {\n",
" 'name': 'Forster',\n",
" 'lat': -32.17398,\n",
" 'lon': 152.50820\n",
" },\n",
" {\n",
" 'name': 'Crowdy Head',\n",
" 'lat': -31.83870,\n",
" 'lon': 152.75001\n",
" },\n",
" {\n",
" 'name': 'Port Macquarie',\n",
" 'lat': -31.42682,\n",
" 'lon': 152.91112\n",
" },\n",
" {\n",
" 'name': 'Coffs Harbour',\n",
" 'lat': -30.30286,\n",
" 'lon': 153.14614\n",
" },\n",
"]\n",
"\n",
"for tide_gauge in tide_gauges:\n",
" ax1.plot(tide_gauge['lon'],\n",
" tide_gauge['lat'],\n",
" color=tide_gauge_color,\n",
" marker=tide_gauge_marker,\n",
" markeredgewidth=marker_edge_width,\n",
" markeredgecolor=marker_edge_color,\n",
" transform=ccrs.Geodetic())\n",
"\n",
"\n",
"\n",
"# Prepare legend\n",
"legend_buoy = mlines.Line2D([], [],\n",
" color=wave_buoy_color,\n",
" marker=wave_buoy_marker,\n",
" markersize=5,\n",
" linestyle=\"None\",\n",
" markeredgewidth=marker_edge_width,\n",
" markeredgecolor=marker_edge_color,\n",
" label='Wave buoys')\n",
"\n",
"legend_gauge = mlines.Line2D([], [],\n",
" color=tide_gauge_color,\n",
" marker=tide_gauge_marker,\n",
" markersize=5,\n",
" linestyle=\"None\",\n",
" markeredgewidth=marker_edge_width,\n",
" markeredgecolor=marker_edge_color,\n",
" label='Tide gauges')\n",
"\n",
"legend_beaches = mlines.Line2D([], [],\n",
" color=beach_color,\n",
" marker=beach_marker,\n",
" markersize=5,\n",
" linestyle=\"None\",\n",
" markeredgewidth=marker_edge_width-0.5,\n",
" markeredgecolor='#000000',\n",
" label='Beaches included')\n",
"\n",
"handles = [legend_buoy, legend_gauge, legend_beaches]\n",
"names = ['Wave buoys', 'Tide gauges', 'Surveyed\\nbeaches']\n",
"\n",
"# create legend\n",
"ax1.legend(handles, names, title=r'\\underline{Legend}', loc='lower left')\n",
"\n",
"# Add landmarks\n",
"ax1.text(151.204325-0.1, -33.869810, r'\\textsc{Sydney}', transform=ccrs.Geodetic(),ha='right',zorder=4)\n",
"ax1.text(151.784937-0.1, -32.928103, r'\\textsc{Newcastle}', transform=ccrs.Geodetic(),ha='right',zorder=4)\n",
"ax1.text(152.909329-0.1, -31.440207, r'\\textsc{Port Macquarie}', transform=ccrs.Geodetic(),ha='right',zorder=4)\n",
"ax1.text(153.111704-0.1, -30.300466, r'\\textsc{Coffs Harbour}', transform=ccrs.Geodetic(),ha='right',zorder=4)\n",
"ax1.text(150.891708-0.1, -34.433129, r'\\textsc{Wollongong}', transform=ccrs.Geodetic(),ha='right',zorder=4)\n",
"\n",
"ax1.plot(151.204325, -33.869810, transform=ccrs.Geodetic(),zorder=3,color='k',marker='.')\n",
"ax1.plot(151.784937, -32.928103, transform=ccrs.Geodetic(),zorder=3,color='k',marker='.')\n",
"ax1.plot(152.909329, -31.440207, transform=ccrs.Geodetic(),zorder=3,color='k',marker='.')\n",
"ax1.plot(153.111704, -30.300466, transform=ccrs.Geodetic(),zorder=3,color='k',marker='.')\n",
"ax1.plot(150.891708, -34.433129,transform=ccrs.Geodetic(),zorder=3,color='k',marker='.')\n",
"\n",
"\n",
"ax2.text(133.729975, -25.173095, r'\\textsc{Australia}', transform=ccrs.Geodetic(),ha='center',zorder=4,va='bottom', fontsize=6, bbox=dict(facecolor=cartopy.feature.COLORS['land'],pad=0.1,linewidth=0, alpha=0.9))\n",
"\n",
"# # Add inset for Narrabeen\n",
"# ax3 = fig.add_axes([0.7, 0.28, 0.2, 0.3], projection=ccrs.PlateCarree())\n",
"# ax3_extent = [151.296915, 151.316252, -33.739274, -33.702466]\n",
"# # ax3_extent = [151.296915, 151.32, -33.739274, -33.68]\n",
"# ax3.set_extent(ax3_extent)\n",
"# # ax3.add_feature(feat_oceans)\n",
"# # ax3.add_feature(feat_borders)\n",
"\n",
"\n",
"fig.savefig('07_c&p_locality.png',dpi=600,bbox_inches = \"tight\", pad_inches=0.01)\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# # Try using overpass api\n",
"# # Hard because coastline is given as line string, not shapefile.\n",
"\n",
"# import overpass\n",
"# api = overpass.API()\n",
"# response = api.get('way[\"natural\"=\"coastline\"](-34, 151.0, -33, 152);out geom;')\n",
"# coords = [x['geometry']['coordinates'] for x in response['features']]\n",
"\n",
"\n",
"# for line in coords:\n",
"# lats = [x[1] for x in line]\n",
"# lons = [x[0] for x in line]\n",
"# ax3.plot(lons,lats, transform=ccrs.Geodetic())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"###"
]
}
],
"metadata": {
"hide_input": false,
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.7"
},
"toc": {
"base_numbering": 1,
"nav_menu": {},
"number_sections": true,
"sideBar": true,
"skip_h1_title": false,
"title_cell": "Table of Contents",
"title_sidebar": "Contents",
"toc_cell": false,
"toc_position": {},
"toc_section_display": true,
"toc_window_display": false
},
"varInspector": {
"cols": {
"lenName": 16,
"lenType": 16,
"lenVar": 40
},
"kernels_config": {
"python": {
"delete_cmd_postfix": "",
"delete_cmd_prefix": "del ",
"library": "var_list.py",
"varRefreshCmd": "print(var_dic_list())"
},
"r": {
"delete_cmd_postfix": ") ",
"delete_cmd_prefix": "rm(",
"library": "var_list.r",
"varRefreshCmd": "cat(var_dic_list()) "
}
},
"types_to_exclude": [
"module",
"function",
"builtin_function_or_method",
"instance",
"_Feature"
],
"window_display": false
}
},
"nbformat": 4,
"nbformat_minor": 2
}
File diff suppressed because one or more lines are too long
@ -1,121 +0,0 @@
{
"type": "FeatureCollection",
"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
"features": [
{ "type": "Feature", "properties": { "area": 7.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.312088315737213, -33.731759753314719 ], [ 151.285874532141236, -33.751609891372972 ], [ 151.297531748333455, -33.762252811053656 ], [ 151.323745531929404, -33.742405136585639 ], [ 151.312088315737213, -33.731759753314719 ] ] ] } },
{ "type": "Feature", "properties": { "area": 15.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.713316517914194, -34.995462365421304 ], [ 150.659281664445359, -35.030600787434174 ], [ 150.670674540619729, -35.042349525480454 ], [ 150.724709394088478, -35.007216152925253 ], [ 150.713316517914194, -34.995462365421304 ] ] ] } },
{ "type": "Feature", "properties": { "area": 70.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.82981739277065, -35.018834239509481 ], [ 150.784681201002428, -34.897414824136412 ], [ 150.749617385613391, -34.906176096351778 ], [ 150.794753577381641, -35.027582539642225 ], [ 150.82981739277065, -35.018834239509481 ] ] ] } },
{ "type": "Feature", "properties": { "area": 89.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.782495693468775, -34.759702105265937 ], [ 150.72399620090647, -34.893522285583018 ], [ 150.763036455392097, -34.905012361594544 ], [ 150.821535947954374, -34.771210869337644 ], [ 150.782495693468775, -34.759702105265937 ] ] ] } },
{ "type": "Feature", "properties": { "area": 6.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.754206923578749, -32.956209566436968 ], [ 151.744701746269357, -32.94977700104937 ], [ 151.718177158233004, -32.977371201523567 ], [ 151.727682335542426, -32.983801757994669 ], [ 151.754206923578749, -32.956209566436968 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.793569233070116, -32.93097358298138 ], [ 151.807064777797081, -32.918386849090048 ], [ 151.799100500026384, -32.912369673799752 ], [ 151.785604955299391, -32.924957263513647 ], [ 151.793569233070116, -32.93097358298138 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.331729438717161, -35.602436909218632 ], [ 150.317524165859396, -35.610124264056566 ], [ 150.323184684185208, -35.617037952137039 ], [ 150.337389957042944, -35.609351261642296 ], [ 150.331729438717161, -35.602436909218632 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.5 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.312737289062994, -33.673730230020134 ], [ 151.303256152301003, -33.689094759163098 ], [ 151.311864609684051, -33.692772702173606 ], [ 151.321345746445985, -33.677408830427957 ], [ 151.312737289062994, -33.673730230020134 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.302340369031612, -33.697192592485045 ], [ 151.290758794763605, -33.724705632327918 ], [ 151.302824314848607, -33.728219493623143 ], [ 151.314405889116614, -33.70070757978894 ], [ 151.302340369031612, -33.697192592485045 ] ] ] } },
{ "type": "Feature", "properties": { "area": 6.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.288799030402345, -33.72209486483851 ], [ 151.305146039351598, -33.746796602660645 ], [ 151.317975972497834, -33.74092525514758 ], [ 151.301628963548637, -33.716221826848752 ], [ 151.288799030402345, -33.72209486483851 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.320261262580914, -33.645116643437291 ], [ 151.3177204166509, -33.661984494473529 ], [ 151.328458288803034, -33.663105117045426 ], [ 151.330999134733048, -33.646237485670866 ], [ 151.320261262580914, -33.645116643437291 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.074261832544352, -34.162722562976754 ], [ 151.057630601818232, -34.172219525791412 ], [ 151.063182321393157, -34.178874650241639 ], [ 151.079813552119219, -34.169378436289875 ], [ 151.074261832544352, -34.162722562976754 ] ] ] } },
{ "type": "Feature", "properties": { "area": 3.8 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.43210802028446, -33.484173036709436 ], [ 151.416905970657183, -33.499875874069183 ], [ 151.426668105505371, -33.506448003920248 ], [ 151.441870155132705, -33.490746358598834 ], [ 151.43210802028446, -33.484173036709436 ] ] ] } },
{ "type": "Feature", "properties": { "area": 3.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.174763918244281, -32.733121204831683 ], [ 152.16264721578915, -32.747986274536238 ], [ 152.173306583138185, -32.754132692436642 ], [ 152.185423285593288, -32.739268648247318 ], [ 152.174763918244281, -32.733121204831683 ] ] ] } },
{ "type": "Feature", "properties": { "area": 97.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.413338437951694, -27.108708653600289 ], [ 153.413148249484067, -27.329433322644011 ], [ 153.44471518705285, -27.32945481062492 ], [ 153.444905375520477, -27.108730184201317 ], [ 153.413338437951694, -27.108708653600289 ] ] ] } },
{ "type": "Feature", "properties": { "area": 12.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.460770202983269, -27.033389132085823 ], [ 153.433268224716443, -27.115380612976537 ], [ 153.443400448935051, -27.118074144165949 ], [ 153.47090242720185, -27.036084634360744 ], [ 153.460770202983269, -27.033389132085823 ] ] ] } },
{ "type": "Feature", "properties": { "area": 68.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.530316494803571, -27.429417447915331 ], [ 153.443870597177948, -27.671920174676099 ], [ 153.462182581481869, -27.677045593127747 ], [ 153.548628479107521, -27.434554197017 ], [ 153.530316494803571, -27.429417447915331 ] ] ] } },
{ "type": "Feature", "properties": { "area": 4.5 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.835221877751763, -34.723150825111716 ], [ 150.825540622749344, -34.745468211607609 ], [ 150.837476596295346, -34.748964552481603 ], [ 150.847157851297766, -34.726648110362753 ], [ 150.835221877751763, -34.723150825111716 ] ] ] } },
{ "type": "Feature", "properties": { "area": 10.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.400307922491692, -35.511343157567133 ], [ 150.374180854699148, -35.536034882572153 ], [ 150.390807215784804, -35.54768563228631 ], [ 150.416934283577348, -35.522997492858792 ], [ 150.400307922491692, -35.511343157567133 ] ] ] } },
{ "type": "Feature", "properties": { "area": 34.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.660304226198974, -35.148589383322879 ], [ 150.548965076403078, -35.18970767243038 ], [ 150.558394400612002, -35.206763227668674 ], [ 150.669733550407926, -35.165653567006572 ], [ 150.660304226198974, -35.148589383322879 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.299008238748627, -33.760853263475894 ], [ 151.285727242147232, -33.773907940484506 ], [ 151.293904190680962, -33.779656046795111 ], [ 151.307185187282386, -33.766602245600147 ], [ 151.299008238748627, -33.760853263475894 ] ] ] } },
{ "type": "Feature", "properties": { "area": 13.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.467589427210243, -33.3981576366116 ], [ 151.430515110935801, -33.440988361088309 ], [ 151.444222661980405, -33.449251949841653 ], [ 151.481296978254846, -33.406425303049978 ], [ 151.467589427210243, -33.3981576366116 ] ] ] } },
{ "type": "Feature", "properties": { "area": 10.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.492481980756565, -33.388830711766673 ], [ 151.514931887618161, -33.353711788212586 ], [ 151.499082629565265, -33.346643907113013 ], [ 151.47663272270367, -33.381765683263637 ], [ 151.492481980756565, -33.388830711766673 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.432060293458051, -33.450903174846722 ], [ 151.425837629068894, -33.473643483227704 ], [ 151.442416703322664, -33.476800427186625 ], [ 151.44863936771182, -33.454060947085466 ], [ 151.432060293458051, -33.450903174846722 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.128454335981218, -32.756434077446798 ], [ 152.107032328151519, -32.77771235105596 ], [ 152.117664040466764, -32.785279327103886 ], [ 152.139086048296463, -32.764002862657406 ], [ 152.128454335981218, -32.756434077446798 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.992858478477103, -34.222065914566308 ], [ 150.980484391950597, -34.233613466489345 ], [ 150.987672888643345, -34.238878808862523 ], [ 151.00004697516988, -34.227331978977027 ], [ 150.992858478477103, -34.222065914566308 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.5 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.868139213834638, -34.578859438343358 ], [ 150.870466995898141, -34.594297279017439 ], [ 150.881138687822556, -34.59320675445516 ], [ 150.878810905759053, -34.577768711165369 ], [ 150.868139213834638, -34.578859438343358 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.23774933826914, -35.696663359987745 ], [ 150.206496383273588, -35.703563551310324 ], [ 150.209761317646951, -35.713314777259207 ], [ 150.241014272642474, -35.706415429932989 ], [ 150.23774933826914, -35.696663359987745 ] ] ] } },
{ "type": "Feature", "properties": { "area": 10.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.919208551841905, -34.367451752847487 ], [ 150.890723123788263, -34.408216967905773 ], [ 150.904065662130591, -34.414564469699961 ], [ 150.932551090184205, -34.37380234652025 ], [ 150.919208551841905, -34.367451752847487 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.177953326371522, -34.027823044343982 ], [ 151.151637938977444, -34.042789945329602 ], [ 151.159042252526689, -34.051728939184819 ], [ 151.185357639920795, -34.036763615632275 ], [ 151.177953326371522, -34.027823044343982 ] ] ] } },
{ "type": "Feature", "properties": { "area": 3.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.852783610351395, -34.648996763747945 ], [ 150.848035020178543, -34.667436635608318 ], [ 150.858816091310928, -34.66931486892355 ], [ 150.863564681483751, -34.650875415032282 ], [ 150.852783610351395, -34.648996763747945 ] ] ] } },
{ "type": "Feature", "properties": { "area": 10.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.866290906816261, -34.64824397510219 ], [ 150.865998205884608, -34.609622998491943 ], [ 150.848731050674445, -34.609711624926199 ], [ 150.849023751606126, -34.64833256028971 ], [ 150.866290906816261, -34.64824397510219 ] ] ] } },
{ "type": "Feature", "properties": { "area": 16.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.537171398065539, -35.240899147147758 ], [ 150.469500903700748, -35.28600858861622 ], [ 150.479084565356999, -35.295590132350817 ], [ 150.546755059721789, -35.250486026965199 ], [ 150.537171398065539, -35.240899147147758 ] ] ] } },
{ "type": "Feature", "properties": { "area": 6.8 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.544073559347197, -35.203528793443816 ], [ 150.5207862784969, -35.231560653473863 ], [ 150.53177040586354, -35.237649731904568 ], [ 150.555057686713837, -35.209619975265987 ], [ 150.544073559347197, -35.203528793443816 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.475747870590709, -35.318568368450585 ], [ 150.466368168372952, -35.342059891646727 ], [ 150.479935678068614, -35.345664900983671 ], [ 150.489315380286371, -35.322174425692566 ], [ 150.475747870590709, -35.318568368450585 ] ] ] } },
{ "type": "Feature", "properties": { "area": 6.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.472540710877098, -35.286223324588846 ], [ 150.46403930846202, -35.319442286186323 ], [ 150.476042474872258, -35.321487798134278 ], [ 150.484543877287336, -35.288269676516023 ], [ 150.472540710877098, -35.286223324588846 ] ] ] } },
{ "type": "Feature", "properties": { "area": 4.8 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.451576703458244, -35.398277879408674 ], [ 150.474529807351246, -35.376220494514904 ], [ 150.4662383322067, -35.370484843164647 ], [ 150.44328522831367, -35.392543796192463 ], [ 150.451576703458244, -35.398277879408674 ] ] ] } },
{ "type": "Feature", "properties": { "area": 3.5 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.389159156614198, -35.457510983492647 ], [ 150.384667278027962, -35.477729819465921 ], [ 150.395675315948381, -35.479351796783376 ], [ 150.400167194534589, -35.459133368650519 ], [ 150.389159156614198, -35.457510983492647 ] ] ] } },
{ "type": "Feature", "properties": { "area": 11.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.43379877699914, -35.401488318549482 ], [ 150.400353254564749, -35.441127277830027 ], [ 150.413079197965175, -35.448255830299622 ], [ 150.446524720399566, -35.408620379762034 ], [ 150.43379877699914, -35.401488318549482 ] ] ] } },
{ "type": "Feature", "properties": { "area": 21.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.917548523435613, -34.304819382693033 ], [ 150.908247834528225, -34.37026769055209 ], [ 150.929527427076238, -34.372328700075208 ], [ 150.938828115983625, -34.306882001120272 ], [ 150.917548523435613, -34.304819382693033 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.90284805988415, -34.417857451184368 ], [ 150.892508822208896, -34.448457728019264 ], [ 150.90321746672609, -34.450918632285394 ], [ 150.913556704401316, -34.420319256687456 ], [ 150.90284805988415, -34.417857451184368 ] ] ] } },
{ "type": "Feature", "properties": { "area": 44.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.889071031435236, -34.480176690135039 ], [ 150.844543135993348, -34.570952129712154 ], [ 150.872632674398659, -34.580299038007595 ], [ 150.917160569840547, -34.489533792590507 ], [ 150.889071031435236, -34.480176690135039 ] ] ] } },
{ "type": "Feature", "properties": { "area": 80.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.555799979535323, -31.986388186784922 ], [ 152.486296037493844, -32.171328985919544 ], [ 152.513157941717736, -32.178569153204869 ], [ 152.582661883759243, -31.993643018355709 ], [ 152.555799979535323, -31.986388186784922 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.53037060570324, -32.181497416722777 ], [ 152.533153539421079, -32.200026500473513 ], [ 152.543707221023539, -32.198891388772367 ], [ 152.5409242873057, -32.180362073916889 ], [ 152.53037060570324, -32.181497416722777 ] ] ] } },
{ "type": "Feature", "properties": { "area": 31.8 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.539144069727456, -32.434911077595139 ], [ 152.398835306988559, -32.481403036214417 ], [ 152.405051144980035, -32.494754363619386 ], [ 152.545359907718904, -32.448269298797904 ], [ 152.539144069727456, -32.434911077595139 ] ] ] } },
{ "type": "Feature", "properties": { "area": 42.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.92605818535435, -31.464963546930857 ], [ 152.824182625057375, -31.582971262219715 ], [ 152.840309317593665, -31.59308042968663 ], [ 152.942184877890611, -31.475085495469632 ], [ 152.92605818535435, -31.464963546930857 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.283496501484308, -33.884964022501869 ], [ 151.267561779878832, -33.891536572603947 ], [ 151.272372881127467, -33.899573806040685 ], [ 151.288307602732914, -33.89300187529237 ], [ 151.283496501484308, -33.884964022501869 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.8 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.51010258022464, -32.426042676288631 ], [ 152.523549798614482, -32.438118247410102 ], [ 152.531870264858441, -32.431517809741749 ], [ 152.518423046468541, -32.41944135473048 ], [ 152.51010258022464, -32.426042676288631 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.076343551681987, -30.895426732993084 ], [ 153.076669333571289, -30.910919963588146 ], [ 153.086732912811584, -30.910764182723778 ], [ 153.086407130922254, -30.895270926912666 ], [ 153.076343551681987, -30.895426732993084 ] ] ] } },
{ "type": "Feature", "properties": { "area": 7.5 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.956880530902055, -31.282790655403996 ], [ 152.961857409043546, -31.317475697513601 ], [ 152.976551137513439, -31.315936652109194 ], [ 152.971574259371948, -31.281251043429016 ], [ 152.956880530902055, -31.282790655403996 ] ] ] } },
{ "type": "Feature", "properties": { "area": 14.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.967143255501924, -31.190130135878658 ], [ 152.95044085987476, -31.251783625329757 ], [ 152.965454344596196, -31.254757096527101 ], [ 152.982156740223331, -31.193105547152246 ], [ 152.967143255501924, -31.190130135878658 ] ] ] } },
{ "type": "Feature", "properties": { "area": 34.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.958174815931926, -31.313073747471105 ], [ 152.898225469273257, -31.421512768429412 ], [ 152.916078780804099, -31.42870413260334 ], [ 152.976028127462826, -31.320273414519619 ], [ 152.958174815931926, -31.313073747471105 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.291411655635102, -29.777390920898217 ], [ 153.279270692719336, -29.808634976360967 ], [ 153.290739128839391, -29.811990653624473 ], [ 153.302880091755156, -29.780747646070473 ], [ 153.291411655635102, -29.777390920898217 ] ] ] } },
{ "type": "Feature", "properties": { "area": 9.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.002670690005942, -30.596746771019944 ], [ 153.008240612634779, -30.643680962889054 ], [ 153.021683079498132, -30.642499824428402 ], [ 153.016113156869267, -30.595565059769843 ], [ 153.002670690005942, -30.596746771019944 ] ] ] } },
{ "type": "Feature", "properties": { "area": 42.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.035452968025766, -31.068837766851505 ], [ 152.960647737790424, -31.183002382806269 ], [ 152.980242822040964, -31.192404932935823 ], [ 153.055048052276334, -31.078251638594566 ], [ 153.035452968025766, -31.068837766851505 ] ] ] } },
{ "type": "Feature", "properties": { "area": 23.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.620559111548914, -28.634957852435601 ], [ 153.602392492208764, -28.709684798174788 ], [ 153.623586201543361, -28.713649532703943 ], [ 153.641752820883511, -28.638925415908989 ], [ 153.620559111548914, -28.634957852435601 ] ] ] } },
{ "type": "Feature", "properties": { "area": 11.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.059140948468951, -36.455373508281859 ], [ 150.043755719573483, -36.498451480300211 ], [ 150.059834424865443, -36.50216328336176 ], [ 150.075219653760882, -36.459087375298118 ], [ 150.059140948468951, -36.455373508281859 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.058733516160913, -36.416110908675719 ], [ 150.070363379980819, -36.428793446985765 ], [ 150.079189782968655, -36.42355313070491 ], [ 150.067559919148749, -36.410869736483811 ], [ 150.058733516160913, -36.416110908675719 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.526730241278472, -32.357216650715174 ], [ 152.53745246923836, -32.36245977902913 ], [ 152.556117454533648, -32.335221266052457 ], [ 152.545395226573788, -32.329976558887907 ], [ 152.526730241278472, -32.357216650715174 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.525607334974211, -32.373071857191974 ], [ 152.509496939263869, -32.402532938564228 ], [ 152.520287857935244, -32.406739958767311 ], [ 152.536398253645586, -32.377280249873458 ], [ 152.525607334974211, -32.373071857191974 ] ] ] } },
{ "type": "Feature", "properties": { "area": 26.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.546255812381759, -32.224807690683399 ], [ 152.506809074804153, -32.311036727505233 ], [ 152.524875227170412, -32.316942711017639 ], [ 152.564321964748046, -32.230719289367975 ], [ 152.546255812381759, -32.224807690683399 ] ] ] } },
{ "type": "Feature", "properties": { "area": 21.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.688641145786477, -31.869860751726847 ], [ 152.598239461321839, -31.946508238385423 ], [ 152.607858251424858, -31.954679852445107 ], [ 152.698259935889467, -31.878039175882968 ], [ 152.688641145786477, -31.869860751726847 ] ] ] } },
{ "type": "Feature", "properties": { "area": 7.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.365029412539741, -29.432636998586727 ], [ 153.348901389133403, -29.469105332821144 ], [ 153.361378396834908, -29.473288479835848 ], [ 153.377506420241247, -29.436821649350204 ], [ 153.365029412539741, -29.432636998586727 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.5 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.694519103595951, -35.066258225038425 ], [ 150.685571715218913, -35.080943375821327 ], [ 150.694486093373769, -35.084580908178324 ], [ 150.703433481750835, -35.06989641208434 ], [ 150.694519103595951, -35.066258225038425 ] ] ] } },
{ "type": "Feature", "properties": { "area": 4.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.02790527656353, -30.498239435406575 ], [ 153.02269325296129, -30.526808822712351 ], [ 153.032375488760806, -30.528119643279723 ], [ 153.037587512363046, -30.499550641238759 ], [ 153.02790527656353, -30.498239435406575 ] ] ] } },
{ "type": "Feature", "properties": { "area": 15.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.557549872444213, -28.593052922738785 ], [ 153.606537025181154, -28.648923265351776 ], [ 153.61853611694562, -28.64081834007602 ], [ 153.569548964208707, -28.584943684106371 ], [ 153.557549872444213, -28.593052922738785 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.220334942658525, -35.807510329173525 ], [ 150.222822268469201, -35.821228866610518 ], [ 150.23378200010049, -35.819922263351017 ], [ 150.231294674289842, -35.80620350014803 ], [ 150.220334942658525, -35.807510329173525 ] ] ] } },
{ "type": "Feature", "properties": { "area": 14.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.174137297509276, -35.823516378914633 ], [ 150.164961041884709, -35.860779282120802 ], [ 150.18919494538045, -35.864699843517201 ], [ 150.198371201005017, -35.827438782641579 ], [ 150.174137297509276, -35.823516378914633 ] ] ] } },
{ "type": "Feature", "properties": { "area": 16.5 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.165777759973736, -35.851035862090335 ], [ 150.136320261082119, -35.903445763990206 ], [ 150.153398594755259, -35.909745629907285 ], [ 150.182856093646876, -35.857339897677299 ], [ 150.165777759973736, -35.851035862090335 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.100969921923877, -30.351490298601746 ], [ 153.094184020450314, -30.379144909540003 ], [ 153.10844080816068, -30.381748871852658 ], [ 153.115226709634214, -30.354094997407792 ], [ 153.100969921923877, -30.351490298601746 ] ] ] } },
{ "type": "Feature", "properties": { "area": 8.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 149.96071729378852, -37.245131286879975 ], [ 149.932686679588727, -37.281901182844848 ], [ 149.944112056833433, -37.287416382600348 ], [ 149.972142671033225, -37.250649180212328 ], [ 149.96071729378852, -37.245131286879975 ] ] ] } },
{ "type": "Feature", "properties": { "area": 68.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.393296291402436, -32.471689777403078 ], [ 152.263166316201023, -32.582622747197064 ], [ 152.284331661566995, -32.600259831098413 ], [ 152.414461636768436, -32.489348657441234 ], [ 152.393296291402436, -32.471689777403078 ] ] ] } },
{ "type": "Feature", "properties": { "area": 10.5 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.566955606277105, -28.31350633644356 ], [ 153.566994043912956, -28.364403154087459 ], [ 153.581598175587715, -28.364394612160908 ], [ 153.581559737951864, -28.313497790423689 ], [ 153.566955606277105, -28.31350633644356 ] ] ] } },
{ "type": "Feature", "properties": { "area": 3.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.323944811235833, -35.629359198715811 ], [ 150.301016341957137, -35.632969353719176 ], [ 150.303469086492498, -35.643259322803935 ], [ 150.326397555771166, -35.639649632584238 ], [ 150.323944811235833, -35.629359198715811 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.301788601143869, -35.633672046988373 ], [ 150.287381175450633, -35.658743461891603 ], [ 150.298237219991961, -35.662862412603111 ], [ 150.312644645685225, -35.637792290534193 ], [ 150.301788601143869, -35.633672046988373 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.200993450929104, -35.695762007514091 ], [ 150.185468113659539, -35.703589551082267 ], [ 150.190869243704554, -35.71065377661219 ], [ 150.20639458097412, -35.702826926616176 ], [ 150.200993450929104, -35.695762007514091 ] ] ] } },
{ "type": "Feature", "properties": { "area": 11.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.007443240734915, -36.637572552567505 ], [ 149.983151534874082, -36.680951259884239 ], [ 149.997607962199652, -36.686159259677382 ], [ 150.021899668060485, -36.642783488017528 ], [ 150.007443240734915, -36.637572552567505 ] ] ] } },
{ "type": "Feature", "properties": { "area": 4.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 149.947123319521353, -37.355907479046856 ], [ 149.944581160759611, -37.379231991707378 ], [ 149.955734593519367, -37.379999704917353 ], [ 149.958276752281108, -37.356675430961559 ], [ 149.947123319521353, -37.355907479046856 ] ] ] } },
{ "type": "Feature", "properties": { "area": 15.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.12589147682732, -36.100705379899267 ], [ 150.116044689846262, -36.164782735637523 ], [ 150.131184238173063, -36.166299684288582 ], [ 150.14103102515412, -36.102223567670634 ], [ 150.12589147682732, -36.100705379899267 ] ] ] } },
{ "type": "Feature", "properties": { "area": 7.8 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.150895485010807, -36.013124875496636 ], [ 150.133328495567525, -36.053523154670351 ], [ 150.144559457025167, -36.056716009660974 ], [ 150.162126446468449, -36.01631936858422 ], [ 150.150895485010807, -36.013124875496636 ] ] ] } },
{ "type": "Feature", "properties": { "area": 68.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.772273056596134, -31.717898691811367 ], [ 152.712427680871201, -31.836168983864475 ], [ 152.746026128712913, -31.848446516127446 ], [ 152.805871504437874, -31.73019193633041 ], [ 152.772273056596134, -31.717898691811367 ] ] ] } },
{ "type": "Feature", "properties": { "area": 7.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.989493574791879, -30.760048806130182 ], [ 152.991879197907707, -30.798180419014635 ], [ 153.004503644948954, -30.797597539682879 ], [ 153.002118021833127, -30.759465695700012 ], [ 152.989493574791879, -30.760048806130182 ] ] ] } },
{ "type": "Feature", "properties": { "area": 13.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.277794846295905, -29.82235010198044 ], [ 153.253959742910183, -29.876738038856626 ], [ 153.269601929992376, -29.881893368777018 ], [ 153.293437033378098, -29.827508241151222 ], [ 153.277794846295905, -29.82235010198044 ] ] ] } },
{ "type": "Feature", "properties": { "area": 7.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.254229131824076, -29.874447999881987 ], [ 153.267566479566426, -29.901857513639534 ], [ 153.282688589948748, -29.896326931200122 ], [ 153.269351242206426, -29.868915896712689 ], [ 153.254229131824076, -29.874447999881987 ] ] ] } },
{ "type": "Feature", "properties": { "area": 14.8 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.344637190483979, -29.575464868401184 ], [ 153.369602543233555, -29.513083544948621 ], [ 153.354714391912239, -29.508572460817479 ], [ 153.329749039162692, -29.570956567015674 ], [ 153.344637190483979, -29.575464868401184 ] ] ] } },
{ "type": "Feature", "properties": { "area": 3.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.362737747409909, -29.477445405187137 ], [ 153.350662031579162, -29.497432300536605 ], [ 153.36074039013414, -29.502045538668089 ], [ 153.372816105964887, -29.482059553487954 ], [ 153.362737747409909, -29.477445405187137 ] ] ] } },
{ "type": "Feature", "properties": { "area": 14.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.320146124747509, -29.570645511048539 ], [ 153.326721920535533, -29.617892143549035 ], [ 153.348291180744212, -29.615622805009636 ], [ 153.341715384956188, -29.568375109488429 ], [ 153.320146124747509, -29.570645511048539 ] ] ] } },
{ "type": "Feature", "properties": { "area": 32.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.995057588166247, -30.645326962844418 ], [ 152.978718199536701, -30.746980018879675 ], [ 153.000642826229068, -30.74958434257352 ], [ 153.016982214858587, -30.647934031146121 ], [ 152.995057588166247, -30.645326962844418 ] ] ] } },
{ "type": "Feature", "properties": { "area": 11.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.56574215412877, -28.321042880928342 ], [ 153.576646967460789, -28.322863201806179 ], [ 153.591499775291055, -28.253891177503917 ], [ 153.580594961959008, -28.252069676968144 ], [ 153.56574215412877, -28.321042880928342 ] ] ] } },
{ "type": "Feature", "properties": { "area": 3.8 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.137571081393247, -30.276322562400033 ], [ 153.133058608740953, -30.299554708416988 ], [ 153.144218632256468, -30.301170784397364 ], [ 153.148731104908762, -30.277939021166041 ], [ 153.137571081393247, -30.276322562400033 ] ] ] } },
{ "type": "Feature", "properties": { "area": 11.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.165656048374757, -35.957394920534476 ], [ 150.166246789564497, -35.907538102108255 ], [ 150.151055464443601, -35.907420052725122 ], [ 150.150464723253862, -35.957276945575231 ], [ 150.165656048374757, -35.957394920534476 ] ] ] } },
{ "type": "Feature", "properties": { "area": 10.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.549691773803886, -28.5355217517371 ], [ 153.562793840662067, -28.595908674150497 ], [ 153.574151275641327, -28.594008419731992 ], [ 153.561049208783146, -28.533620406644971 ], [ 153.549691773803886, -28.5355217517371 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.464585299834823, -28.114480730957347 ], [ 153.475531490307674, -28.129430619960267 ], [ 153.483446286638525, -28.124923201117159 ], [ 153.472500096165703, -28.109972683559892 ], [ 153.464585299834823, -28.114480730957347 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.126252219774756, -36.199558378437217 ], [ 150.128761030756976, -36.213816861646031 ], [ 150.138454184084168, -36.212706528380018 ], [ 150.135945373101947, -36.198447842874081 ], [ 150.126252219774756, -36.199558378437217 ] ] ] } },
{ "type": "Feature", "properties": { "area": 9.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.072165346981933, -36.372487726327627 ], [ 150.055711366780514, -36.41430073553596 ], [ 150.068902154975234, -36.417663179118982 ], [ 150.085356135176653, -36.375851979149459 ], [ 150.072165346981933, -36.372487726327627 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.14452858170128, -36.260574211554953 ], [ 150.122240618993203, -36.289157009337394 ], [ 150.131593727687289, -36.293896170263906 ], [ 150.153881690395366, -36.265315107975731 ], [ 150.14452858170128, -36.260574211554953 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 150.126885541351982, -36.085934320536161 ], [ 150.133363903413453, -36.098672202540037 ], [ 150.143262757164621, -36.095385020868513 ], [ 150.136784395103149, -36.082646606084559 ], [ 150.126885541351982, -36.085934320536161 ] ] ] } },
{ "type": "Feature", "properties": { "area": 2.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.252426757662533, -33.941576829965534 ], [ 151.252364256796739, -33.957664398960631 ], [ 151.263153831247166, -33.957693240728936 ], [ 151.263216332112961, -33.941605677186324 ], [ 151.252426757662533, -33.941576829965534 ] ] ] } },
{ "type": "Feature", "properties": { "area": 25.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.503024641329773, -33.351022779979488 ], [ 151.577393068494985, -33.290723412378014 ], [ 151.563366240797961, -33.278639156093327 ], [ 151.488997813632778, -33.33894688009422 ], [ 151.503024641329773, -33.351022779979488 ] ] ] } },
{ "type": "Feature", "properties": { "area": 18.7 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.60391978725491, -28.721882073666812 ], [ 153.585402173111561, -28.803430405663754 ], [ 153.601022913046734, -28.806155109203882 ], [ 153.619540527190111, -28.724608906808825 ], [ 153.60391978725491, -28.721882073666812 ] ] ] } },
{ "type": "Feature", "properties": { "area": 693.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.307709695430958, -24.988508915482228 ], [ 153.022000003224207, -25.785262616088694 ], [ 153.079420107856293, -25.802011102636783 ], [ 153.365129800063073, -25.005368329745604 ], [ 153.307709695430958, -24.988508915482228 ] ] ] } },
{ "type": "Feature", "properties": { "area": 6.3 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.833742791696693, -31.603121692047083 ], [ 152.827404978140862, -31.638465804471075 ], [ 152.839302590570952, -31.640012492853824 ], [ 152.845640404126783, -31.604668968002549 ], [ 152.833742791696693, -31.603121692047083 ] ] ] } },
{ "type": "Feature", "properties": { "area": 4.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.322909612995119, -33.577499042299621 ], [ 151.318960206919115, -33.60309760378393 ], [ 151.331637176589396, -33.604454571076012 ], [ 151.335586582665428, -33.578856412312042 ], [ 151.322909612995119, -33.577499042299621 ] ] ] } },
{ "type": "Feature", "properties": { "area": 28.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.219183428195493, -29.973573757379498 ], [ 153.183866690095726, -30.058950363870849 ], [ 153.204242952428842, -30.06526697414898 ], [ 153.239559690528637, -29.979895808333573 ], [ 153.219183428195493, -29.973573757379498 ] ] ] } },
{ "type": "Feature", "properties": { "area": 9.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.188130603465112, -30.081075068925884 ], [ 153.193103719409834, -30.116214438045755 ], [ 153.210750111957282, -30.114345419567229 ], [ 153.20577699601256, -30.079205385917756 ], [ 153.188130603465112, -30.081075068925884 ] ] ] } },
{ "type": "Feature", "properties": { "area": 77.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.053383035788812, -30.915952615761139 ], [ 153.022577374416386, -31.053280525921295 ], [ 153.060058801459775, -31.059455453740959 ], [ 153.090864462832201, -30.92213643824433 ], [ 153.053383035788812, -30.915952615761139 ] ] ] } },
{ "type": "Feature", "properties": { "area": 12.5 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.135066978901961, -30.304699731399385 ], [ 153.096923316797842, -30.353454590299886 ], [ 153.109226019626419, -30.360623329927343 ], [ 153.147369681730567, -30.311872041071432 ], [ 153.135066978901961, -30.304699731399385 ] ] ] } },
{ "type": "Feature", "properties": { "area": 148.9 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.559780278105933, -28.858905165374594 ], [ 153.404829827173927, -29.103121155816151 ], [ 153.437734656731266, -29.119077311325952 ], [ 153.592685107663272, -28.874899044667824 ], [ 153.559780278105933, -28.858905165374594 ] ] ] } },
{ "type": "Feature", "properties": { "area": 5.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 153.574814149386327, -28.35757451714586 ], [ 153.562477795765716, -28.389575377269619 ], [ 153.572700006881519, -28.392625610925453 ], [ 153.58503636050213, -28.360625671113336 ], [ 153.574814149386327, -28.35757451714586 ] ] ] } },
{ "type": "Feature", "properties": { "area": 33.8 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.599901815272347, -33.188108642431708 ], [ 151.542194492731795, -33.261615695321318 ], [ 151.563861760300512, -33.273513131011249 ], [ 151.621569082841063, -33.200016081658021 ], [ 151.599901815272347, -33.188108642431708 ] ] ] } },
{ "type": "Feature", "properties": { "area": 11.0 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.62486053843574, -33.139052104237216 ], [ 151.619195155562437, -33.17764141657679 ], [ 151.6381444023867, -33.179590690750238 ], [ 151.643809785260004, -33.141002236368706 ], [ 151.62486053843574, -33.139052104237216 ] ] ] } },
{ "type": "Feature", "properties": { "area": 61.4 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.647932321766802, -33.117158806658004 ], [ 151.735052086265171, -33.017016218993483 ], [ 151.707974250578189, -33.000461536667551 ], [ 151.62085448607985, -33.100622947984384 ], [ 151.647932321766802, -33.117158806658004 ] ] ] } },
{ "type": "Feature", "properties": { "area": 191.2 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.063097787308209, -32.754603421880958 ], [ 151.767874785630056, -32.886437323152997 ], [ 151.789535029810764, -32.920660742663429 ], [ 152.084758031488917, -32.788877690879936 ], [ 152.063097787308209, -32.754603421880958 ] ] ] } },
{ "type": "Feature", "properties": { "area": 67.1 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 152.243454296747956, -32.582770954550604 ], [ 152.162404811647633, -32.681872015149025 ], [ 152.19362026601479, -32.699965989790407 ], [ 152.274669751115113, -32.60088498479859 ], [ 152.243454296747956, -32.582770954550604 ] ] ] } },
{ "type": "Feature", "properties": { "area": 3.6 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 151.377071861239386, -33.523596463255998 ], [ 151.353052179498889, -33.529137911615919 ], [ 151.356141273190929, -33.538442265783701 ], [ 151.38016095493137, -33.53290141373239 ], [ 151.377071861239386, -33.523596463255998 ] ] ] } }
]
}
@ -1,817 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Shoreline position vs. wave energy\n",
"This notebook looks at the relationship between shoreline position and wave energy."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setup notebook"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import datetime\n",
"import pickle\n",
"import fiona\n",
"import shapely\n",
"import matplotlib.pyplot as plt\n",
"import pandas as pd\n",
"import geopandas\n",
"from scipy.stats import percentileofscore\n",
"from shapely.geometry import Point\n",
"import numpy as np\n",
"import requests\n",
"from bs4 import BeautifulSoup\n",
"import urllib.parse\n",
"import itertools\n",
"from tqdm import tqdm\n",
"import glob\n",
"from scipy.interpolate import griddata, SmoothBivariateSpline\n",
"from scipy.ndimage.filters import gaussian_filter\n",
"import colorcet as cc"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Shoreline positions"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Import Kilian's data\n",
"shorelines = pickle.load(open(\"14_timeseries_Australia_2.pkl\", \"rb\"))\n",
"beaches = fiona.open(\"14_beaches_Australia.geojson\")\n",
"polygons = fiona.open(\"14_polygons_Australia.geojson\")\n",
"transects = fiona.open(\"14_transects_Australia.geojson\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"code_folding": [
0
]
},
"outputs": [],
"source": [
"def df_from_csv(csv, index_col, data_folder='../data/interim'):\n",
" print('Importing {}'.format(csv))\n",
" return pd.read_csv(os.path.join(data_folder,csv), index_col=index_col)\n",
"\n",
"# Import Chris' data\n",
"df_sites = df_from_csv('sites.csv', index_col=[0])\n",
"df_obs_impacts = df_from_csv('impacts_observed.csv', index_col=[0])\n",
"df_waves = df_from_csv('waves.csv', index_col=[0,1])\n",
"df_waves.index = df_waves.index.set_levels([df_waves.index.levels[0], pd.to_datetime(df_waves.index.levels[1])])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Get coordinates of transects where Kilian has given shoreline data\n",
"transect_data = [x for x in transects if x['properties']['id'] in shorelines.keys()]\n",
"transect_dict = [{'name':x['properties']['name'],\n",
" 'id':x['properties']['id'],\n",
" 'orientation':x['properties']['orientation'],\n",
" 'start_coords': Point(x['geometry']['coordinates'][0][0], x['geometry']['coordinates'][0][1]),\n",
" 'end_coords': Point(x['geometry']['coordinates'][1][0], x['geometry']['coordinates'][1][1])} for x in transect_data]\n",
"df_transects = pd.DataFrame(transect_dict)\n",
"gdf_transects = geopandas.GeoDataFrame(df_transects, geometry='start_coords',crs={'init':'epsg:4326'})\n",
"\n",
"# Find closest Chris transect to each one of Kilian's transects\n",
"# First transform coords using geopandas\n",
"df_sites['coords'] = list(zip(df_sites.lon, df_sites.lat))\n",
"df_sites['coords'] = df_sites['coords'].apply(Point)\n",
"gdf_sites = geopandas.GeoDataFrame(df_sites, geometry='coords',crs={'init':'epsg:4326'})\n",
"gdf_sites['site_id'] = gdf_sites.index"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Find the nearest Chris transect for each of Kilian's transects\n",
"\n",
"from shapely.ops import nearest_points\n",
"\n",
"def nearest(row,\n",
" geom_union,\n",
" df1,\n",
" df2,\n",
" geom1_col='geometry',\n",
" geom2_col='geometry',\n",
" src_column=None):\n",
" \"\"\"Find the nearest point and return the corresponding value from specified column.\"\"\"\n",
" # Find the geometry that is closest\n",
" nearest = df2[geom2_col] == nearest_points(row[geom1_col], geom_union)[1]\n",
" # Get the corresponding value from df2 (matching is based on the geometry)\n",
"    # .get_values() was removed in pandas 1.0; use .to_numpy() instead\n    value = df2[nearest][src_column].to_numpy()[0]\n",
" return value\n",
"\n",
"unary_union = gdf_sites.unary_union\n",
"gdf_transects['chris_site_id'] = gdf_transects.apply(nearest,\n",
" geom_union=unary_union,\n",
" df1=gdf_transects,\n",
" df2=gdf_sites,\n",
" geom1_col='start_coords',\n",
" geom2_col='coords',\n",
" src_column='site_id',\n",
" axis=1)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Got the closest site_id; now check the distance. If the distance between these sites is too large, it's probably not a good match.\n",
"gdf_transects = gdf_transects.merge(gdf_sites[['coords']], left_on='chris_site_id', right_on='site_id')\n",
"gdf_transects = gdf_transects.rename({'coords': 'chris_coords'},axis='columns')\n",
"gdf_transects\n",
"distances = gdf_transects[['start_coords']].to_crs(epsg=28356).distance(\n",
" geopandas.GeoDataFrame(gdf_transects[['chris_coords']],\n",
" geometry='chris_coords',\n",
" crs={\n",
" 'init': 'epsg:4326'\n",
" }).to_crs(epsg=28356))\n",
"\n",
"gdf_transects['transect_to_chris_dist'] = distances"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Limit used transects to 300 m max distance\n",
"gdf_transects = gdf_transects[gdf_transects.transect_to_chris_dist < 300]\n",
"gdf_transects"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Calculate the change in shoreline percentile immediately after the storm\n",
"# Kilian's shorelines are given for z=0 MSL, so find change in shoreline due to storm\n",
"gdf_transects = gdf_transects.merge(df_obs_impacts.width_msl_change_m, left_on=['chris_site_id'], right_on=['site_id'])\n",
"gdf_transects"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# At each transect, calculate the shoreline percentile immediately before the storm\n",
"data = []\n",
"\n",
"for row in gdf_transects.iterrows():\n",
"\n",
" # Get shoreline records\n",
" id_shorelines = shorelines[row[1].id]['chainage']\n",
" id_dates = shorelines[row[1].id]['dates']\n",
"\n",
" # Find last date before June 2016 storm\n",
" dt_storm = datetime.datetime(2016, 6, 3)\n",
" dt_storm = dt_storm.replace(tzinfo=datetime.timezone.utc)\n",
" mask = pd.Series([x < dt_storm for x in id_dates])\n",
" i_last_obs = mask[::-1].idxmax()\n",
"\n",
" last_obs_ch = id_shorelines[i_last_obs]\n",
" last_obs_date = id_dates[i_last_obs]\n",
" post_storm_ch = last_obs_ch + row[1].width_msl_change_m\n",
"\n",
" prestorm_shoreline_pctile = percentileofscore(id_shorelines[~np.isnan(id_shorelines)], last_obs_ch)\n",
" poststorm_shoreline_pctile = percentileofscore(id_shorelines[~np.isnan(id_shorelines)],\n",
|
||||
" post_storm_ch)\n",
|
||||
" change_shoreline_pctile = poststorm_shoreline_pctile - prestorm_shoreline_pctile\n",
|
||||
"\n",
|
||||
" rel_change_shoreline_pctile = (poststorm_shoreline_pctile- prestorm_shoreline_pctile)/prestorm_shoreline_pctile *100\n",
|
||||
" \n",
|
||||
" # Calculate percentile of shoreline score\n",
|
||||
" data.append({\n",
|
||||
" 'prestorm_shoreline_pctile': prestorm_shoreline_pctile,\n",
|
||||
" 'poststorm_shoreline_pctile': poststorm_shoreline_pctile,\n",
|
||||
" 'change_shoreline_pctile': change_shoreline_pctile,\n",
|
||||
" 'rel_change_shoreline_pctile': rel_change_shoreline_pctile,\n",
|
||||
" 'index': row[0]\n",
|
||||
" })\n",
|
||||
"\n",
|
||||
"data = pd.DataFrame(data).set_index('index')\n",
|
||||
"gdf_transects = gdf_transects.join(data)\n",
|
||||
"gdf_transects"
|
||||
]
|
||||
},
|
||||
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Grab data from the NSW Nearshore Wave Transformation tool.\n",
"# Relate Kilian's site ids to the tool's nearshore site ids.\n",
"sites = [{\n",
" 'id': 'way4042355',\n",
" 'site_id': 'DEEWHYs0003',\n",
" 'nsw_nearshore_id': 1007832\n",
"}, {\n",
" 'id': 'way13858409',\n",
" 'site_id': 'DEEWHYn0003',\n",
" 'nsw_nearshore_id': 1007822,\n",
"}, {\n",
" 'id': 'way13858412',\n",
" 'site_id': 'MONA0011',\n",
" 'nsw_nearshore_id': 1007726,\n",
"}, {\n",
" 'id': 'way14040821',\n",
" 'site_id': 'NARRA0007',\n",
" 'nsw_nearshore_id': 1007760,\n",
"}, {\n",
" 'id': 'way14040977',\n",
" 'site_id': 'NARRA0018',\n",
" 'nsw_nearshore_id': 1007770,\n",
"}, {\n",
" 'id': 'way14041013',\n",
" 'site_id': 'NARRA0030',\n",
" 'nsw_nearshore_id': 1007778,\n",
"}, {\n",
" 'id': 'way25005079',\n",
" 'site_id': 'MACM0009',\n",
" 'nsw_nearshore_id': 1007354,\n",
"}, {\n",
" 'id': 'way54609773',\n",
" 'site_id': 'WAMBE0005',\n",
" 'nsw_nearshore_id': 1007264,\n",
"}, {\n",
" 'id': 'way54667480',\n",
" 'site_id': 'AVOCAn0005',\n",
" 'nsw_nearshore_id': 1007306,\n",
"}, {\n",
" 'id': 'way54669965',\n",
" 'site_id': 'AVOCAs0004',\n",
" 'nsw_nearshore_id': 1007312,\n",
"}, {\n",
" 'id': 'way134627391',\n",
" 'site_id': 'ONEMILE0007',\n",
" 'nsw_nearshore_id': 1005098,\n",
"}, {\n",
" 'id': 'way159040990',\n",
" 'site_id': 'LHOUSE0004',\n",
" 'nsw_nearshore_id': 1005448,\n",
"}, {\n",
" 'id': 'way173070325',\n",
" 'site_id': 'LHOUSEn0077',\n",
" 'nsw_nearshore_id': 1004186,\n",
"}, {\n",
" 'id': 'way182614828',\n",
" 'site_id': 'TREACH0009',\n",
" 'nsw_nearshore_id': 1005472,\n",
"}, {\n",
" 'id': 'way189407637',\n",
" 'site_id': 'NSHORE_n0063',\n",
" 'nsw_nearshore_id': 1003994,\n",
"}, {\n",
" 'id': 'way190929758',\n",
" 'site_id': 'CRESn0069',\n",
" 'nsw_nearshore_id': 1003708,\n",
"}, {\n",
" 'id': 'way222144734',\n",
" 'site_id': 'BLUEYS0002',\n",
" 'nsw_nearshore_id': 1005316,\n",
"}, {\n",
" 'id': 'way222145626',\n",
" 'site_id': 'BOOM0008',\n",
" 'nsw_nearshore_id': 1005298,\n",
"}, {\n",
" 'id': 'way224198013',\n",
" 'site_id': 'MANNING0048',\n",
" 'nsw_nearshore_id': 1004712,\n",
"}, {\n",
" 'id': 'way450323845',\n",
" 'site_id': 'NAMB0033',\n",
" 'nsw_nearshore_id': np.nan,\n",
"}, {\n",
" 'id': 'relation2303044',\n",
" 'site_id': 'ENTRA0041',\n",
" 'nsw_nearshore_id': 1007110,\n",
"}, {\n",
" 'id': 'relation2723197',\n",
" 'site_id': 'GRANTSn0022',\n",
" 'nsw_nearshore_id': 1004296,\n",
"}\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def nearshore_wave_csv_url(id, start_date, end_date):\n",
" URL = 'http://www.nswaves.com.au/transform.php'\n",
" payload = {\n",
" 'init': '1',\n",
" 'type': 'Transform-Full',\n",
" 'startsite': '{}'.format(id),\n",
" 'endsite': '{}'.format(id),\n",
" 'timestep': 'null',\n",
" 'startdate': start_date.strftime('%Y-%m-%d'),\n",
" 'starthour': '00',\n",
" 'enddate': end_date.strftime('%Y-%m-%d'),\n",
" 'endhour': '00',\n",
" 'sitestep': '1',\n",
" 'method': 'Parametric',\n",
" 'source': 'Waverider',\n",
" 'filename': 'ckl',\n",
" 'format': 'csv',\n",
" }\n",
"\n",
" session = requests.session()\n",
" r = session.post(URL, data=payload)\n",
"\n",
" soup = BeautifulSoup(r.text, 'html.parser')\n",
"\n",
" # Check if data extraction was successful\n",
" if soup.find_all(text=\"OK : Data Extraction Successful - Click filename/s to download data file\"):\n",
"\n",
" # Find all links\n",
" for link in soup.find_all('a'):\n",
"\n",
" href = link.get('href')\n",
" if '/data/full' not in href:\n",
" continue\n",
"\n",
" # Convert to an absolute url\n",
" csv_url = urllib.parse.urljoin(URL, href)\n",
"\n",
" return csv_url\n",
" else:\n",
" return None\n",
"\n",
"\n",
"def download_csv(url, file_path):\n",
" urllib.request.urlretrieve(url, file_path)\n",
" print('Downloaded {}'.format(file_path))\n",
"\n",
"\n",
"def daterange(start_date, end_date, delta):\n",
" while start_date < end_date:\n",
" yield start_date\n",
" start_date += delta\n",
"\n",
"\n",
"def download_nearshore_csv(id, site_id, nsw_nearshore_id, start_date, end_date, output_folder='./14_nearshore_waves/'):\n",
"\n",
" # Create the output folder if it doesn't already exist\n",
" os.makedirs(output_folder, exist_ok=True)\n",
"\n",
" # Output filename\n",
" output_filename = '{}_{}_{}_{}_{}.csv'.format(\n",
" id,\n",
" site_id,\n",
" nsw_nearshore_id,\n",
" start_date.strftime('%Y%m%d'),\n",
" end_date.strftime('%Y%m%d'),\n",
" )\n",
" output_filepath = os.path.join(output_folder, output_filename)\n",
"\n",
" # Don't download if the file already exists\n",
" if os.path.isfile(output_filepath):\n",
" return\n",
"\n",
" csv_url = nearshore_wave_csv_url(nsw_nearshore_id, start_date, end_date)\n",
"\n",
" if csv_url:\n",
" download_csv(csv_url, output_filepath)\n",
" else:\n",
" print('No url found')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# start_year = 2005\n",
"# end_year = 2015\n",
"# output_folder = './14_nearshore_waves/'\n",
"\n",
"# # Create list of start end dates we want to request\n",
"# date_ranges = [(datetime.datetime(x, 1, 1), datetime.datetime(x, 12, 31))\n",
"# for x in range(start_year, end_year + 1)]\n",
"\n",
"# inputs = list(itertools.product(sites, date_ranges))\n",
"\n",
"# for inpt in inputs:\n",
"# download_nearshore_csv(inpt[0]['id'], inpt[0]['site_id'], inpt[0]['nsw_nearshore_id'], inpt[1][0], inpt[1][1])\n",
"# break"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# # Use a queue to get data\n",
"\n",
"# from queue import Queue\n",
"# from threading import Thread\n",
"# q = Queue(maxsize=0)\n",
"# num_threads = 4\n",
"\n",
"# start_year = 2005\n",
"# end_year = 2015\n",
"# date_ranges = [(datetime.datetime(x, 1, 1), datetime.datetime(x, 12, 31))\n",
"# for x in range(start_year, end_year + 1)]\n",
"\n",
"# inputs = [x for x in list(itertools.product(sites, date_ranges))]\n",
"\n",
"# # Populate the queue with tasks\n",
"# results = [{} for x in inputs]\n",
"\n",
"# # Load up the queue with the urls to fetch and the index for each job (as a tuple)\n",
"# for i, inpt in enumerate(inputs):\n",
"# q.put((i, inpt))\n",
"\n",
"\n",
"# # Threaded function for queue processing\n",
"# def crawl(q, result):\n",
"# while not q.empty():\n",
"# work = q.get()  # fetch new work from the queue\n",
"# print(work)\n",
"# download_nearshore_csv(work[1][0]['id'], work[1][0]['site_id'],\n",
"# work[1][0]['nsw_nearshore_id'], work[1][1][0],\n",
"# work[1][1][1])\n",
"# # Signal to the queue that the task has been processed\n",
"# q.task_done()\n",
"# return True\n",
"\n",
"\n",
"# # Start worker threads on queue processing\n",
"# for i in range(num_threads):\n",
"# print('Starting thread {}'.format(i))\n",
"# worker = Thread(target=crawl, args=(q, results))\n",
"# worker.daemon = True  # daemon threads let the main program\n",
"# # exit eventually even if these don't finish correctly\n",
"# worker.start()\n",
"\n",
"# # Now we wait until the queue has been processed\n",
"# q.join()\n",
"# print('All tasks completed.')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# For each site, compare the June 2016 storm wave energy with historical nearshore wave conditions\n",
"for site in sites:\n",
"\n",
" # Get shoreline orientation\n",
" orientation = df_sites.loc[[site['site_id']]].orientation.iloc[0]\n",
"\n",
" # Get peak hourly wave energy from June 2016 storm\n",
" max_hrly_wave_power = df_waves.loc[[site['site_id']]].Pxs.max()\n",
"\n",
" # Load nearshore wave csv files into one dataframe\n",
" site_nearshore_wave_files = glob.glob('./14_nearshore_waves/*{}*'.format(\n",
" site['site_id']))\n",
"\n",
" if len(site_nearshore_wave_files) == 0:\n",
" continue\n",
"\n",
" df_hist_waves = pd.concat((pd.read_csv(f,\n",
" skiprows=8,\n",
" index_col=0,\n",
" names=['Hs', 'Tp', 'dir'],\n",
" na_values=' NaN')\n",
" for f in site_nearshore_wave_files))\n",
" df_hist_waves.index = pd.to_datetime(df_hist_waves.index)\n",
"\n",
" # At each row, calculate the cross-shore component of nearshore wave energy\n",
" df_hist_waves['d'] = 10\n",
" df_hist_waves['L'] = 9.81 * df_hist_waves.Tp**2 / 2 / np.pi\n",
" df_hist_waves['n'] = 0.5 * (\n",
" 1 + (4 * np.pi * df_hist_waves.d / df_hist_waves.L) /\n",
" (np.sinh(4 * np.pi * df_hist_waves.d / df_hist_waves.L)))\n",
" df_hist_waves['E'] = 1 / 16 * 1025 * 9.81 * df_hist_waves.Hs**2\n",
" df_hist_waves['C'] = 9.81 * df_hist_waves.Tp / 2 / np.pi * np.tanh(\n",
" 2 * np.pi * df_hist_waves.d / df_hist_waves.L)\n",
" df_hist_waves['shoreline_tn_angle'] = 270 - orientation\n",
" df_hist_waves.loc[\n",
" df_hist_waves.shoreline_tn_angle > 360,\n",
" 'shoreline_tn_angle'] = df_hist_waves.shoreline_tn_angle - 360\n",
" df_hist_waves[\n",
" 'alpha'] = df_hist_waves.shoreline_tn_angle - df_hist_waves.dir\n",
" df_hist_waves[\n",
" 'Px'] = df_hist_waves.n * df_hist_waves.E * df_hist_waves.C * np.cos(\n",
" np.deg2rad(df_hist_waves.alpha))\n",
"\n",
" # Apply percentileofscore for June 2016 wave energy\n",
" storm_Px_hrly_pctile = percentileofscore(df_hist_waves.Px.dropna().values,\n",
" max_hrly_wave_power,\n",
" kind='mean')\n",
"\n",
" # Calculate cumulative wave energy from the storm\n",
" idx = ((df_waves.index.get_level_values('datetime') > '2016-06-04') &\n",
" (df_waves.index.get_level_values('datetime') < '2016-06-07') &\n",
" (df_waves.index.get_level_values('site_id') == site['site_id']))\n",
" hrs = len(df_waves[idx])\n",
" Pxscum_storm = df_waves[idx].Pxs.sum()\n",
"\n",
" # Calculate cumulative wave energy of mean wave conditions over the length of the storm\n",
" Pxscum_mean = df_hist_waves['Px'].mean() * hrs\n",
" Pxscum_storm_mean_ratio = Pxscum_storm / Pxscum_mean\n",
"\n",
" # Add to gdf_transects dataframe\n",
" idx = gdf_transects[gdf_transects.chris_site_id == site['site_id']].index\n",
" gdf_transects.loc[idx, 'storm_Px_hrly_pctile'] = storm_Px_hrly_pctile\n",
" gdf_transects.loc[idx, 'Pxscum_storm'] = Pxscum_storm\n",
" gdf_transects.loc[idx, 'Pxscum_mean'] = Pxscum_mean\n",
" gdf_transects.loc[idx, 'Pxscum_storm_mean_ratio'] = Pxscum_storm_mean_ratio\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"gdf_transects.sort_values(by='Pxscum_storm_mean_ratio', ascending=False).head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"gdf_transects.sort_values(by='rel_change_shoreline_pctile').head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Drop nans\n",
"gdf_transects = gdf_transects.dropna(axis='index',\n",
" subset=[\n",
" 'Pxscum_storm_mean_ratio',\n",
" 'prestorm_shoreline_pctile',\n",
" 'change_shoreline_pctile',\n",
" 'rel_change_shoreline_pctile'\n",
" ],\n",
" how='any')\n",
"\n",
"# Grid results\n",
"grid_x, grid_y = np.mgrid[0:2:100j, 0:100:100j]\n",
"\n",
"x_vals = gdf_transects.Pxscum_storm_mean_ratio.values\n",
"y_vals = gdf_transects.prestorm_shoreline_pctile.values\n",
"z_vals = gdf_transects.rel_change_shoreline_pctile.values\n",
"\n",
"grid = griddata((x_vals, y_vals), z_vals, (grid_x, grid_y), method='cubic')\n",
"\n",
"# Smooth data\n",
"# https://stackoverflow.com/a/34370291\n",
"# grid = gaussian_filter(grid, sigma=0.5)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def round_down(num, divisor):\n",
" return num - (num % divisor)\n",
"\n",
"def round_up(x, divisor):\n",
" return (x + divisor - 1) // divisor * divisor"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"gdf_transects[gdf_transects.prestorm_shoreline_pctile < 40].sort_values(by='change_shoreline_pctile', ascending=True).head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Plot storm/mean cumulative wave energy ratio vs prestorm shoreline percentile vs change in shoreline percentile\n",
"\n",
"x_col = 'Pxscum_storm_mean_ratio'\n",
"y_col = 'prestorm_shoreline_pctile'\n",
"# z_col = 'rel_change_shoreline_pctile'\n",
"z_col = 'change_shoreline_pctile'\n",
"\n",
"# Drop nans\n",
"gdf_transects = gdf_transects.dropna(axis='index',\n",
" subset=[x_col, y_col, z_col],\n",
" how='any')\n",
"\n",
"# Grid results\n",
"grid_x, grid_y = np.mgrid[0:25:100j, 0:100:100j]\n",
"\n",
"x_vals = gdf_transects[x_col].values\n",
"y_vals = gdf_transects[y_col].values\n",
"z_vals = gdf_transects[z_col].values\n",
"\n",
"grid = griddata((x_vals, y_vals), z_vals, (grid_x, grid_y), method='linear', rescale=True)\n",
"\n",
"# Smooth data\n",
"# https://stackoverflow.com/a/34370291\n",
"# grid = gaussian_filter(grid, sigma=0.5)\n",
"\n",
"\n",
"# # 2D Spline interpolation\n",
"# s = SmoothBivariateSpline(x_vals, y_vals, z_vals)\n",
"# spline_x = np.arange(1, 25, 0.1)\n",
"# spline_y = np.arange(0, 100, 0.5)\n",
"# spline_z = s(spline_x, spline_y, grid=True)\n",
"# spline_grid_x, spline_grid_y = np.meshgrid(spline_x, spline_y)\n",
"\n",
"\n",
"# Create figure\n",
"fig = plt.figure(figsize=(3, 3), dpi=150, facecolor='w', edgecolor='k')\n",
"ax = fig.add_subplot(111)\n",
"\n",
"# Define colors\n",
"cmap_interval = 25\n",
"cmap = cc.cm.fire\n",
"vmin = round_down(np.min(z_vals), cmap_interval)\n",
"vmax = round_up(np.max(z_vals), cmap_interval)\n",
"levels = [x * cmap_interval for x in range(-4, 2)]\n",
"\n",
"\n",
"# Plot SPLINE grid surface\n",
"# cf = ax.contourf(spline_grid_x, spline_grid_y, spline_z.T, levels=levels, cmap=cmap, vmin=vmin, vmax=vmax)\n",
"\n",
"# Plot SPLINE contours\n",
"# cs = plt.contour(grid_x, grid_y, grid, levels=levels, linewidths=0.5, colors='white', vmin=vmin, vmax=vmax)\n",
"# ax.clabel(cs, inline=1, fontsize=4, fmt='%1.0f%%')\n",
"\n",
"\n",
"# Plot interpolated grid surface\n",
"cf = plt.contourf(grid_x, grid_y, grid, levels=levels, cmap=cmap, vmin=vmin, vmax=vmax)\n",
"\n",
"# Plot contours of the interpolated grid\n",
"cs = plt.contour(grid_x, grid_y, grid, levels=levels, linewidths=0.5, colors='white', vmin=vmin, vmax=vmax)\n",
"ax.clabel(cs, inline=1, fontsize=4, fmt='%1.0f%%')\n",
"\n",
"\n",
"scatter = ax.scatter(\n",
" x=x_vals,\n",
" y=y_vals,\n",
" c=z_vals,\n",
" s=1,\n",
" cmap=cmap, vmin=vmin, vmax=vmax\n",
")\n",
"\n",
"ax.set_xlim([1, 25])\n",
"\n",
"ax.set_xlabel(x_col)\n",
"ax.set_ylabel(y_col)\n",
"\n",
"cbar = plt.colorbar(cf)\n",
"cbar.set_label(z_col)\n",
"\n",
"ax.grid(True, linestyle=\"--\", alpha=0.2, color='grey', linewidth=1)\n",
"\n",
"plt.show()\n",
"\n",
"fig.savefig('14_beach_state_vs_wave_energy_{}'.format(z_col), dpi=600, bbox_inches=\"tight\", pad_inches=0.01)"
]
}
],
"metadata": {
"hide_input": false,
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.7"
},
"toc": {
"base_numbering": 1,
"nav_menu": {},
"number_sections": true,
"sideBar": true,
"skip_h1_title": false,
"title_cell": "Table of Contents",
"title_sidebar": "Contents",
"toc_cell": false,
"toc_position": {
"height": "calc(100% - 180px)",
"left": "10px",
"top": "150px",
"width": "297.797px"
},
"toc_section_display": true,
"toc_window_display": true
},
"varInspector": {
"cols": {
"lenName": 16,
"lenType": 16,
"lenVar": 40
},
"kernels_config": {
"python": {
"delete_cmd_postfix": "",
"delete_cmd_prefix": "del ",
"library": "var_list.py",
"varRefreshCmd": "print(var_dic_list())"
},
"r": {
"delete_cmd_postfix": ") ",
"delete_cmd_prefix": "rm(",
"library": "var_list.r",
"varRefreshCmd": "cat(var_dic_list()) "
}
},
"types_to_exclude": [
"module",
"function",
"builtin_function_or_method",
"instance",
"_Feature"
],
"window_display": false
}
},
"nbformat": 4,
"nbformat_minor": 2
}
@ -1,176 +0,0 @@
|
||||
{
|
||||
"type": "FeatureCollection",
|
||||
"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
|
||||
"features": [
|
||||
{ "type": "Feature", "properties": { "name": "Dee Why Beach", "id": "way4042355", "orientation": 117.8, "beach_length": 1367.6787658900539 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.297426209301022, -33.749879728713736 ], [ 151.303385953876585, -33.752492350405944 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way5757445", "orientation": 143.1, "beach_length": 7266.5228394129863 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.688273909621756, -35.014570881874867 ], [ 150.692319159611543, -35.018983377256809 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Currarong Beach", "id": "way5953437", "orientation": 11.8, "beach_length": 1164.4552080889939 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.816619637270946, -35.016318170650749 ], [ 150.817997401737813, -35.010916773154712 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way5953443", "orientation": 69.5, "beach_length": 4390.1973527327182 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.764949528081843, -34.922137344927066 ], [ 150.771260230160465, -34.920202716150072 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Seven Mile Beach", "id": "way5954873", "orientation": 112.2, "beach_length": 20574.535884866931 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.75586860673252, -34.827258822743808 ], [ 150.762106534463413, -34.829348464229902 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way7451332", "orientation": 129.4, "beach_length": 1603.4355113199163 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.729218741295767, -32.972437522611465 ], [ 151.734424929142961, -32.976025069724002 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Nobbys Beach", "id": "way7487200", "orientation": 133.2, "beach_length": 1421.6256822067976 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.794836683328242, -32.920272132548867 ], [ 151.799748010775659, -32.924143524568301 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Pebbly Beach", "id": "way12591715", "orientation": 142.5, "beach_length": 1009.719676912916 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.326250780857038, -35.608170015695841 ], [ 150.330352228577539, -35.612515567552236 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Long Reef Beach", "id": "way13858409", "orientation": 155.6, "beach_length": 1471.3219055229181 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.308732365126104, -33.742496010220343 ], [ 151.311515600300481, -33.747597876386187 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Mona Vale Beach", "id": "way13858412", "orientation": 116.4, "beach_length": 1338.8494987369897 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.310415712181566, -33.682410890665786 ], [ 151.31645044891448, -33.684903624108799 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "North Narrabeen Beach", "id": "way14040821", "orientation": 116.3, "beach_length": 1494.2626149369187 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.302693891096908, -33.70727706277372 ], [ 151.308733847065042, -33.709760309505015 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Narrabeen Beach", "id": "way14040977", "orientation": 100.8, "beach_length": 1373.2055690473667 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.298426776423128, -33.717344730525362 ], [ 151.305044803803554, -33.718394817682203 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Collaroy Beach", "id": "way14041013", "orientation": 71.2, "beach_length": 1661.418374661537 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.299027248983037, -33.728910506530326 ], [ 151.30540517022601, -33.727104738954885 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Newport Beach", "id": "way14041635", "orientation": 95.4, "beach_length": 1427.2517323683933 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.321643394091922, -33.653983190352847 ], [ 151.328350858060105, -33.654510964979941 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Garie Beach", "id": "way23140813", "orientation": 143.0, "beach_length": 1288.4252068564786 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.06759399349653, -34.169448561443176 ], [ 151.071648640747895, -34.173900327402919 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way25005079", "orientation": 129.1, "beach_length": 2025.6498496395332 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.426597886628542, -33.493461222834767 ], [ 151.431826394243473, -33.497004676254477 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Fingal Beach", "id": "way28969442", "orientation": 126.4, "beach_length": 1780.5075815647056 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.170819933303505, -32.742104345084726 ], [ 152.176242796305473, -32.745467120413899 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way30978532", "orientation": 91.0, "beach_length": 27866.970405937809 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.416889214109915, -27.219547910447488 ], [ 153.423625552607518, -27.219652472501451 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way30978533", "orientation": 106.5, "beach_length": 9949.6233025637666 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.449802428748541, -27.074975863105792 ], [ 153.45626234691764, -27.076679666207621 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Main Beach", "id": "way31000898", "orientation": 107.1, "beach_length": 31524.504786492536 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.489726460582915, -27.551922865831692 ], [ 153.496165986635276, -27.553679241015541 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Werri Beach", "id": "way37890458", "orientation": 109.9, "beach_length": 2529.5719540259611 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.833054662544583, -34.73513954960935 ], [ 150.839389726514355, -34.737024118079752 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Murramarang Beach", "id": "way38144058", "orientation": 66.5, "beach_length": 1274.9971066031389 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.400692945942467, -35.523329416877623 ], [ 150.406871514052028, -35.521142886604814 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way38145816", "orientation": 150.6, "beach_length": 3574.9449970357891 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.572125558604938, -35.19167332543941 ], [ 150.575432956191719, -35.1964700587423 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Curl Curl Beach", "id": "way41740993", "orientation": 127.2, "beach_length": 1438.5285128735152 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.294465322293036, -33.768981301790767 ], [ 151.299831834790211, -33.772367397196469 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Terrigal Beach", "id": "way54609773", "orientation": 124.5, "beach_length": 4017.1932249514589 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.447616400405025, -33.428842988601993 ], [ 151.453168839039648, -33.432027722974048 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Shelly Beach", "id": "way54609778", "orientation": 121.9, "beach_length": 3050.9077716642514 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.488212799687375, -33.370587687036029 ], [ 151.493932631508244, -33.373560935520118 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "North Avoca Beach", "id": "way54667480", "orientation": 123.9, "beach_length": 1035.2939738578305 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.437043809557593, -33.459692013650027 ], [ 151.442635904970388, -33.462826934868581 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Avoca Beach", "id": "way54669965", "orientation": 83.6, "beach_length": 1104.9091210534039 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.433667144832867, -33.467661343336587 ], [ 151.440362521662735, -33.467034853259669 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way54680278", "orientation": 129.4, "beach_length": 3136.0127011669574 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.119879943007049, -32.769000081845114 ], [ 152.125086130854243, -32.772595870258733 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way55486822", "orientation": 132.7, "beach_length": 1104.4302797352184 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.988772086251771, -34.229364098107446 ], [ 150.993723473851873, -34.233141635265319 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "way55889024", "orientation": 81.5, "beach_length": 1232.6737616867676 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.872006675532447, -34.58628581273998 ], [ 150.878670036029661, -34.585465956389292 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Long Beach", "id": "way77993576", "orientation": 162.4, "beach_length": 2803.7250094349192 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.223220089980771, -35.702454079182424 ], [ 150.22525726618818, -35.707668942903908 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Bherwerre Beach", "id": "way78271876", "orientation": 160.9, "beach_length": 8449.8285415778719 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.625636911751968, -35.166840954141129 ], [ 150.62784149805114, -35.172045235294341 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Towradgi Beach", "id": "way78388644", "orientation": 113.9, "beach_length": 3490.7143043811325 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.904518090155705, -34.395944048215512 ], [ 150.910677752417371, -34.39819634581459 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Greenhills Beach", "id": "way86379306", "orientation": 154.9, "beach_length": 1429.709869940998 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.171313877866794, -34.036110676981444 ], [ 151.174171864054045, -34.041166455329495 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Bombo Beach", "id": "way87849653", "orientation": 102.7, "beach_length": 1687.8658370549799 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.85306294036414, -34.65870796445229 ], [ 150.859635472296731, -34.659926309498161 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Jones beach", "id": "way87849657", "orientation": 100.9, "beach_length": 1267.3743657941834 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.85355202581303, -34.639271225925022 ], [ 150.860167839712972, -34.640319403010693 ] ] } },
{ "type": "Feature", "properties": { "name": "Manyana Beach", "id": "way93288569", "orientation": 126.6, "beach_length": 1265.8364782053784 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.514262690688838, -35.261694144827743 ], [ 150.519671564751263, -35.264974041569936 ] ] } },
{ "type": "Feature", "properties": { "name": "Inyadda Beach", "id": "way93288573", "orientation": 145.1, "beach_length": 1942.857515375455 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.53092366649139, -35.250734160332513 ], [ 150.534778421862882, -35.255246479286811 ] ] } },
{ "type": "Feature", "properties": { "name": "Berrara Beach", "id": "way93288644", "orientation": 131.0, "beach_length": 1579.9830572842293 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.541223871887212, -35.213278350672283 ], [ 150.5463086255196, -35.216889549211594 ] ] } },
{ "type": "Feature", "properties": { "name": "Mystics", "id": "way95039342", "orientation": 71.6, "beach_length": 1991.0695681795767 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.854936238195023, -34.621731961966979 ], [ 150.861329161875005, -34.619981884762197 ] ] } },
{ "type": "Feature", "properties": { "name": "mollymook Sea beach", "id": "way95783344", "orientation": 106.3, "beach_length": 3036.5051584484122 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.473685638356073, -35.331300579853242 ], [ 150.480152196582765, -35.332843246565965 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way95783345", "orientation": 99.7, "beach_length": 1927.0868091013078 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.470428905438524, -35.310528010692501 ], [ 150.477069949127696, -35.311454343327576 ] ] } },
{ "type": "Feature", "properties": { "name": "Conjola Beach", "id": "way95783351", "orientation": 137.4, "beach_length": 4306.8063800566279 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.491096091453329, -35.277269304505388 ], [ 150.495656451670982, -35.281317855539413 ] ] } },
{ "type": "Feature", "properties": { "name": "Monument Beach", "id": "way95783352", "orientation": 119.2, "beach_length": 1072.0866326632313 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.530556492244102, -35.227820112586109 ], [ 150.536437686573038, -35.230505011981634 ] ] } },
{ "type": "Feature", "properties": { "name": "Burrill Beach", "id": "way95783353", "orientation": 133.6, "beach_length": 1286.446905199022 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.452706601041683, -35.387695041636611 ], [ 150.457585610927623, -35.391482792332773 ] ] } },
{ "type": "Feature", "properties": { "name": "Racecourse Beach", "id": "way95783354", "orientation": 128.8, "beach_length": 1596.7646423799629 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.461154139235646, -35.379007442639015 ], [ 150.466404823276093, -35.382449456071562 ] ] } },
{ "type": "Feature", "properties": { "name": "Turmeil Beach", "id": "way95783355", "orientation": 102.1, "beach_length": 1955.8102034852254 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.389695625072335, -35.468052014337133 ], [ 150.396283307268362, -35.469202219596916 ] ] } },
{ "type": "Feature", "properties": { "name": "Wairo Beach", "id": "way95783357", "orientation": 126.5, "beach_length": 6076.6508842274216 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.419104180875422, -35.42288519949318 ], [ 150.424520057656451, -35.426150861400252 ] ] } },
{ "type": "Feature", "properties": { "name": "Buckley's Beach", "id": "way95783362", "orientation": 108.6, "beach_length": 1639.2461082610259 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.474209662923386, -35.296434667863565 ], [ 150.480595124287277, -35.29818856105905 ] ] } },
{ "type": "Feature", "properties": { "name": "Woonona Beach", "id": "way95786684", "orientation": 101.2, "beach_length": 1533.3014801414317 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.917936878559942, -34.352650552061306 ], [ 150.92454593112808, -34.353730922053344 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way95786688", "orientation": 128.4, "beach_length": 1777.0098733519792 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.917913801592391, -34.378596883231367 ], [ 150.923193830173261, -34.382050711696237 ] ] } },
{ "type": "Feature", "properties": { "name": "Thirroul Beach", "id": "way95786690", "orientation": 121.3, "beach_length": 1146.3923780647244 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.927324604641882, -34.314707974364282 ], [ 150.933081405342563, -34.317598918887732 ] ] } },
{ "type": "Feature", "properties": { "name": "Bulli Beach", "id": "way95786693", "orientation": 110.0, "beach_length": 1137.4537692982917 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.921885521804398, -34.342371884249602 ], [ 150.928216573631602, -34.344274492046054 ] ] } },
{ "type": "Feature", "properties": { "name": "McCauleys Beach", "id": "way95786716", "orientation": 90.3, "beach_length": 1195.787102761587 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.922697670004027, -34.325167095777559 ], [ 150.929434942280807, -34.325196228974804 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way100421203", "orientation": 105.8, "beach_length": 3372.8094924808279 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.900497590727099, -34.433679306094753 ], [ 150.906980404203892, -34.435192313457918 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way104748689", "orientation": 121.2, "beach_length": 8172.9490380456255 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.88336709945861, -34.513386224405068 ], [ 150.889130000371551, -34.516262026022645 ] ] } },
{ "type": "Feature", "properties": { "name": "Racecourse Beach", "id": "way109973381", "orientation": 124.5, "beach_length": 1591.3190450316545 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.390997637206112, -35.535433026785185 ], [ 150.396550075840736, -35.538538329953681 ] ] } },
{ "type": "Feature", "properties": { "name": "Nine Mile Beach", "id": "way133342568", "orientation": 109.2, "beach_length": 14293.895488916563 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.509906010523565, -32.120017022573229 ], [ 152.516268618478648, -32.121893555214456 ] ] } },
{ "type": "Feature", "properties": { "name": "One Mile Beach", "id": "way134627391", "orientation": 83.5, "beach_length": 1587.5752698474448 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.534435943097009, -32.19040419356476 ], [ 152.541129998975663, -32.189758739028534 ] ] } },
{ "type": "Feature", "properties": { "name": "Lighthouse Beach", "id": "way159040990", "orientation": 160.3, "beach_length": 2188.4790477193133 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.52755624748616, -32.44265404793714 ], [ 152.529827381157503, -32.448006954924764 ] ] } },
{ "type": "Feature", "properties": { "name": "Forrester's beach", "id": "way159095198", "orientation": 136.5, "beach_length": 1873.0872243549609 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.46551114701694, -33.410090887477729 ], [ 151.470148842788717, -33.414170312508581 ] ] } },
{ "type": "Feature", "properties": { "name": "Lighthouse Beach", "id": "way173070325", "orientation": 127.9, "beach_length": 18557.898178721829 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.878255590261972, -31.52595483907065 ], [ 152.883571937465945, -31.529482583127681 ] ] } },
{ "type": "Feature", "properties": { "name": "Bondi Beach", "id": "way173244595", "orientation": 155.5, "beach_length": 1157.8171737499761 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.276852978827577, -33.890229978666355 ], [ 151.279646918413306, -33.895319000087184 ] ] } },
{ "type": "Feature", "properties": { "name": "Treachery Beach", "id": "way182614828", "orientation": 161.0, "beach_length": 2256.0227991927118 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.508033818036949, -32.452672005068422 ], [ 152.510227289405776, -32.458047329581028 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way182621183", "orientation": 49.4, "beach_length": 1407.2757678250591 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.518988562089731, -32.430655695383784 ], [ 152.524104049741027, -32.426954918488342 ] ] } },
{ "type": "Feature", "properties": { "name": "Gap Beach", "id": "way189322079", "orientation": 88.1, "beach_length": 1081.631611844778 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.079203669747642, -30.903108127704439 ], [ 153.08593733028286, -30.902916460461707 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way189326452", "orientation": 85.5, "beach_length": 4082.4382816626189 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.962444578523389, -31.300519519577751 ], [ 152.969161174107597, -31.300067847608144 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way189326453", "orientation": 104.9, "beach_length": 7522.1080741998449 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.961455195228041, -31.221414334816714 ], [ 152.967966023244998, -31.222895818497459 ] ] } },
{ "type": "Feature", "properties": { "name": "North Shore Beach", "id": "way189407637", "orientation": 115.3, "beach_length": 14990.004608269586 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.930721702813173, -31.36849654051866 ], [ 152.936812836606663, -31.370954932399229 ] ] } },
{ "type": "Feature", "properties": { "name": "minnie Water Back Beach", "id": "way189511968", "orientation": 108.7, "beach_length": 3400.9642340079649 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.287920046443048, -29.793801709778041 ], [ 153.294301747466392, -29.795676256889305 ] ] } },
{ "type": "Feature", "properties": { "name": "Hyland Park Beach", "id": "way189940675", "orientation": 87.4, "beach_length": 3513.1309001894724 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.008425950330434, -30.613595802509401 ], [ 153.015156379314647, -30.613332773080192 ] ] } },
{ "type": "Feature", "properties": { "name": "Killick Beach", "id": "way190929758", "orientation": 121.5, "beach_length": 16792.206309969777 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.000639688093969, -31.128367126991147 ], [ 153.006384235780189, -31.131380464144375 ] ] } },
{ "type": "Feature", "properties": { "name": "Tallow Beach", "id": "way191324197", "orientation": 108.8, "beach_length": 7471.6596449413955 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.620095345273228, -28.668157592473477 ], [ 153.626473266516172, -28.670062632938784 ] ] } },
{ "type": "Feature", "properties": { "name": "Fiona Beach", "id": "way192040645", "orientation": 156.8, "beach_length": 10844.393389875697 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.449758896197949, -32.468983496034681 ], [ 152.452413026486255, -32.474207888506783 ] ] } },
{ "type": "Feature", "properties": { "name": "Black Head Beach", "id": "way192060355", "orientation": 99.3, "beach_length": 1728.0323893188813 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.541791477197393, -32.064615722434894 ], [ 152.548440283996911, -32.065538407601309 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way197880327", "orientation": 86.2, "beach_length": 1180.6458185008732 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.052890320666677, -36.492283185563338 ], [ 150.059612872987799, -36.49192421768501 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way197880338", "orientation": 57.5, "beach_length": 1375.8986285786966 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.066808158511378, -36.421051519025809 ], [ 150.072490394208387, -36.418138551698206 ] ] } },
{ "type": "Feature", "properties": { "name": "Baragoot Beach", "id": "way197880340", "orientation": 112.6, "beach_length": 3798.3301609793943 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.058344540531692, -36.472792043473511 ], [ 150.064564544395353, -36.474874042724238 ] ] } },
{ "type": "Feature", "properties": { "name": "Blueys Beach", "id": "way222144734", "orientation": 126.8, "beach_length": 1164.8691470618292 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.53569772120116, -32.351457297001396 ], [ 152.541092540418646, -32.354866636424404 ] ] } },
{ "type": "Feature", "properties": { "name": "Boomerang Beach", "id": "way222145626", "orientation": 119.6, "beach_length": 1694.4202609719857 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.541804963908675, -32.339767659869565 ], [ 152.547663068293502, -32.342579299773398 ] ] } },
{ "type": "Feature", "properties": { "name": "Sandbar Beach", "id": "way222267320", "orientation": 114.5, "beach_length": 3451.260473267856 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.520034124508214, -32.388714057241145 ], [ 152.526164865390115, -32.391073322483898 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way222390016", "orientation": 110.7, "beach_length": 11554.002303613363 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.528985590011047, -32.268921293411289 ], [ 152.535288017538591, -32.270934942720324 ] ] } },
{ "type": "Feature", "properties": { "name": "Old Bar Beach", "id": "way224198013", "orientation": 135.6, "beach_length": 13321.86971687392 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.645502295395801, -31.909877497319457 ], [ 152.650216182439721, -31.91396363156792 ] ] } },
{ "type": "Feature", "properties": { "name": "Pippi Beach", "id": "way226380327", "orientation": 114.9, "beach_length": 2093.2915651251469 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.363939285464653, -29.444978137816115 ], [ 153.370050371725199, -29.447448361479612 ] ] } },
{ "type": "Feature", "properties": { "name": "Nelsons Beach", "id": "way230385239", "orientation": 116.6, "beach_length": 1293.1567245991507 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.692601124420293, -35.074497160024031 ], [ 150.698625367550136, -35.076966020068063 ] ] } },
{ "type": "Feature", "properties": { "name": "Wanda Beach", "id": "way253258718", "orientation": 132.4, "beach_length": 1060.2608924073777 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.162102441399753, -34.041487964454262 ], [ 151.16707768429319, -34.045252375542432 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way303942662", "orientation": 99.5, "beach_length": 2762.1546538324255 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.028039902935916, -30.512880275794156 ], [ 153.034684868663675, -30.513838263655757 ] ] } },
{ "type": "Feature", "properties": { "name": "Belongil Beach", "id": "way306312031", "orientation": 52.2, "beach_length": 8141.3112283828377 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.584817232086579, -28.619517676145943 ], [ 153.590140794519868, -28.615892763297513 ] ] } },
{ "type": "Feature", "properties": { "name": "Rosedale Beach", "id": "way307144510", "orientation": 82.5, "beach_length": 1165.3429096205248 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.22424175613682, -35.814033256814547 ], [ 150.230921481679331, -35.81332012803567 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way307145569", "orientation": 124.0, "beach_length": 1161.7524386962018 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.184625773494815, -35.832802090307361 ], [ 150.19021130191382, -35.835856441300486 ] ] } },
{ "type": "Feature", "properties": { "name": "North Broulee Beach", "id": "way307146328", "orientation": 78.5, "beach_length": 2586.1859082861752 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.17610432066931, -35.851149084440259 ], [ 150.182706430715172, -35.850060346173386 ] ] } },
{ "type": "Feature", "properties": { "name": "Bengello Beach", "id": "way307149092", "orientation": 114.5, "beach_length": 7504.7002482403259 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.153612670538081, -35.87796575013369 ], [ 150.159743411419981, -35.88022955506014 ] ] } },
{ "type": "Feature", "properties": { "name": "Murrays Beach", "id": "way310196798", "orientation": 117.1, "beach_length": 1103.5523309805026 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.104568382584773, -30.36079172594734 ], [ 153.110566070848535, -30.363439955215096 ] ] } },
{ "type": "Feature", "properties": { "name": "Sawtell Main Beach", "id": "way310196987", "orientation": 89.3, "beach_length": 1489.8880110251912 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.10139610757065, -30.370640708343696 ], [ 153.108132969389715, -30.370569693148227 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way317885375", "orientation": 108.5, "beach_length": 2691.2574178703148 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.867638260544226, -34.550898108893534 ], [ 150.874027462797471, -34.552658828545852 ] ] } },
{ "type": "Feature", "properties": { "name": "Wonboyn Beach", "id": "way331942614", "orientation": 121.9, "beach_length": 5220.5938607999415 }, "geometry": { "type": "LineString", "coordinates": [ [ 149.949107620675818, -37.264548086321227 ], [ 149.954827452496687, -37.26738147714029 ] ] } },
{ "type": "Feature", "properties": { "name": "Mungo Beach", "id": "way345715960", "orientation": 135.7, "beach_length": 20333.423704402718 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.330376531181741, -32.528990937624144 ], [ 152.335082009623477, -32.533056269003325 ] ] } },
{ "type": "Feature", "properties": { "name": "Maggies Beach", "id": "way363247956", "orientation": 89.7, "beach_length": 3326.2603702300262 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.572355514757646, -28.348615503975967 ], [ 153.579092787034426, -28.348584457927902 ] ] } },
{ "type": "Feature", "properties": { "name": "Durras Beach", "id": "way369080492", "orientation": 166.0, "beach_length": 1984.6596123443278 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.313479270156165, -35.633341813967576 ], [ 150.315109186179001, -35.638654853629106 ] ] } },
{ "type": "Feature", "properties": { "name": "Durras Beach", "id": "way369080493", "orientation": 114.4, "beach_length": 2943.0947142531068 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.296985945447489, -35.647239147736421 ], [ 150.30312155333371, -35.649500829782816 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way369096364", "orientation": 146.4, "beach_length": 1141.6980793948974 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.194808950406383, -35.701536213195205 ], [ 150.198537351057297, -35.706093159502608 ] ] } },
{ "type": "Feature", "properties": { "name": "Middle Beach", "id": "way378122331", "orientation": 121.0, "beach_length": 1922.6746961684682 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.007885658320987, -36.649069667894786 ], [ 150.013660706975486, -36.651853620773231 ] ] } },
{ "type": "Feature", "properties": { "name": "Gillard's Beach", "id": "way378122714", "orientation": 114.0, "beach_length": 2094.4092640085746 }, "geometry": { "type": "LineString", "coordinates": [ [ 149.999680965236621, -36.66151550000216 ], [ 150.005835854091657, -36.663713700440503 ] ] } },
{ "type": "Feature", "properties": { "name": "Newtons Beach", "id": "way378241758", "orientation": 96.9, "beach_length": 2464.6005137677139 }, "geometry": { "type": "LineString", "coordinates": [ [ 149.948838840493693, -37.368150005961624 ], [ 149.95552740869627, -37.368793280099332 ] ] } },
{ "type": "Feature", "properties": { "name": "Brou Beach", "id": "way379893070", "orientation": 97.9, "beach_length": 8153.2304828166261 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.123707370774383, -36.13300438403936 ], [ 150.130380794198572, -36.133752276027401 ] ] } },
{ "type": "Feature", "properties": { "name": "Bingie Beach", "id": "way380212327", "orientation": 109.5, "beach_length": 5003.7820816744024 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.144683962011328, -36.034019336769752 ], [ 150.151034881453029, -36.035837992412894 ] ] } },
{ "type": "Feature", "properties": { "name": "Bellambi Beach", "id": "way380544388", "orientation": 66.3, "beach_length": 1200.2978014331732 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.919155726067373, -34.363012740527012 ], [ 150.925324878837557, -34.360777259436915 ] ] } },
{ "type": "Feature", "properties": { "name": "Crowdy Beach", "id": "way382435196", "orientation": 113.6, "beach_length": 14924.135725110977 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.7449755551186, -31.78988067777415 ], [ 152.751149424961852, -31.792173313703469 ] ] } },
{ "type": "Feature", "properties": { "name": "Kylie's Beach", "id": "way382435198", "orientation": 129.9, "beach_length": 2246.9701416882372 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.790388118551334, -31.737718911209132 ], [ 152.795556789911245, -31.741394275911432 ] ] } },
{ "type": "Feature", "properties": { "name": "Middle Beach", "id": "way383244210", "orientation": 92.1, "beach_length": 1851.546310777074 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.995274850847295, -30.770927446815428 ], [ 153.002007690622179, -30.771139572458786 ] ] } },
{ "type": "Feature", "properties": { "name": "Grassy Beach", "id": "way383244211", "orientation": 82.5, "beach_length": 1925.165672768919 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.995263592122683, -30.786957434846368 ], [ 153.001943317665194, -30.786201957871473 ] ] } },
{ "type": "Feature", "properties": { "name": "Main Beach", "id": "way383406313", "orientation": 79.5, "beach_length": 1744.3495056482277 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.012156447242234, -30.632923190351136 ], [ 153.018780994079577, -30.631866735861752 ] ] } },
{ "type": "Feature", "properties": { "name": "Wooli Beach", "id": "way383421939", "orientation": 115.9, "beach_length": 4657.8222218240326 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.274285629804325, -29.843986469996882 ], [ 153.280346278567691, -29.846539054623989 ] ] } },
{ "type": "Feature", "properties": { "name": "Jones Beach", "id": "way383425054", "orientation": 42.8, "beach_length": 1009.249355477669 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.269644689491315, -29.892583341463258 ], [ 153.274222333303413, -29.888297508101981 ] ] } },
{ "type": "Feature", "properties": { "name": "Main Beach", "id": "way383426042", "orientation": 97.9, "beach_length": 1888.1674259404499 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.265653317839536, -29.868307734576753 ], [ 153.272326741263754, -29.869110745098936 ] ] } },
{ "type": "Feature", "properties": { "name": "South Terrace Beach", "id": "way383426043", "orientation": 83.4, "beach_length": 1586.8584308614954 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.264661382941171, -29.881769328339743 ], [ 153.271354097477456, -29.881097901425584 ] ] } },
{ "type": "Feature", "properties": { "name": "Shelley Beach", "id": "way383957412", "orientation": 112.3, "beach_length": 2980.2934950329336 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.348055569761613, -29.53868296432687 ], [ 153.354289044994431, -29.540907183787219 ] ] } },
{ "type": "Feature", "properties": { "name": "Plumbago Beach", "id": "way383957413", "orientation": 111.9, "beach_length": 2569.2015052608722 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.338765564169393, -29.559365693171785 ], [ 153.345016735329722, -29.561551550638629 ] ] } },
{ "type": "Feature", "properties": { "name": "Little Shelley Beach", "id": "way383963164", "orientation": 93.8, "beach_length": 1342.2180551735103 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.355793752382255, -29.519645951743392 ], [ 153.362516304703405, -29.520034499351762 ] ] } },
{ "type": "Feature", "properties": { "name": "Angourie Beach", "id": "way383966415", "orientation": 115.7, "beach_length": 2064.1673424809965 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.359553569320127, -29.48847920888787 ], [ 153.365624453773279, -29.491022401311739 ] ] } },
{ "type": "Feature", "properties": { "name": "Main Beach", "id": "way383975077", "orientation": 100.7, "beach_length": 3400.7434052670405 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.330416921066472, -29.586548854027352 ], [ 153.33703714176778, -29.587636647604317 ] ] } },
{ "type": "Feature", "properties": { "name": "Main Beach", "id": "way383975083", "orientation": 50.6, "beach_length": 1679.2778386106322 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.334021876084876, -29.607369988880109 ], [ 153.33922806393204, -29.603651874464926 ] ] } },
{ "type": "Feature", "properties": { "name": "Cowdroy's Beach", "id": "way384818393", "orientation": 105.8, "beach_length": 1628.4546064287781 }, "geometry": { "type": "LineString", "coordinates": [ [ 149.993405148499221, -36.67438201328843 ], [ 149.999887961976043, -36.675853307633389 ] ] } },
{ "type": "Feature", "properties": { "name": "Barri Beach", "id": "way387687423", "orientation": 104.3, "beach_length": 1834.4547960472853 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.358600578934727, -29.460293199232954 ], [ 153.365129191250304, -29.461742134849143 ] ] } },
{ "type": "Feature", "properties": { "name": "Warrain Beach", "id": "way392061792", "orientation": 73.3, "beach_length": 12744.474986398533 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.778929198509729, -34.97700120886693 ], [ 150.785382397909189, -34.975414826433735 ] ] } },
{ "type": "Feature", "properties": { "name": "Broken Head Beach", "id": "way395502828", "orientation": 66.7, "beach_length": 1494.3866570127263 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.612700208323474, -28.703113501640605 ], [ 153.618888116488534, -28.700776008201203 ] ] } },
{ "type": "Feature", "properties": { "name": "Wallabi Beach", "id": "way398697323", "orientation": 119.9, "beach_length": 1652.5524600985955 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.565894383302407, -32.000511472503632 ], [ 152.571734982797295, -32.003359576539971 ] ] } },
{ "type": "Feature", "properties": { "name": "Saltwater Beach", "id": "way398859521", "orientation": 129.8, "beach_length": 2796.5816172543705 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.555005812839426, -32.016093257066153 ], [ 152.560182019077814, -32.019749878747241 ] ] } },
{ "type": "Feature", "properties": { "name": "Diamond Beach", "id": "way400673679", "orientation": 100.8, "beach_length": 3628.4618027263687 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.541416452217334, -32.037841321306679 ], [ 152.548034479597789, -32.038911496585932 ] ] } },
{ "type": "Feature", "properties": { "name": "Forster Beach", "id": "way450323845", "orientation": 98.1, "beach_length": 13147.88015272033 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.989502206606062, -30.698279238482932 ], [ 152.996172356981305, -30.699095509906883 ] ] } },
{ "type": "Feature", "properties": { "name": "South Kingscliff Beach", "id": "way477089973", "orientation": 101.7, "beach_length": 4203.5437019182154 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.580143313377306, -28.272071885866115 ], [ 153.58674069450737, -28.273275148555147 ] ] } },
{ "type": "Feature", "properties": { "name": "Bogangar Beach", "id": "way477198650", "orientation": 92.3, "beach_length": 1449.7460371951504 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.570111445218203, -28.323258051115097 ], [ 153.576843382195221, -28.323496064480167 ] ] } },
{ "type": "Feature", "properties": { "name": "Casuarina Beach", "id": "way477198652", "orientation": 99.5, "beach_length": 3821.7978986879175 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.572970562311127, -28.302797638905098 ], [ 153.579615528038886, -28.303776687062488 ] ] } },
{ "type": "Feature", "properties": { "name": "Park Beach", "id": "way506150317", "orientation": 98.6, "beach_length": 2246.6670041287475 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.138021268576523, -30.288371722558704 ], [ 153.144682880846744, -30.289241670527939 ] ] } },
{ "type": "Feature", "properties": { "name": "Moruya Heads Beach", "id": "way561980537", "orientation": 86.1, "beach_length": 3082.6172105517958 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.155081677619819, -35.922289409723518 ], [ 150.161803440392845, -35.921918316841641 ] ] } },
{ "type": "Feature", "properties": { "name": "Pedro Beach", "id": "way562090488", "orientation": 90.4, "beach_length": 3006.79413295329 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.153541243669679, -35.944080647570594 ], [ 150.160278444115789, -35.944118726879765 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way584741088", "orientation": 79.0, "beach_length": 6848.4922804323187 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.558994945717757, -28.565150014001521 ], [ 153.565608525984231, -28.564020943031426 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way594795760", "orientation": 56.7, "beach_length": 1274.3702762914984 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.47230415616994, -28.120590640740772 ], [ 153.477935295124695, -28.117328259348785 ] ] } },
{ "type": "Feature", "properties": { "name": "North Beach", "id": "way614055418", "orientation": 89.8, "beach_length": 1199.859909578041 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.866651391051846, -34.569558648783961 ], [ 150.873388714636405, -34.5695392833398 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way664606753", "orientation": 83.1, "beach_length": 1059.6770018698264 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.130182206375309, -36.206410368832543 ], [ 150.136870774577886, -36.205757261328436 ] ] } },
{ "type": "Feature", "properties": { "name": "Haywards Beach", "id": "way664606754", "orientation": 107.4, "beach_length": 5332.0491423246485 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.066518609255041, -36.394044999198101 ], [ 150.072947674293772, -36.395666763781968 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way664606765", "orientation": 122.6, "beach_length": 3772.0435881595317 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.135725762944929, -36.275995490791587 ], [ 150.141401671927724, -36.278921771233478 ] ] } },
{ "type": "Feature", "properties": { "name": "noname", "id": "way664713177", "orientation": 69.5, "beach_length": 1088.4824703652382 }, "geometry": { "type": "LineString", "coordinates": [ [ 150.132623945658793, -36.091488510007352 ], [ 150.138934647737358, -36.089581848595152 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Maroubra Beach", "id": "relation2251446", "orientation": 92.3, "beach_length": 1304.7786949436613 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.255225107515685, -33.949647248967025 ], [ 151.261957044492732, -33.949871538726846 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "North Entrance Beach", "id": "relation2303044", "orientation": 137.0, "beach_length": 9900.7078833312644 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.524384869207324, -33.313435454839677 ], [ 151.528979740836718, -33.31755307743564 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Seven Mile Beach", "id": "relation2580527", "orientation": 101.9, "beach_length": 9795.7333395711594 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.597650977776965, -28.763556961476439 ], [ 153.604243549604206, -28.764774808848394 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Seventy Five Mile Beach", "id": "relation2611157", "orientation": 109.8, "beach_length": 104340.30115250713 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.175701667463983, -25.394744299356859 ], [ 153.182040724278636, -25.396805964137293 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "relation2723197", "orientation": 99.4, "beach_length": 3817.6368266284194 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.833201610246448, -31.621366879809862 ], [ 152.839848506633871, -31.622303888877077 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Burwood Beach", "id": "relation2874470", "orientation": 132.6, "beach_length": 1740.965507343255 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.739813879778922, -32.959293379850727 ], [ 151.744773234258588, -32.963119700548077 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Palm Beach", "id": "relation2978551", "orientation": 98.7, "beach_length": 2771.560782507047 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.323728294761452, -33.590944774061647 ], [ 151.330388138512319, -33.591793688458139 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Fishermans Beach", "id": "relation3359340", "orientation": 54.7, "beach_length": 1102.4177924785465 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.304926647932973, -33.738182783520628 ], [ 151.310425264466318, -33.734945167696978 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "relation3429622", "orientation": 100.0, "beach_length": 1491.3424905260258 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.499150446483497, -33.357989048245138 ], [ 151.505785455406851, -33.358966229185413 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "relation3433510", "orientation": 112.4, "beach_length": 11802.823651619099 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.206015285788453, -30.020201713319093 ], [ 153.212244289534908, -30.022424677992785 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Safety Beach", "id": "relation3903790", "orientation": 92.9, "beach_length": 1337.5116588251049 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.198295596874715, -30.089175916493701 ], [ 153.205024333348007, -30.089470846525359 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Woolgoolga Beach", "id": "relation3906007", "orientation": 63.7, "beach_length": 1582.8092428572195 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.198088813490756, -30.106416729330963 ], [ 153.20412876945889, -30.10383427192788 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "relation5141614", "orientation": 100.3, "beach_length": 19300.646943814729 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.040794241999237, -30.988061743377703 ], [ 153.047423034254706, -30.989094458534261 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Boambee Beach", "id": "relation5206367", "orientation": 124.8, "beach_length": 6957.6457413516446 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.118093971968221, -30.33072027520112 ], [ 153.123626353606511, -30.334039025235214 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "relation5374363", "orientation": 120.2, "beach_length": 35839.673826800223 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.485185922621667, -28.984732247121318 ], [ 153.491008857103736, -28.987696753726748 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "relation5403455", "orientation": 107.4, "beach_length": 3467.7786659012691 }, "geometry": { "type": "LineString", "coordinates": [ [ 153.571344835025997, -28.374362035021626 ], [ 153.577773900064699, -28.37613471802031 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Birdie Beach", "id": "relation5610370", "orientation": 141.0, "beach_length": 1786.9807326381458 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.605942741343654, -33.204047496782778 ], [ 151.610182702287801, -33.2084284120345 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Birdie Beach", "id": "relation5610374", "orientation": 122.0, "beach_length": 9329.2708899211502 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.571638528455793, -33.233911939456952 ], [ 151.577352137704139, -33.236898196437302 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Moonee Beach", "id": "relation5614833", "orientation": 85.2, "beach_length": 1414.8674396814627 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.629334061854081, -33.170133845309714 ], [ 151.636047797598934, -33.16966194212808 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Catherine Hill Bay Beach", "id": "relation5614835", "orientation": 114.4, "beach_length": 2223.576899805525 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.629636403814999, -33.150707397544664 ], [ 151.635772011701221, -33.153037588724935 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "relation5622499", "orientation": 126.3, "beach_length": 3544.1909415687978 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.652586518417479, -33.099743272054873 ], [ 151.658016351121375, -33.103084550421912 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Stockton Beach", "id": "relation5632945", "orientation": 154.4, "beach_length": 39038.19328012778 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.914889658623906, -32.82458287310245 ], [ 151.917800777865409, -32.829688574995842 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Bennetts Beach", "id": "relation6158652", "orientation": 125.3, "beach_length": 17070.736099033849 }, "geometry": { "type": "LineString", "coordinates": [ [ 152.204507339562895, -32.634999855045457 ], [ 152.210005956096239, -32.638278380319974 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Soldiers Beach", "id": "relation7491177", "orientation": 128.9, "beach_length": 1341.891275117024 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.562510072353433, -33.290767981140093 ], [ 151.567753380216516, -33.294304430933003 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "Putty Beach", "id": "relation7966651", "orientation": 162.2, "beach_length": 2032.480907416342 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.365952074280472, -33.528593310798257 ], [ 151.368011655015948, -33.533940624398603 ] ] } },
|
||||
{ "type": "Feature", "properties": { "name": "noname", "id": "relation8302312", "orientation": 129.6, "beach_length": 12508.032082382493 }, "geometry": { "type": "LineString", "coordinates": [ [ 151.67716704962902, -33.041776618031669 ], [ 151.682358278298523, -33.045376557328964 ] ] } }
|
||||
]
|
||||
}
|
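The beach lines above are plain GeoJSON, so they can be inspected without geopandas. A minimal sketch using only the standard library `json` module, with a two-feature sample copied from the file (in practice the full file would be loaded with `json.load(open(path))` or `geopandas.read_file`):

```python
import json

# Two features copied verbatim (properties abridged to one line each)
# from the beach orientation file above.
geojson = json.loads("""
{ "type": "FeatureCollection", "features": [
  { "type": "Feature",
    "properties": { "name": "Maroubra Beach", "id": "relation2251446",
                    "orientation": 92.3, "beach_length": 1304.7786949436613 },
    "geometry": { "type": "LineString",
                  "coordinates": [ [ 151.255225107515685, -33.949647248967025 ],
                                   [ 151.261957044492732, -33.949871538726846 ] ] } },
  { "type": "Feature",
    "properties": { "name": "North Beach", "id": "way614055418",
                    "orientation": 89.8, "beach_length": 1199.859909578041 },
    "geometry": { "type": "LineString",
                  "coordinates": [ [ 150.866651391051846, -34.569558648783961 ],
                                   [ 150.873388714636405, -34.5695392833398 ] ] } }
] }
""")

# Index the beach properties by name for quick lookup
beaches = {f["properties"]["name"]: f["properties"] for f in geojson["features"]}
print(beaches["Maroubra Beach"]["orientation"])  # 92.3
```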
@ -1,656 +0,0 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Setup notebook"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "import datetime\n",
    "import pickle\n",
    "import fiona\n",
    "import shapely\n",
    "import matplotlib.pyplot as plt\n",
    "import pandas as pd\n",
    "import geopandas\n",
    "from scipy.stats import percentileofscore\n",
    "from shapely.geometry import Point, MultiPoint\n",
    "import numpy as np\n",
    "import requests\n",
    "from bs4 import BeautifulSoup\n",
    "import urllib.parse\n",
    "import itertools\n",
    "from tqdm import tqdm\n",
    "import glob\n",
    "from scipy.interpolate import griddata, SmoothBivariateSpline\n",
    "from scipy.ndimage.filters import gaussian_filter\n",
    "import colorcet as cc\n",
    "import pytz"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Import data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Import Chris' data\n",
    "def df_from_csv(csv, index_col, data_folder='../data/interim'):\n",
    "    print('Importing {}'.format(csv))\n",
    "    return pd.read_csv(os.path.join(data_folder, csv), index_col=index_col)\n",
    "\n",
    "df_sites = df_from_csv('sites.csv', index_col=[0])\n",
    "df_obs_impacts = df_from_csv('impacts_observed.csv', index_col=[0])\n",
    "df_waves = df_from_csv('waves.csv', index_col=[0,1])\n",
    "df_waves.index = df_waves.index.set_levels([df_waves.index.levels[0], pd.to_datetime(df_waves.index.levels[1])])\n",
    "df_profiles = df_from_csv('profiles.csv', index_col=[0,1,2])\n",
    "\n",
    "# Load shoreline transects from coastsat\n",
    "shoreline_transect_files = glob.glob('./15_data/df_transect_shorelines_*.csv')\n",
    "df_shoreline_transects = pd.concat((pd.read_csv(f,\n",
    "                                                skiprows=0,\n",
    "                                                index_col=[0,1])\n",
    "                                    for f in shoreline_transect_files))\n",
    "\n",
    "# Convert index to datetime\n",
    "df_shoreline_transects.index = df_shoreline_transects.index.set_levels(\n",
    "    [df_shoreline_transects.index.levels[0], pd.to_datetime(df_shoreline_transects.index.levels[1])])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "print('df_shoreline_transects:')\n",
    "df_shoreline_transects.head()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Calculate shoreline percentiles"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "data = []\n",
    "\n",
    "for site_id, df in df_shoreline_transects.groupby(['site_id']):\n",
    "\n",
    "    # Find the last date before the June 2016 storm\n",
    "    mask = pd.Series(df.index.get_level_values('time') < datetime.datetime(2016,6,3).replace(tzinfo=pytz.utc))\n",
    "    i_last_obs = mask[::-1].idxmax()\n",
    "    prestorm_observation = df.iloc[i_last_obs]\n",
    "\n",
    "    # Get prestorm and poststorm shoreline locations.\n",
    "    # This is a shortcut, because the last x value of our profiles is at the shoreline;\n",
    "    # if we wanted another elevation, this code would need to change.\n",
    "    prestorm_shoreline_x = df_profiles.loc[(site_id,'prestorm',)].dropna(subset=['z']).iloc[-1].name\n",
    "    poststorm_shoreline_x = df_profiles.loc[(site_id,'poststorm',)].dropna(subset=['z']).iloc[-1].name\n",
    "\n",
    "    # Find the corresponding percentile of the prestorm and poststorm shorelines\n",
    "    prestorm_shoreline_pct = percentileofscore(df.shoreline_chainage_w_tide_correction.dropna(), prestorm_shoreline_x)\n",
    "    poststorm_shoreline_pct = percentileofscore(df.shoreline_chainage_w_tide_correction.dropna(), poststorm_shoreline_x)\n",
    "    change_shoreline_pct = poststorm_shoreline_pct - prestorm_shoreline_pct\n",
    "\n",
    "    data.append({\n",
    "        'site_id': site_id,\n",
    "        'prestorm_shoreline_x': prestorm_shoreline_x,\n",
    "        'poststorm_shoreline_x': poststorm_shoreline_x,\n",
    "        'prestorm_shoreline_pct': prestorm_shoreline_pct,\n",
    "        'poststorm_shoreline_pct': poststorm_shoreline_pct,\n",
    "        'change_shoreline_pct': change_shoreline_pct,\n",
    "    })\n",
    "\n",
    "df_shorelines_pct = pd.DataFrame(data).set_index('site_id')\n",
    "\n",
    "print('df_shorelines_pct:')\n",
    "df_shorelines_pct.head()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Get wave data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Find the closest NSW nearshore transformation output point to each site_id\n",
    "\n",
    "# Import nearshore wave site_id data\n",
    "df_nearshore_ids = df_from_csv('nsw_nearshore_tool_site_locations.csv', index_col=[0], data_folder='./15_data/')\n",
    "df_nearshore_ids.lat = pd.to_numeric(df_nearshore_ids.lat, errors='coerce')\n",
    "df_nearshore_ids.lon = pd.to_numeric(df_nearshore_ids.lon, errors='coerce')\n",
    "df_nearshore_ids.depth = pd.to_numeric(df_nearshore_ids.depth, errors='coerce')\n",
    "gdf_nearshore_ids = geopandas.GeoDataFrame(\n",
    "    df_nearshore_ids, geometry=geopandas.points_from_xy(df_nearshore_ids.lon, df_nearshore_ids.lat))\n",
    "gdf_nearshore_ids.crs = {'init': 'epsg:4283'}\n",
    "gdf_nearshore_ids = gdf_nearshore_ids.to_crs(epsg=28356)\n",
    "\n",
    "gdf_nearshore_ids.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Convert sites to a geopandas dataframe\n",
    "\n",
    "gdf_sites = geopandas.GeoDataFrame(\n",
    "    df_sites, geometry=geopandas.points_from_xy(df_sites.lon, df_sites.lat))\n",
    "gdf_sites.crs = {'init': 'epsg:4283'}\n",
    "gdf_sites = gdf_sites.to_crs(epsg=28356)\n",
    "gdf_sites.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Warning: this cell takes around 9 mins to compute and needs some optimization.\n",
    "\n",
    "# Calculate the closest nearshore wave id for each site\n",
    "from shapely.ops import nearest_points\n",
    "\n",
    "def near(point, df2=gdf_nearshore_ids):\n",
    "    # Find the nearest point and return the corresponding index value\n",
    "    nearest = df2.geometry == nearest_points(point, df2.geometry.unary_union)[1]\n",
    "    return df_nearshore_ids[nearest].index[0]\n",
    "\n",
    "start_time = datetime.datetime.now()\n",
    "gdf_sites.loc[:,'nearshore_wave_id'] = gdf_sites.apply(lambda row: near(row.geometry), axis=1)\n",
    "end_time = datetime.datetime.now()\n",
    "print(\"Executed in: {}\".format(end_time - start_time))\n",
    "gdf_sites.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Download wave data using the NSW nearshore wave transformation website for the sites where we have shoreline measurements."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def nearshore_wave_csv_url(id, start_date, end_date):\n",
    "    URL = 'http://www.nswaves.com.au/transform.php'\n",
    "    payload = {\n",
    "        'init': '1',\n",
    "        'type': 'Transform-Full',\n",
    "        'startsite': '{}'.format(id),\n",
    "        'endsite': '{}'.format(id),\n",
    "        'timestep': 'null',\n",
    "        'startdate': start_date.strftime('%Y-%m-%d'),\n",
    "        'starthour': '00',\n",
    "        'enddate': end_date.strftime('%Y-%m-%d'),\n",
    "        'endhour': '00',\n",
    "        'sitestep': '1',\n",
    "        'method': 'Parametric',\n",
    "        'source': 'Waverider',\n",
    "        'filename': 'ckl',\n",
    "        'format': 'csv',\n",
    "    }\n",
    "\n",
    "    session = requests.session()\n",
    "    r = requests.post(URL, data=payload)\n",
    "\n",
    "    soup = BeautifulSoup(r.text)\n",
    "\n",
    "    # Check if data extraction was successful\n",
    "    if soup.findAll(text=\"OK : Data Extraction Successful - Click filename/s to download data file\"):\n",
    "\n",
    "        # Find all links\n",
    "        for link in soup.find_all('a'):\n",
    "\n",
    "            href = link.get('href')\n",
    "            if '/data/full' not in href:\n",
    "                continue\n",
    "\n",
    "            # Convert to an absolute url\n",
    "            csv_url = urllib.parse.urljoin(URL, href)\n",
    "\n",
    "            return csv_url\n",
    "    else:\n",
    "        return None\n",
    "\n",
    "\n",
    "def download_csv(url, file_path):\n",
    "    urllib.request.urlretrieve(url, file_path)\n",
    "    print('Downloaded {}'.format(file_path))\n",
    "\n",
    "\n",
    "def daterange(start_date, end_date, delta):\n",
    "    while start_date < end_date:\n",
    "        yield start_date\n",
    "        start_date += delta\n",
    "\n",
    "\n",
    "def download_nearshore_csv(site_id, nsw_nearshore_id, start_date, end_date, output_folder='./15_nearshore_waves/'):\n",
    "\n",
    "    # Create the output folder if it doesn't already exist\n",
    "    os.makedirs(output_folder, exist_ok=True)\n",
    "\n",
    "    # Output filename\n",
    "    output_filename = '{}_{}_{}_{}.csv'.format(\n",
    "        site_id,\n",
    "        nsw_nearshore_id,\n",
    "        start_date.strftime('%Y%m%d'),\n",
    "        end_date.strftime('%Y%m%d'),\n",
    "    )\n",
    "    output_filepath = os.path.join(output_folder, output_filename)\n",
    "\n",
    "    # Don't download if the file already exists\n",
    "    if os.path.isfile(output_filepath):\n",
    "        return\n",
    "\n",
    "    csv_url = nearshore_wave_csv_url(nsw_nearshore_id, start_date, end_date)\n",
    "\n",
    "    if csv_url:\n",
    "        download_csv(csv_url, output_filepath)\n",
    "    else:\n",
    "        print('No url found')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "df_sites[df_sites.beach.isin(['DIAMONDn','HARR','OLDBAR','DEEWHYn'])].index"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# beach = 'DIAMONDn'\n",
    "# beach = 'HARR'\n",
    "# beach = 'OLDBAR'\n",
    "# beach = 'DEEWHYn'\n",
    "\n",
    "site_ids_to_get = df_sites[df_sites.beach.isin(['DIAMONDn','HARR','OLDBAR','DEEWHYn'])].index\n",
    "# site_ids_to_get = df_shorelines_pct.index\n",
    "\n",
    "# Define which start and end year we want to get\n",
    "start_year = 2005\n",
    "end_year = 2014\n",
    "\n",
    "# Construct a list of start and end times we can query\n",
    "date_ranges = [(datetime.datetime(x, 1, 1), datetime.datetime(x, 12, 31))\n",
    "               for x in range(start_year, end_year + 1)]\n",
    "\n",
    "# Creates a list of inputs...\n",
    "# [('NARRA0001',\n",
    "#   (datetime.datetime(2005, 1, 1, 0, 0),\n",
    "#    datetime.datetime(2005, 12, 31, 0, 0))),\n",
    "#  ('NARRA0001',\n",
    "#   (datetime.datetime(2006, 1, 1, 0, 0),\n",
    "#    datetime.datetime(2006, 12, 31, 0, 0))), ...\n",
    "inputs = [x for x in list(itertools.product(site_ids_to_get, date_ranges))]\n",
    "\n",
    "for input_params in inputs:\n",
    "    site_id, (start_date, end_date) = input_params\n",
    "    download_nearshore_csv(site_id=site_id,\n",
    "                           nsw_nearshore_id=gdf_sites.loc[site_id].nearshore_wave_id,\n",
    "                           start_date=start_date,\n",
    "                           end_date=end_date)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Calculate mean wave conditions"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import glob\n",
    "from tqdm import tqdm\n",
    "wave_power_data = []\n",
    "\n",
    "# For each site, calculate mean wave power and storm wave power\n",
    "site_ids = df_shorelines_pct.index\n",
    "\n",
    "for site_id in tqdm(site_ids):\n",
    "\n",
    "    # Get shoreline orientation\n",
    "    orientation = df_sites.loc[site_id].orientation\n",
    "\n",
    "#     # Get peak hourly wave energy from the June 2016 storm\n",
    "#     max_hrly_wave_power = df_waves.loc[[site['site_id']]].Pxs.max()\n",
    "\n",
    "    # Load the nearshore wave csv files into one dataframe\n",
    "    site_nearshore_wave_files = glob.glob('./15_nearshore_waves/*{}*'.format(site_id))\n",
    "\n",
    "    if len(site_nearshore_wave_files) == 0:\n",
    "        continue\n",
    "\n",
    "    df_hist_waves = pd.concat((pd.read_csv(f,\n",
    "                                           skiprows=8,\n",
    "                                           index_col=0,\n",
    "                                           names=['Hs', 'Tp', 'dir'],\n",
    "                                           na_values=' NaN')\n",
    "                               for f in site_nearshore_wave_files))\n",
    "    df_hist_waves.index = pd.to_datetime(df_hist_waves.index)\n",
    "\n",
    "    # At each row, calculate the cross-shore component of nearshore wave energy\n",
    "    df_hist_waves['d'] = 10\n",
    "    df_hist_waves['L'] = 9.81 * df_hist_waves.Tp**2 / 2 / np.pi\n",
    "    df_hist_waves['n'] = 0.5 * (\n",
    "        1 + (4 * np.pi * df_hist_waves.d / df_hist_waves.L) /\n",
    "        (np.sinh(4 * np.pi * df_hist_waves.d / df_hist_waves.L)))\n",
    "    df_hist_waves['E'] = 1 / 16 * 1025 * 9.81 * df_hist_waves.Hs**2\n",
    "    df_hist_waves['C'] = 9.81 * df_hist_waves.Tp / 2 / np.pi * np.tanh(\n",
    "        2 * np.pi * df_hist_waves.d / df_hist_waves.L)\n",
    "    df_hist_waves['shoreline_tn_angle'] = 270 - orientation\n",
    "    df_hist_waves.loc[\n",
    "        df_hist_waves.shoreline_tn_angle > 360,\n",
    "        'shoreline_tn_angle'] = df_hist_waves.shoreline_tn_angle - 360\n",
    "    df_hist_waves[\n",
    "        'alpha'] = df_hist_waves.shoreline_tn_angle - df_hist_waves.dir\n",
    "    df_hist_waves[\n",
    "        'Px'] = df_hist_waves.n * df_hist_waves.E * df_hist_waves.C * np.cos(\n",
    "            np.deg2rad(df_hist_waves.alpha))\n",
    "\n",
    "#     # Apply percentileofscore for June 2016 wave energy\n",
    "#     storm_Px_hrly_pctile = percentileofscore(df_hist_waves.Px.dropna().values,\n",
    "#                                              max_hrly_wave_power,\n",
    "#                                              kind='mean')\n",
    "\n",
    "    # Calculate cumulative wave energy from the storm\n",
    "    idx = ((df_waves.index.get_level_values('datetime') > '2016-06-04') &\n",
    "           (df_waves.index.get_level_values('datetime') < '2016-06-07') &\n",
    "           (df_waves.index.get_level_values('site_id') == site_id))\n",
    "    hrs = len(df_waves[idx])\n",
    "    Pxscum_storm = df_waves[idx].Pxs.sum()\n",
    "\n",
    "    # Calculate cumulative wave energy of mean wave conditions over the length of the storm\n",
    "    Pxscum_mean = df_hist_waves['Px'].mean() * hrs\n",
    "    Pxscum_storm_mean_ratio = Pxscum_storm / Pxscum_mean\n",
    "\n",
    "    wave_power_data.append({\n",
    "        'site_id': site_id,\n",
    "        'Pxscum_mean': Pxscum_mean,\n",
    "        'Pxscum_storm': Pxscum_storm,\n",
    "        'Pxscum_storm_mean_ratio': Pxscum_storm_mean_ratio\n",
    "    })\n",
    "\n",
    "df_wave_power = pd.DataFrame(wave_power_data).set_index('site_id')\n",
    "df_wave_power.head()\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Plot data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Combine data into one dataframe\n",
    "df_plot = pd.concat([df_wave_power, df_shorelines_pct], axis=1)\n",
    "df_plot.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Define some helper functions\n",
    "def round_down(num, divisor):\n",
    "    return num - (num % divisor)\n",
    "\n",
    "def round_up(x, divisor):\n",
    "    return (x + divisor - 1) // divisor * divisor"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Create an interpolated grid to plot\n",
    "x_col = 'Pxscum_storm_mean_ratio'\n",
    "y_col = 'prestorm_shoreline_pct'\n",
    "z_col = 'change_shoreline_pct'\n",
    "\n",
    "# Grid data\n",
    "x_grid_max = round_up(max(df_plot[x_col]), 2)\n",
    "y_grid_max = 100\n",
    "\n",
    "grid_x, grid_y = np.mgrid[0:x_grid_max:100j, 0:y_grid_max:100j]\n",
    "\n",
    "x_vals = df_plot[x_col].values\n",
    "y_vals = df_plot[y_col].values\n",
    "z_vals = df_plot[z_col].values\n",
    "\n",
    "points = [[x, y] for x, y in zip(\n",
    "    x_vals,\n",
    "    y_vals,\n",
    ")]\n",
    "\n",
    "grid = griddata((x_vals, y_vals), z_vals, (grid_x, grid_y), method='linear', rescale=True)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# https://stackoverflow.com/a/47500863\n",
    "import scipy as sp\n",
    "import scipy.interpolate\n",
    "from mpl_toolkits.mplot3d import axes3d\n",
    "spline = sp.interpolate.Rbf(x_vals, y_vals, z_vals, function='linear', smooth=30)\n",
    "\n",
    "x_grid = np.linspace(0, max(x_vals), 100)\n",
    "y_grid = np.linspace(0, max(y_vals), 100)\n",
    "B1, B2 = np.meshgrid(x_grid, y_grid, indexing='xy')\n",
    "Z = np.zeros((x_vals.size, z_vals.size))\n",
    "\n",
    "Z = spline(B1, B2)\n",
    "fig = plt.figure(figsize=(10,6))\n",
    "ax = axes3d.Axes3D(fig)\n",
    "ax.view_init(elev=10., azim=230)\n",
    "ax.plot_wireframe(B1, B2, Z, alpha=0.1)\n",
    "ax.plot_surface(B1, B2, Z, alpha=0.1)\n",
    "ax.scatter3D(x_vals, y_vals, z_vals, c='r')\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Create figure\n",
    "fig = plt.figure(figsize=(3, 3), dpi=150, facecolor='w', edgecolor='k')\n",
    "ax = fig.add_subplot(111)\n",
    "\n",
    "# Define colors\n",
    "cmap_interval = 25\n",
    "cmap = cc.cm.fire\n",
    "vmin = round_down(np.min(z_vals), cmap_interval)\n",
    "vmax = round_up(np.max(z_vals), cmap_interval)\n",
    "levels = [x * cmap_interval for x in range(-4, 2)]\n",
    "\n",
    "# Plot grid surface\n",
    "cf = plt.contourf(grid_x, grid_y, grid, levels=levels, cmap=cmap, vmin=vmin, vmax=vmax)\n",
    "\n",
    "# Plot contours\n",
    "cs = plt.contour(grid_x, grid_y, grid, levels=levels, linewidths=0.5, colors='white', vmin=vmin, vmax=vmax)\n",
    "ax.clabel(cs, inline=1, fontsize=4, fmt='%1.0f%%')\n",
    "\n",
    "scatter = ax.scatter(\n",
    "    x=x_vals,\n",
    "    y=y_vals,\n",
    "    c=z_vals,\n",
    "    s=10,\n",
    "    linewidth=0.8,\n",
    "    edgecolor='k',\n",
    "    cmap=cmap, vmin=vmin, vmax=vmax\n",
    ")\n",
    "\n",
    "ax.set_xlim([round_down(min(x_vals), 1), round_up(max(x_vals), 1)])\n",
    "\n",
    "ax.set_xlabel(x_col)\n",
    "ax.set_ylabel(y_col)\n",
    "\n",
    "cbar = plt.colorbar(cf)\n",
    "cbar.set_label(z_col)\n",
    "\n",
    "ax.grid(True, linestyle=\"--\", alpha=0.2, color='grey', linewidth=1)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Working"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Find which beaches we should extract next\n",
    "\n",
    "d = pd.concat([df_sites, df_obs_impacts], axis=1)\n",
    "for site_id, df in d.groupby(['beach']):\n",
    "    break\n",
    "\n",
    "d_msl_change = d.groupby(['beach']).width_msl_change_m.mean()\n",
    "d_site_count = d.groupby(['beach']).lat.count()\n",
    "d_site_count_msl_change = pd.concat([d_msl_change, d_site_count], axis=1).sort_values(by='width_msl_change_m')\n",
    "l = len(d_site_count_msl_change)\n",
    "print(d_site_count_msl_change[:10])\n",
    "print('...')\n",
    "print(d_site_count_msl_change[int(l/2-5):int(l/2+5)])\n",
    "print('...')\n",
    "print(d_site_count_msl_change[-10:])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Get polygons for coastsat\n",
    "\n",
    "# beach = 'DIAMONDn'\n",
    "# beach = 'HARR'\n",
    "# beach = 'OLDBAR'\n",
    "beach = 'DEEWHYn'\n",
    "buffer = 0.3  # fraction of the lat/lon range\n",
    "\n",
    "points = MultiPoint([Point(r[1].lon, r[1].lat) for r in df_sites[df_sites.beach==beach].iterrows()])\n",
    "# bounds = points.envelope.bounds\n",
    "bounds = points.minimum_rotated_rectangle.bounds\n",
    "\n",
    "lon_range = bounds[2] - bounds[0]\n",
    "lat_range = bounds[3] - bounds[1]\n",
    "\n",
    "x1 = bounds[0] - lon_range * buffer\n",
    "x2 = bounds[2] + lon_range * buffer\n",
    "y1 = bounds[1] - lat_range * buffer\n",
    "y2 = bounds[3] + lat_range * buffer\n",
    "\n",
    "# Create our polygon\n",
    "polygon = [[[x1, y1],\n",
    "            [x1, y2],\n",
    "            [x2, y2],\n",
    "            [x2, y1],\n",
    "            [x1, y1]]]\n",
    "\n",
    "print(beach)\n",
    "polygon"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.7"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
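The notebook's nearshore wave-power cell applies linear wave theory at a fixed depth (d = 10 m): deep-water wavelength L = gT²/2π, group-velocity ratio n, energy density E = ρgH²/16, celerity C, and the cross-shore flux Px = n·E·C·cos(α). A scalar sketch of the same steps (the function name and sample values are illustrative, not from the repository):

```python
import math

def crossshore_wave_power(Hs, Tp, direction, orientation, d=10.0,
                          rho=1025.0, g=9.81):
    """Cross-shore component of wave energy flux (W per m of crest),
    following the same linear wave theory steps as the notebook above."""
    L = g * Tp**2 / (2 * math.pi)                                 # wavelength
    n = 0.5 * (1 + (4 * math.pi * d / L) / math.sinh(4 * math.pi * d / L))
    E = 1 / 16 * rho * g * Hs**2                                  # energy density
    C = g * Tp / (2 * math.pi) * math.tanh(2 * math.pi * d / L)   # celerity
    shoreline_tn_angle = 270 - orientation                        # shore-normal bearing
    if shoreline_tn_angle > 360:
        shoreline_tn_angle -= 360
    alpha = shoreline_tn_angle - direction                        # wave approach angle
    return n * E * C * math.cos(math.radians(alpha))

# A shore-normal 2 m, 10 s wave carries positive cross-shore power;
# a wave travelling parallel to the shore (alpha = 90) carries none.
print(crossshore_wave_power(2.0, 10.0, 135.0, 135.0))
print(crossshore_wave_power(2.0, 10.0, 45.0, 135.0))
```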
Binary file not shown.
@ -0,0 +1,13 @@
import pandas as pd
import os


def main():

    data_folder = './data/interim'
    df_waves = pd.read_csv(os.path.join(data_folder, 'waves.csv'), index_col=[0, 1])
    df_tides = pd.read_csv(os.path.join(data_folder, 'tides.csv'), index_col=[0, 1])
    df_profiles = pd.read_csv(os.path.join(data_folder, 'profiles.csv'), index_col=[0, 1, 2])
    df_sites = pd.read_csv(os.path.join(data_folder, 'sites.csv'), index_col=[0])


if __name__ == '__main__':
    main()
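The script above relies on `index_col=[0, 1]` to build a (site_id, datetime) MultiIndex. A small self-contained sketch of that pattern with hypothetical sample rows, including the level-wise datetime conversion used in the notebooks:

```python
import io
import pandas as pd

# Hypothetical two-site sample in the same shape as waves.csv;
# index_col=[0, 1] builds a (site_id, datetime) MultiIndex.
csv = io.StringIO(
    "site_id,datetime,Hs0,Tp\n"
    "NARRA0001,2016-06-04 00:00,4.2,11.5\n"
    "NARRA0001,2016-06-04 01:00,4.5,11.8\n"
    "NARRA0002,2016-06-04 00:00,4.1,11.4\n"
)
df_waves = pd.read_csv(csv, index_col=[0, 1])

# The datetime level comes back as strings; convert it in place,
# mirroring the set_levels call used above.
df_waves.index = df_waves.index.set_levels(
    [df_waves.index.levels[0], pd.to_datetime(df_waves.index.levels[1])]
)

print(df_waves.loc["NARRA0001"].Hs0.max())  # 4.5
```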
@ -1,220 +1,51 @@
import numpy as np
import pandas as pd
from logs import setup_logging

logger = setup_logging()


def sto06_individual(Hs0, Tp, beta):

    Lp = 9.8 * Tp ** 2 / 2 / np.pi


def sto06(Hs0, Tp, beta, **kwargs):
    """
    :param Hs0: List or float of offshore significant wave height values
    :param Tp: List or float of peak wave period
    :param beta: List or float of beach slope
    :return: Float or list of R2, setup, S_total, S_inc and S_ig values
    """

    df = pd.DataFrame(
        {"Hs0": Hs0, "Tp": Tp, "beta": beta}, index=[x for x in range(0, np.size(Hs0))]
    )

    df["Lp"] = 9.8 * df["Tp"] ** 2 / 2 / np.pi

    # General equation
    df["S_ig"] = pd.to_numeric(0.06 * np.sqrt(df["Hs0"] * df["Lp"]), errors="coerce")
    df["S_inc"] = pd.to_numeric(
        0.75 * df["beta"] * np.sqrt(df["Hs0"] * df["Lp"]), errors="coerce"
    )
    df["setup"] = pd.to_numeric(
        0.35 * df["beta"] * np.sqrt(df["Hs0"] * df["Lp"]), errors="coerce"
    )
    df["S_total"] = np.sqrt(df["S_inc"] ** 2 + df["S_ig"] ** 2)
    df["R2"] = 1.1 * (df["setup"] + df["S_total"] / 2)
    S_ig = 0.06 * np.sqrt(Hs0 * Lp)
    S_inc = 0.75 * beta * np.sqrt(Hs0 * Lp)

    # Dissipative conditions
    dissipative = df["beta"] / (df["Hs0"] / df["Lp"]) ** (0.5) <= 0.3

    df.loc[dissipative, "setup"] = 0.016 * (df["Hs0"] * df["Lp"]) ** (0.5)  # eqn 16
    df.loc[dissipative, "S_total"] = 0.046 * (df["Hs0"] * df["Lp"]) ** (0.5)  # eqn 17
    df.loc[dissipative, "R2"] = 0.043 * (df["Hs0"] * df["Lp"]) ** (0.5)  # eqn 18

    return (
        float_or_list(df["R2"].tolist()),
        float_or_list(df["setup"].tolist()),
        float_or_list(df["S_total"].tolist()),
        float_or_list(df["S_inc"].tolist()),
        float_or_list(df["S_ig"].tolist()),
    )


def hol86(Hs0, Tp, beta, **kwargs):
    df = pd.DataFrame(
        {"Hs0": Hs0, "Tp": Tp, "beta": beta}, index=[x for x in range(0, np.size(Hs0))]
    )

    df["Lp"] = 9.8 * df["Tp"] ** 2 / 2 / np.pi

    df["setup"] = 0.2 * df["Hs0"]
    df["R2"] = 0.83 * df["beta"] * np.sqrt(df["Hs0"] * df["Lp"]) + df["setup"]

    df["S_ig"] = np.nan
    df["S_inc"] = np.nan
    df["S_total"] = np.nan

    return (
        float_or_list(df["R2"].tolist()),
        float_or_list(df["setup"].tolist()),
        float_or_list(df["S_total"].tolist()),
        float_or_list(df["S_inc"].tolist()),
        float_or_list(df["S_ig"].tolist()),
    )


def nie91(Hs0, Tp, beta, **kwargs):
    df = pd.DataFrame(
        {"Hs0": Hs0, "Tp": Tp, "beta": beta}, index=[x for x in range(0, np.size(Hs0))]
    )

    df["Lp"] = 9.8 * df["Tp"] ** 2 / 2 / np.pi
    df["Ls"] = df["Lp"]  # Need to make this approximation, refer to Atkinson 2017
    df["Hrms"] = df["Hs0"] / np.sqrt(
        2
    )  # Acceptable approximation, refer to Atkinson 2017

    df.loc[df.beta >= 0.1, "LR"] = 0.6 * df["beta"] * np.sqrt(df["Hrms"] * df["Ls"])
    df.loc[df.beta < 0.1, "LR"] = 0.06 * np.sqrt(df["Hrms"] * df["Ls"])

    df["R2"] = 1.98 * df["LR"]

    # Note that this should be the level above Z100%, which in this case is taken as the time varying tide level,
    # even though minimum run-down can still occur below tide SWL.

    df["setup"] = np.nan
    df["S_ig"] = np.nan
    df["S_inc"] = np.nan
    df["S_total"] = np.nan

    return (
        float_or_list(df["R2"].tolist()),
        float_or_list(df["setup"].tolist()),
        float_or_list(df["S_total"].tolist()),
        float_or_list(df["S_inc"].tolist()),
        float_or_list(df["S_ig"].tolist()),
    )


def atk18(Hs0, Tp, beta):
    pass


def pow18(Hs0, Tp, beta, r, **kwargs):
    logger.info("Calculating runup using Power et al. (2018)")
    df = pd.DataFrame(
        {"Hs0": Hs0, "Tp": Tp, "beta": beta, "r": r},
        index=[x for x in range(0, np.size(Hs0))],
    )

    df["Lp"] = 9.8 * df["Tp"] ** 2 / 2 / np.pi
    df["x1"] = df["Hs0"] / df["Lp"]
    df["x2"] = df["beta"]
    df["x3"] = df["r"] / df["Hs0"]

    df["R2"] = df.Hs0 * (
        (df.x2 + (((df.x3 * 3) / np.exp(-5)) * ((3 * df.x3) * df.x3)))
        + ((((df.x1 + df.x3) - 2) - (df.x3 - df.x2)) + ((df.x2 - df.x1) - df.x3))
        + (((df.x3 ** df.x1) - (df.x3 ** (1 / 3))) - (np.exp(df.x2) ** (df.x1 * 3)))
        + np.sqrt((((df.x3 + df.x1) - df.x2) - (df.x2 + np.log10(df.x3))))
        + ((((df.x2 ** 2) / (df.x1 ** (1 / 3))) ** (df.x1 ** (1 / 3))) - np.sqrt(df.x3))
        + (
            (df.x2 + ((df.x3 / df.x1) ** (1 / 3)))
            + (np.log(2) - (1 / (1 + np.exp(-(df.x2 + df.x3)))))
        )
        + ((np.sqrt(df.x3) - (((3 ** 2) + 3) * (df.x2 ** 2))) ** 2)
        + ((((df.x3 * -5) ** 2) ** 2) + (((df.x3 + df.x3) * df.x1) / (df.x2 ** 2)))
        + np.log(
            (np.sqrt(((df.x2 ** 2) + (df.x3 ** (1 / 3)))) + ((df.x2 + 3) ** (1 / 3)))
        )
        + (
            (((df.x1 / df.x3) * (-5 ** 2)) * (df.x3 ** 2))
            - np.log10((1 / (1 + np.exp(-(df.x2 + df.x3)))))
        )
        + (df.x1 ** df.x3)
        + np.exp(-((((df.x3 / df.x1) ** np.exp(4)) + (np.exp(df.x3) ** 3)) ** 2))
        + np.exp((np.log((df.x2 - df.x3)) - np.log(np.exp(-((-1 + df.x1) ** 2)))))
        + ((np.sqrt(4) * (((df.x3 / df.x2) - df.x2) - (0 - df.x1))) ** 2)
        + (2 * ((((-5 * df.x3) + df.x1) * (2 - df.x3)) - 2))
        + ((np.sqrt(4) * (((df.x3 / df.x2) - df.x2) - (0 - df.x1))) ** 2)
        + ((((-5 + df.x1) - df.x2) * (df.x2 - df.x3)) * ((df.x1 - df.x2) - (-4 ** -5)))
        + (np.exp(-((df.x2 + (-5 - df.x1)) ** 2)) + ((df.x2 + 5) * (df.x3 ** 2)))
        + np.sqrt(
            1
            / (
                1
                + np.exp(
                    -(
                        (np.exp(df.x1) - np.exp(-((df.x3 + df.x3) ** 2)))
                        + ((df.x1 ** df.x3) - (df.x3 * 4))
                    )
                )
            )
        )
        + (
            (
                np.exp(
                    -(
                        (
                            (
                                (
                                    np.exp(
                                        -(
                                            (
                                                (np.sqrt(df.x3) * 4)
                                                + (1 / (1 + np.exp(-(df.x2 + 2))))
                                            )
                                            ** 2
                                        )
                                    )
                                )
                                ** 2
                            )
                            + df.x1
                        )
                        ** 2
                    )
                )
            )
            ** 3
        )
    )

    df["setup"] = np.nan
    df["S_ig"] = np.nan
    df["S_inc"] = np.nan
    df["S_total"] = np.nan

    return (
        float_or_list(df["R2"].tolist()),
        float_or_list(df["setup"].tolist()),
        float_or_list(df["S_total"].tolist()),
        float_or_list(df["S_inc"].tolist()),
        float_or_list(df["S_ig"].tolist()),
    )


def beu(Hs0, Tp, beta):
    pass
    if beta / (Hs0 / Lp) ** (0.5) <= 0.3:
        setup = 0.016 * (Hs0 * Lp) ** 0.5
        S_total = 0.046 * (Hs0 * Lp) ** 0.5
        R2 = 0.043 * (Hs0 * Lp) ** 0.5
    else:
        setup = 0.35 * beta * (Hs0 * Lp) ** 0.5
        S_total = np.sqrt(S_inc ** 2 + S_ig ** 2)
        R2 = 1.1 * (setup + S_total / 2)

    return R2, setup, S_total, S_inc, S_ig


def float_or_list(a):
def sto06(df, Hs0_col, Tp_col, beta_col):
    """
    If only one value in the array, return the float, else return a list
    :param a:
    Vectorized version of Stockdon06 which can be used with dataframes
    :param df:
    :param Hs0_col:
    :param Tp_col:
    :param beta_col:
    :return:
    """
    if len(a) == 1:
        return a[0]
    else:
        return list(a)

    Lp = 9.8 * df[Tp_col] ** 2 / 2 / np.pi

    # General equation
    S_ig = pd.to_numeric(0.06 * np.sqrt(df[Hs0_col] * Lp), errors='coerce')
    S_inc = pd.to_numeric(0.75 * df[beta_col] * np.sqrt(df[Hs0_col] * Lp), errors='coerce')
    setup = pd.to_numeric(0.35 * df[beta_col] * np.sqrt(df[Hs0_col] * Lp), errors='coerce')
    S_total = np.sqrt(S_inc ** 2 + S_ig ** 2)
    R2 = 1.1 * (setup + S_total / 2)

    # Dissipative conditions
    dissipative = df[beta_col] / (df[Hs0_col] / Lp) ** (0.5) <= 0.3
    setup.loc[dissipative] = 0.016 * (df[Hs0_col] * Lp) ** (0.5)  # eqn 16
    S_total.loc[dissipative] = 0.046 * (df[Hs0_col] * Lp) ** (0.5)  # eqn 17
    R2.loc[dissipative] = 0.043 * (df[Hs0_col] * Lp) ** (0.5)  # eqn 18

    return R2, setup, S_total, S_inc, S_ig


if __name__ == "__main__":
if __name__ == '__main__':
    pass
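For reference, the Stockdon et al. (2006) relationships that `sto06` vectorises can be exercised standalone. A minimal pure-Python sketch (the `stockdon_r2` helper and the example wave conditions are illustrative, not part of the repository):

```python
import math

def stockdon_r2(Hs0, Tp, beta):
    """2% runup exceedance level after Stockdon et al. (2006)."""
    Lp = 9.8 * Tp ** 2 / (2 * math.pi)         # deep-water wavelength
    if beta / math.sqrt(Hs0 / Lp) <= 0.3:      # dissipative beaches (eqn 18)
        return 0.043 * math.sqrt(Hs0 * Lp)
    setup = 0.35 * beta * math.sqrt(Hs0 * Lp)  # wave setup
    S_inc = 0.75 * beta * math.sqrt(Hs0 * Lp)  # incident swash
    S_ig = 0.06 * math.sqrt(Hs0 * Lp)          # infragravity swash
    S_total = math.sqrt(S_inc ** 2 + S_ig ** 2)
    return 1.1 * (setup + S_total / 2)

print(round(stockdon_r2(Hs0=6.0, Tp=12.0, beta=0.1), 2))  # → 3.35
```

The dataframe version in the file applies exactly these branches row-wise via the boolean `dissipative` mask.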
@ -1,37 +0,0 @@
"""
Entry point to run data processing and analysis commands.
"""

# Disable numpy warnings
import warnings

import click

import analysis.forecast_twl as forecast_twl
import analysis.forecasted_storm_impacts as forecasted_storm_impacts
import analysis.observed_storm_impacts as observed_storm_impacts
import data.csv_to_geojson as csv_to_geojson
import data.parse_mat as parse_mat

warnings.simplefilter(action="ignore", category=FutureWarning)


@click.group()
def cli():
    pass


if __name__ == "__main__":
    cli.add_command(csv_to_geojson.impacts_to_geojson)
    cli.add_command(csv_to_geojson.profile_features_crest_toes_to_geojson)
    cli.add_command(csv_to_geojson.R_high_to_geojson)
    cli.add_command(csv_to_geojson.sites_csv_to_geojson)
    cli.add_command(forecast_twl.create_twl_forecast)
    cli.add_command(forecasted_storm_impacts.create_forecasted_impacts)
    cli.add_command(observed_storm_impacts.create_observed_impacts)
    cli.add_command(parse_mat.create_crest_toes)
    cli.add_command(parse_mat.create_sites_and_profiles_csv)
    cli.add_command(parse_mat.create_tides_csv)
    cli.add_command(parse_mat.create_waves_csv)
    cli.add_command(parse_mat.create_grain_size_csv)
    cli()
@ -1,339 +0,0 @@
"""
Converts .csv files to .shape files
"""
import os
import numpy.ma as ma
import click
import fiona
import numpy as np
import pandas as pd
from fiona.crs import from_epsg
from shapely.geometry import Point, mapping, LineString
from collections import OrderedDict
from utils import crossings, convert_coord_systems
from logs import setup_logging

logger = setup_logging()


def lat_lon_from_profile_x_coord(
    center_lat_lon, orientation, center_profile_x, x_coord
):
    """
    Returns the lat/lon of a point on a profile with the given x_coord
    :param center_lat_lon: Shapely point of lat/lon of profile center
    :param orientation: Orientation of the profile (positive east, counterclockwise)
    :param center_profile_x: x value of the center of the profile
    :param x_coord: X coordinate of the point to get a lat lon from
    :return:
    """
    center_xy = convert_coord_systems(center_lat_lon)
    center_x, center_y = center_xy.xy

    point_x = center_x + (center_profile_x - x_coord) * np.cos(np.deg2rad(orientation))
    point_y = center_y + (center_profile_x - x_coord) * np.sin(np.deg2rad(orientation))
    point_xy = Point(point_x, point_y)
    point_lat_lon = convert_coord_systems(
        point_xy, in_coord_system="EPSG:28356", out_coord_system="EPSG:4326"
    )
    return point_lat_lon


@click.command()
@click.option("--sites-csv", required=True, help=".csv file to convert")
@click.option("--profiles-csv", required=True, help=".csv file to convert")
@click.option("--crest-toes-csv", required=True, help=".csv file to convert")
@click.option("--impacts-csv", required=True, help=".csv file to convert")
@click.option("--output-geojson", required=True, help="where to store .geojson file")
def R_high_to_geojson(
    sites_csv, profiles_csv, crest_toes_csv, impacts_csv, output_geojson
):
    """
    Converts impact R_high into a lat/lon geojson that we can plot in QGIS
    :param sites_csv:
    :param profiles_csv:
    :param impacts_csv:
    :param output_geojson:
    :return:
    """
    df_sites = pd.read_csv(sites_csv, index_col=[0])
    df_profiles = pd.read_csv(profiles_csv, index_col=[0, 1, 2])
    df_crest_toes = pd.read_csv(crest_toes_csv, index_col=[0, 1])
    df_impacts = pd.read_csv(impacts_csv, index_col=[0])

    # Create geojson file
    schema = {
        "geometry": "Point",
        "properties": OrderedDict(
            [("beach", "str"), ("site_id", "str"), ("elevation", "float")]
        ),
    }

    with fiona.open(
        output_geojson, "w", driver="GeoJSON", crs=from_epsg(4326), schema=schema
    ) as output:
        for index, row in df_impacts.iterrows():

            site_id = index
            beach = index[:-4]

            # Find lat/lon of R_high position
            R_high_z = row["R_high"]

            # Get prestorm profile
            df_profile = df_profiles.loc[(site_id, "prestorm")]
            int_x = crossings(
                df_profile.index.get_level_values("x").tolist(),
                df_profile.z.tolist(),
                R_high_z,
            )

            # Take the intersection closest to the dune face.
            try:
                x_cols = [x for x in df_crest_toes.columns if "_x" in x]
                dune_face_x = np.mean(
                    df_crest_toes.loc[(site_id, "prestorm"), x_cols].tolist()
                )
                int_x = min(int_x, key=lambda x: abs(x - dune_face_x))
            except:
                continue

            # Get lat/lon on intercept position
            site = df_sites.loc[site_id]
            point_lat_lon = lat_lon_from_profile_x_coord(
                center_lat_lon=Point(site["lon"], site["lat"]),
                orientation=site["orientation"],
                center_profile_x=site["profile_x_lat_lon"],
                x_coord=int_x,
            )

            prop = OrderedDict(
                [("beach", beach), ("site_id", site_id), ("elevation", R_high_z)]
            )
            output.write({"geometry": mapping(point_lat_lon), "properties": prop})


@click.command()
@click.option("--sites-csv", required=True, help=".csv file to convert")
@click.option("--profile-features-csv", required=True, help=".csv file to convert")
@click.option("--output-geojson", required=True, help="where to store .geojson file")
def profile_features_crest_toes_to_geojson(
    sites_csv, profile_features_csv, output_geojson
):
    """
    Converts profile_features containing dune toes and crest locations to a geojson we can load into QGIS
    :param sites_csv:
    :param profile_features_csv:
    :param output_geojson:
    :return:
    """
    logger.info("Creating profile features geojson")

    # Read files from interim folder
    df_sites = pd.read_csv(sites_csv, index_col=[0])
    df_profile_features = pd.read_csv(profile_features_csv, index_col=[0])

    # Create geojson file
    schema = {
        "geometry": "Point",
        "properties": OrderedDict(
            [
                ("beach", "str"),
                ("site_id", "str"),
                (
                    "point_type",
                    "str",
                ),  # prestorm_dune_toe, prestorm_dune_crest, poststorm_dune_toe, poststorm_dune_crest
                ("profile_type", "str"),
                ("elevation", "float"),
            ]
        ),
    }

    with fiona.open(
        output_geojson, "w", driver="GeoJSON", crs=from_epsg(4326), schema=schema
    ) as output:
        for index, row in df_profile_features.iterrows():
            beach = index[:-4]
            site_id = index
            profile_type = row["profile_type"]

            for point_type in ["crest", "toe"]:
                # point_type='crest'
                elevation = row["dune_{}_z".format(point_type)]
                x = row["dune_{}_x".format(point_type)]

                if np.isnan(x):
                    continue

                # Geojsons need to use 'null' instead of 'nan'
                if np.isnan(elevation):
                    elevation = None

                # Convert x position to lat/lon
                site = df_sites.loc[site_id]
                point_lat_lon = lat_lon_from_profile_x_coord(
                    center_lat_lon=Point(site["lon"], site["lat"]),
                    orientation=site["orientation"],
                    center_profile_x=site["profile_x_lat_lon"],
                    x_coord=x,
                )

                prop = OrderedDict(
                    [
                        ("beach", beach),
                        ("site_id", site_id),
                        ("point_type", point_type),
                        ("profile_type", profile_type),
                        ("elevation", elevation),
                    ]
                )
                output.write({"geometry": mapping(point_lat_lon), "properties": prop})


@click.command()
@click.option("--input-csv", required=True, help=".csv file to convert")
@click.option("--output-geojson", required=True, help="where to store .geojson file")
def sites_csv_to_geojson(input_csv, output_geojson):
    """
    Converts our dataframe of sites to .geojson to load in QGis. Sites are loaded as linestrings of the profile
    cross-sections
    :param input_csv:
    :param output_geojson:
    :return:
    """
    logger.info("Converting %s to %s", input_csv, output_geojson)
    df_sites = pd.read_csv(input_csv, index_col=[0])
    logger.info(os.environ.get("GDAL_DATA", None))

    schema = {
        "geometry": "LineString",
        "properties": OrderedDict([("beach", "str"), ("site_id", "str")]),
    }

    with fiona.open(
        output_geojson, "w", driver="GeoJSON", crs=from_epsg(4326), schema=schema
    ) as output:
        for index, row in df_sites.iterrows():

            center_lat_lon = Point(row["lon"], row["lat"])
            # Work out where landward profile limit is
            land_lat_lon = lat_lon_from_profile_x_coord(
                center_lat_lon=center_lat_lon,
                orientation=row["orientation"],
                center_profile_x=row["profile_x_lat_lon"],
                x_coord=0,
            )
            # Work out where seaward profile limit is
            sea_lat_lon = lat_lon_from_profile_x_coord(
                center_lat_lon=center_lat_lon,
                orientation=row["orientation"],
                center_profile_x=row["profile_x_lat_lon"],
                x_coord=2 * row["profile_x_lat_lon"],
            )

            line_string = LineString([land_lat_lon, center_lat_lon, sea_lat_lon])
            prop = OrderedDict([("beach", row["beach"]), ("site_id", index)])
            output.write({"geometry": mapping(line_string), "properties": prop})

    logger.info("Done!")


@click.command()
@click.option("--sites-csv", required=True, help="sites.csv file to convert")
@click.option(
    "--observed-impacts-csv", required=True, help="impacts-observed.csv file to convert"
)
@click.option(
    "--forecast-impacts-csv", required=True, help="impacts-forecast.csv file to convert"
)
@click.option("--output-geojson", required=True, help="where to store .geojson file")
def impacts_to_geojson(
    sites_csv, observed_impacts_csv, forecast_impacts_csv, output_geojson
):
    """
    Converts impacts observed and forecasted to a geojson for visualization in QGIS
    :param sites_csv:
    :param observed_impacts_csv:
    :param forecast_impacts_csv:
    :param output_geojson:
    :return:
    """

    # Get information from .csv and read into pandas dataframe
    df_sites = pd.read_csv(sites_csv, index_col=[0])
    df_observed = pd.read_csv(observed_impacts_csv, index_col=[0])
    df_forecast = pd.read_csv(forecast_impacts_csv, index_col=[0]).rename(
        {"storm_regime": "forecast_storm_regime"}
    )

    # Rename columns, so we can distinguish between forecast and observed
    df_observed = df_observed.rename(columns={"storm_regime": "observed_storm_regime"})
    df_forecast = df_forecast.rename(columns={"storm_regime": "forecast_storm_regime"})

    # Concat into one big dataframe
    df = pd.concat([df_sites, df_observed, df_forecast], sort=True, axis=1)

    # Make new column for accuracy of forecast. Use underpredict/correct/overpredict classes
    df.loc[
        df.observed_storm_regime == df.forecast_storm_regime, "forecast_accuracy"
    ] = "correct"

    # Observed/Forecasted/Class for each combination
    classes = [
        ("swash", "collision", "overpredict"),
        ("swash", "swash", "correct"),
        ("swash", "overwash", "overpredict"),
        ("collision", "swash", "underpredict"),
        ("collision", "collision", "correct"),
        ("collision", "overwash", "overpredict"),
        ("overwash", "swash", "underpredict"),
        ("overwash", "collision", "underpredict"),
        ("overwash", "overwash", "correct"),
    ]
    for c in classes:
        df.loc[
            (df.observed_storm_regime == c[0]) & (df.forecast_storm_regime == c[1]),
            "forecast_accuracy",
        ] = c[2]

    schema = {
        "geometry": "Point",
        "properties": OrderedDict(
            [
                ("beach", "str"),
                ("site_id", "str"),
                ("forecast_storm_regime", "str"),
                ("observed_storm_regime", "str"),
                ("forecast_accuracy", "str"),
            ]
        ),
    }

    with fiona.open(
        output_geojson, "w", driver="GeoJSON", crs=from_epsg(4326), schema=schema
    ) as output:
        for index, row in df.iterrows():

            # Locate the marker at the seaward end of the profile to avoid cluttering the coastline.
            # Work out where seaward profile limit is
            sea_lat_lon = lat_lon_from_profile_x_coord(
                center_lat_lon=Point(row["lon"], row["lat"]),
                orientation=row["orientation"],
                center_profile_x=row["profile_x_lat_lon"],
                x_coord=2 * row["profile_x_lat_lon"],
            )
            prop = OrderedDict(
                [
                    ("beach", row["beach"]),
                    ("site_id", index),
                    ("forecast_storm_regime", row["forecast_storm_regime"]),
                    ("observed_storm_regime", row["observed_storm_regime"]),
                    ("forecast_accuracy", row["forecast_accuracy"]),
                ]
            )

            output.write({"geometry": mapping(sea_lat_lon), "properties": prop})

    logger.info("Done!")
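The profile-to-map projection in `lat_lon_from_profile_x_coord` is plain trigonometry once coordinates are in a projected easting/northing system: walk `(center_profile_x - x_coord)` metres along the profile orientation. A dependency-free sketch (function name and sample values are illustrative only; the real script also reprojects between EPSG:28356 and EPSG:4326):

```python
import math

def point_from_profile_x(center_x, center_y, orientation_deg, center_profile_x, x_coord):
    # Orientation is measured towards land, in degrees anti-clockwise from east;
    # smaller x_coord values sit further landward of the profile center.
    d = center_profile_x - x_coord
    return (
        center_x + d * math.cos(math.radians(orientation_deg)),
        center_y + d * math.sin(math.radians(orientation_deg)),
    )

# A profile centered at the origin, oriented due north, with its center at x=200 m:
x, y = point_from_profile_x(0.0, 0.0, 90.0, 200.0, 0.0)
```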
@ -0,0 +1,48 @@
"""
Converts .csv files to .shape files
"""

import click
import fiona
import pandas as pd
from fiona.crs import from_epsg
from shapely.geometry import Point, mapping


@click.command()
@click.argument('input_csv')
@click.argument('output_shp')
def sites_csv_to_shp(input_csv, output_shp):
    """
    Converts our dataframe of sites to .shp to load in QGis
    :param input_csv:
    :param output_shp:
    :return:
    """
    df_sites = pd.read_csv(input_csv, index_col=[0])

    schema = {
        'geometry': 'Point',
        'properties': {
            'beach': 'str',
            'site_id': 'str'
        }
    }
    with fiona.open(output_shp, 'w', crs=from_epsg(4326), driver='ESRI Shapefile', schema=schema) as output:
        for index, row in df_sites.iterrows():
            point = Point(row['lon'], row['lat'])
            prop = {
                'beach': row['beach'],
                'site_id': index,
            }
            output.write({'geometry': mapping(point), 'properties': prop})


@click.group()
def cli():
    pass


if __name__ == '__main__':
    cli.add_command(sites_csv_to_shp)
    cli()
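For context, the feature records fiona writes in the scripts above are plain GeoJSON; the same structure can be built by hand with the stdlib (the coordinates and IDs below are made up for illustration):

```python
import json

# Hypothetical site record; coordinates and IDs are illustrative only
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [151.30, -33.72]},
    "properties": {"beach": "NARRA", "site_id": "NARRA0001"},
}
collection = {"type": "FeatureCollection", "features": [feature]}

# A .geojson file is just this dict serialised to JSON
assert json.loads(json.dumps(collection)) == collection
print(collection["features"][0]["properties"]["site_id"])  # → NARRA0001
```

fiona adds schema validation and CRS handling on top, which is why the scripts declare a `schema` dict and `from_epsg(4326)` rather than writing JSON directly.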
@ -0,0 +1,263 @@
"""
Converts raw .mat files into a flattened .csv structure which can be imported into python pandas.
"""

import logging.config
from datetime import datetime, timedelta

import pandas as pd
from mat4py import loadmat
import numpy as np

logging.config.fileConfig('./src/logging.conf', disable_existing_loggers=False)
logger = logging.getLogger(__name__)


def parse_orientations(orientations_mat):
    """
    Parses the raw orientations.mat file and returns a pandas dataframe. Note that orientations are the direction
    towards land measured in degrees anti-clockwise from east.
    :param orientations_mat:
    :return:
    """
    logger.info('Parsing %s', orientations_mat)
    mat_data = loadmat(orientations_mat)['output']
    rows = []
    for i in range(0, len(mat_data['beach'])):
        rows.append({
            'beach': mat_data['beach'][i],
            'orientation': mat_data['orientation'][i],
            'lat_center': mat_data['lat_center'][i],
            'lon_center': mat_data['lon_center'][i],
            'lat_land': mat_data['lat_land'][i],
            'lon_land': mat_data['lon_land'][i],
            'lat_sea': mat_data['lat_sea'][i],
            'lon_sea': mat_data['lon_sea'][i],
        })

    df = pd.DataFrame(rows)
    return df


def combine_sites_and_orientaions(df_sites, df_orientations):
    """
    Joins orientations onto the sites dataframe by beach/lat/lon.
    :param df_sites:
    :param df_orientations:
    :return:
    """
    df_merged_sites = df_sites.merge(
        df_orientations[['beach', 'lat_center', 'lon_center', 'orientation']],
        left_on=['beach', 'lat', 'lon'],
        right_on=['beach', 'lat_center', 'lon_center'])

    # Check that all our records have a unique site identifier
    n_unmatched = len(df_sites) - len(df_merged_sites)
    if n_unmatched > 0:
        logger.warning('Not all records (%d of %d) matched with an orientation', n_unmatched, len(df_sites))

    # Drop extra columns
    df_merged_sites = df_merged_sites.drop(columns=['lat_center', 'lon_center'])

    return df_merged_sites


def specify_lat_lon_profile_center(df_sites, x_val=200):
    """
    Specify which x-coordinate in the beach profile cross section the lat/lon corresponds to
    :param df_sites:
    :return:
    """
    df_sites['profile_x_lat_lon'] = x_val
    return df_sites


def parse_waves(waves_mat):
    """
    Parses the raw waves.mat file and returns a pandas dataframe
    :param waves_mat:
    :return:
    """
    logger.info('Parsing %s', waves_mat)
    mat_data = loadmat(waves_mat)['data']
    rows = []
    for i in range(0, len(mat_data['site'])):
        for j in range(0, len(mat_data['dates'][i])):
            rows.append({
                'beach': mat_data['site'][i],
                'lon': mat_data['lon'][i],
                'lat': mat_data['lat'][i],
                'datetime': matlab_datenum_to_datetime(mat_data['dates'][i][j][0]),
                'Hs': mat_data['H'][i][j][0],
                'Hs0': mat_data['Ho'][i][j][0],
                'Tp': mat_data['T'][i][j][0],
                'dir': mat_data['D'][i][j][0],
                'E': mat_data['E'][i][j][0],
                'P': mat_data['P'][i][j][0],
                'Exs': mat_data['Exs'][i][j][0],
                'Pxs': mat_data['Pxs'][i][j][0],
            })

    df = pd.DataFrame(rows)
    df['datetime'] = df['datetime'].dt.round('1s')
    return df


def parse_tides(tides_mat):
    """
    Parses the raw tides.mat file and returns a pandas dataframe
    :param tides_mat:
    :return:
    """
    logger.info('Parsing %s', tides_mat)
    mat_data = loadmat(tides_mat)['data']
    rows = []
    for i in range(0, len(mat_data['site'])):
        for j in range(0, len(mat_data['time'])):
            rows.append({
                'beach': mat_data['site'][i][0],
                'lon': mat_data['lons'][i][0],
                'lat': mat_data['lats'][i][0],
                'datetime': matlab_datenum_to_datetime(mat_data['time'][j][0]),
                'tide': mat_data['tide'][i][j]
            })

    df = pd.DataFrame(rows)
    df['datetime'] = df['datetime'].dt.round('1s')
    return df


def parse_profiles(profiles_mat):
    """
    Parses the raw profiles.mat file and returns a pandas dataframe
    :param profiles_mat:
    :return:
    """
    logger.info('Parsing %s', profiles_mat)
    mat_data = loadmat(profiles_mat)['data']
    rows = []
    for i in range(0, len(mat_data['site'])):
        for j in range(0, len(mat_data['pfx'][i])):
            for profile_type in ['prestorm', 'poststorm']:

                if profile_type == 'prestorm':
                    z = mat_data['pf1'][i][j][0]
                if profile_type == 'poststorm':
                    z = mat_data['pf2'][i][j][0]

                rows.append({
                    'beach': mat_data['site'][i],
                    'lon': mat_data['lon'][i],
                    'lat': mat_data['lat'][i],
                    'profile_type': profile_type,
                    'x': mat_data['pfx'][i][j][0],
                    'z': z,
                })

    df = pd.DataFrame(rows)
    return df


def remove_zeros(df_profiles):
    """
    When parsing the pre/post storm profiles, the end of some profiles have constant values of zero. Let's change
    these to NaNs for consistency. Didn't use pandas fillnan because 0 may still be a valid value.
    :param df_profiles:
    :return:
    """

    df_profiles = df_profiles.sort_index()
    groups = df_profiles.groupby(level=['site_id', 'profile_type'])
    for key, _ in groups:
        logger.debug('Removing zeros from {} profile at {}'.format(key[1], key[0]))
        idx_site = (df_profiles.index.get_level_values('site_id') == key[0]) & \
                   (df_profiles.index.get_level_values('profile_type') == key[1])
        df_profile = df_profiles[idx_site]
        x_last_ele = df_profile[df_profile.z != 0].index.get_level_values('x')[-1]
        df_profiles.loc[idx_site & (df_profiles.index.get_level_values('x') > x_last_ele), 'z'] = np.nan

    return df_profiles


def matlab_datenum_to_datetime(matlab_datenum):
    # https://stackoverflow.com/a/13965852
    return datetime.fromordinal(int(matlab_datenum)) + timedelta(days=matlab_datenum % 1) - timedelta(days=366)


def get_unique_sites(dfs, cols=['beach', 'lat', 'lon']):
    """
    Generates a dataframe of unique sites based on beach names, lats and lons. Creates a unique site ID for each.
    :param dfs:
    :param cols:
    :return:
    """

    rows = []
    df_all = pd.concat([df[cols] for df in dfs])
    beach_groups = df_all.groupby(['beach'])
    for beach_name, beach_group in beach_groups:
        site_groups = beach_group.groupby(['lat', 'lon'])
        siteNo = 1
        for site_name, site_group in site_groups:
            site = '{}{:04d}'.format(beach_name, siteNo)
            rows.append({'site_id': site,
                         'lat': site_name[0],
                         'lon': site_name[1],
                         'beach': beach_name})
            siteNo += 1

    df = pd.DataFrame(rows)

    return df


def replace_unique_sites(df, df_sites, cols=['beach', 'lat', 'lon']):
    """
    Replaces beach/lat/lon columns with the unique site_id
    :param df:
    :param df_sites:
    :return:
    """

    df_merged = df.merge(df_sites, on=cols)

    # Check that all our records have a unique site identifier
    n_unmatched = len(df) - len(df_merged)
    if n_unmatched > 0:
        logger.warning('Not all records (%d of %d) matched with a unique site', n_unmatched, len(df))

    df_merged = df_merged.drop(columns=cols)

    return df_merged


def main():
    df_waves = parse_waves(waves_mat='./data/raw/processed_shorelines/waves.mat')
    df_tides = parse_tides(tides_mat='./data/raw/processed_shorelines/tides.mat')
    df_profiles = parse_profiles(profiles_mat='./data/raw/processed_shorelines/profiles.mat')
    df_sites = get_unique_sites(dfs=[df_waves, df_tides, df_profiles])
    df_orientations = parse_orientations(orientations_mat='./data/raw/processed_shorelines/orientations.mat')

    logger.info('Identifying unique sites')
    df_waves = replace_unique_sites(df_waves, df_sites)
    df_tides = replace_unique_sites(df_tides, df_sites)
    df_profiles = replace_unique_sites(df_profiles, df_sites)

    logger.info('Combine orientations into sites')
    df_sites = combine_sites_and_orientaions(df_sites, df_orientations)
    df_sites = specify_lat_lon_profile_center(df_sites)

    logger.info('Setting pandas index')
    df_profiles.set_index(['site_id', 'profile_type', 'x'], inplace=True)
    df_waves.set_index(['site_id', 'datetime'], inplace=True)
    df_tides.set_index(['site_id', 'datetime'], inplace=True)
    df_sites.set_index(['site_id'], inplace=True)

    logger.info('Nanning profile zero elevations')
    df_profiles = remove_zeros(df_profiles)

    logger.info('Outputting .csv files')
    df_profiles.to_csv('./data/interim/profiles.csv')
    df_tides.to_csv('./data/interim/tides.csv')
    df_waves.to_csv('./data/interim/waves.csv')
    df_sites.to_csv('./data/interim/sites.csv')
    logger.info('Done!')


if __name__ == '__main__':
    main()
@ -1,565 +0,0 @@
"""
Converts raw .mat files into a flattened .csv structure which can be imported into python pandas.
"""

import math
from datetime import datetime, timedelta

import click
import numpy as np
import pandas as pd
from mat4py import loadmat
from shapely.geometry import Point

from utils import convert_coord_systems
from logs import setup_logging

logger = setup_logging()


def parse_crest_toes(df_raw_features, df_profiles):
    """
    Parses profile_features_chris_leaman.csv
    :param df_raw_features:
    :param df_profiles:
    :return:
    """

    # Put the raw features into the format expected by the rest of the analysis
    df_crest_toes = df_raw_features.reset_index().melt(
        id_vars=["site_id"],
        value_vars=[
            "prestorm_dune_crest_x",
            "prestorm_dune_toe_x",
            "poststorm_dune_crest_x",
            "poststorm_dune_toe_x",
        ],
    )
    df_crest_toes["profile_type"] = df_crest_toes.variable.str.extract(
        r"(prestorm|poststorm)"
    )
    df_crest_toes["point_type"] = df_crest_toes.variable.str.extract(
        r"(dune_crest_x|dune_toe_x)"
    )
    df_crest_toes = df_crest_toes.drop(columns=["variable"])
    df_crest_toes = df_crest_toes.sort_values("site_id")
    df_crest_toes = df_crest_toes.set_index(["site_id", "profile_type", "point_type"])
    df_crest_toes = df_crest_toes.unstack()
    df_crest_toes.columns = df_crest_toes.columns.droplevel()

    # Now calculate the corresponding z elevations for each of our x coordinates
    for site_id in df_crest_toes.index.get_level_values("site_id").unique():
        logger.info("Calculating dune toe/crest z elevations for {}".format(site_id))

        # Get the profile for this site
        idx = pd.IndexSlice
        df_profile = df_profiles.loc[idx[site_id, :, :], :]

        for param in ["prestorm", "poststorm"]:
            for loc in ["crest", "toe"]:

                # Get the x value so we can find the corresponding z value
                x_val = df_crest_toes.loc[(site_id, param), "dune_{}_x".format(loc)]

                if np.isnan(x_val):
                    df_crest_toes.loc[
                        (site_id, param), "dune_{}_z".format(loc)
                    ] = np.nan
                    continue

                # Try to get the value from the other profile if we return nan or an empty dataframe
                df_z = df_profile.loc[idx[site_id, param, x_val], :]
                if np.isnan(df_z.z):
                    if param == "prestorm":
                        new_param = "poststorm"
                    elif param == "poststorm":
                        new_param = "prestorm"
                    z_val = df_profile.loc[idx[site_id, new_param, x_val], :].z
                else:
                    z_val = df_z.z

                # Put the results back into the merged dataframe
                df_crest_toes.loc[(site_id, param), "dune_{}_z".format(loc)] = z_val

    return df_crest_toes


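The wide-to-long-to-wide reshaping above (`melt`, regex `extract`, `unstack`) can be illustrated on a toy dataframe; a minimal sketch, with a made-up site ID and x values:

```python
import pandas as pd

# Hypothetical raw features: one row per site, wide format
df_raw = pd.DataFrame(
    {
        "site_id": ["NARRA0001"],
        "prestorm_dune_toe_x": [180.0],
        "poststorm_dune_toe_x": [190.0],
    }
)

# Wide -> long: one row per (site, variable)
df_long = df_raw.melt(id_vars=["site_id"])
df_long["profile_type"] = df_long.variable.str.extract(r"(prestorm|poststorm)")
df_long = df_long.drop(columns=["variable"])

# Long -> wide again, now indexed by site with one column per profile_type
df_wide = df_long.set_index(["site_id", "profile_type"]).unstack()
```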
def parse_dune_crest_toes(df_sites, crest_mat, toe_mat):
    """
    Parses the raw dune crest and toe .mat files into a single dataframe of pre/post storm positions per site
    :param df_sites:
    :param crest_mat:
    :param toe_mat:
    :return:
    """
    logger.info("Parsing dune crests and toes")

    rows = []
    crest_data = loadmat(crest_mat)
    toe_data = loadmat(toe_mat)

    for n, _ in enumerate(crest_data["xc1"]):
        rows.extend(
            [
                {
                    "dune_crest_x": crest_data["xc1"][n],
                    "dune_crest_z": crest_data["zc1"][n],
                    "dune_toe_x": toe_data["xt1"][n],
                    "dune_toe_z": toe_data["zt1"][n],
                    "profile_type": "prestorm",
                    "site_no": n + 1,
                },
                {
                    "dune_crest_x": crest_data["xc2"][n],
                    "dune_crest_z": crest_data["zc2"][n],
                    "dune_toe_x": toe_data["xt2"][n],
                    "dune_toe_z": toe_data["zt2"][n],
                    "profile_type": "poststorm",
                    "site_no": n + 1,
                },
            ]
        )

    df_profile_features = pd.DataFrame(rows)

    # We want the site_id instead of the site_no, so merge in df_sites
    df_sites.reset_index(inplace=True)
    df_profile_features = df_sites[["site_no", "site_id"]].merge(
        df_profile_features, how="outer", on=["site_no"]
    )
    df_profile_features.drop(columns=["site_no"], inplace=True)
    df_profile_features.set_index(["site_id", "profile_type"], inplace=True)
    df_profile_features.sort_index(inplace=True)
    df_profile_features = df_profile_features.round(3)

    return df_profile_features


def parse_waves(waves_mat):
    """
    Parses the raw waves.mat file and returns a pandas dataframe
    :param waves_mat:
    :return:
    """
    logger.info("Parsing %s", waves_mat)
    mat_data = loadmat(waves_mat)["data"]
    rows = []
    for i in range(0, len(mat_data["site"])):
        for j in range(0, len(mat_data["dates"][i])):
            rows.append(
                {
                    "beach": mat_data["site"][i],
                    "lon": mat_data["lon"][i],
                    "lat": mat_data["lat"][i],
                    "datetime": matlab_datenum_to_datetime(mat_data["dates"][i][j][0]),
                    "Hs": mat_data["H"][i][j][0],
                    "Hs0": mat_data["Ho"][i][j][0],
                    "Tp": mat_data["T"][i][j][0],
                    "dir": mat_data["D"][i][j][0],
                    "E": mat_data["E"][i][j][0],
                    "P": mat_data["P"][i][j][0],
                    "Exs": mat_data["Exs"][i][j][0],
                    "Pxs": mat_data["Pxs"][i][j][0],
                    "Ecum": mat_data["Ecum"][i],
                    "Exscum": mat_data["Exscum"][i],
                    "Pcum": mat_data["Pcum"][i],
                    "Pxscum": mat_data["Pxscum"][i],
                }
            )

    df = pd.DataFrame(rows)
    df["datetime"] = df["datetime"].dt.round("1s")
    return df


def parse_tides(tides_mat):
    """
    Parses the raw tides.mat file and returns a pandas dataframe
    :param tides_mat:
    :return:
    """
    logger.info("Parsing %s", tides_mat)
    mat_data = loadmat(tides_mat)["data"]
    rows = []
    for i in range(0, len(mat_data["site"])):
        for j in range(0, len(mat_data["time"])):
            rows.append(
                {
                    "beach": mat_data["site"][i][0],
                    "lon": mat_data["lons"][i][0],
                    "lat": mat_data["lats"][i][0],
                    "datetime": matlab_datenum_to_datetime(mat_data["time"][j][0]),
                    "tide": mat_data["tide"][i][j],
                }
            )

    df = pd.DataFrame(rows)
    df["datetime"] = df["datetime"].dt.round("1s")
    return df


def parse_profiles_and_sites(profiles_mat):
    """
    Parses the raw profiles.mat file and returns pandas dataframes of profiles and sites
    :param profiles_mat:
    :return:
    """
    logger.info("Parsing %s", profiles_mat)
    mat_data = loadmat(profiles_mat)["data"]
    profile_rows = []
    site_rows = []
    site_counter = 0

    # Our z values can come from these columns, depending on the isgood flag.
    # Let's reorganise them into a list of lists
    z_names = ["Zpre", "Zpost", "Zrec1", "Zrec2", "Zrec3", "Zrec4"]
    z_cols = [mat_data[col] for col in z_names]
    z_sites = []
    for cols in zip(*z_cols):
        z_vals = []
        for z_vector in zip(*cols):
            z_vals.append([z[0] for z in z_vector])
        z_sites.append(z_vals)

    for i, site in enumerate(mat_data["site"]):
        logger.debug("Processing site {} of {}".format(i + 1, len(mat_data["site"])))

        # Give each site a unique id
        if len(site_rows) == 0 or site_rows[-1]["beach"] != site:
            site_counter = 1
        else:
            site_counter += 1
        site_id = "{}{:04d}".format(site, site_counter)

        # Initialize the latitude and longitude of the x=200 m point
        x_200_lat = np.nan
        x_200_lon = np.nan

        # Used to calculate the orientation of the profile
        orientation = {}

        for x, lat, lon, z_site, easting, northing in zip(
            mat_data["x"][i],
            mat_data["lats"][i],
            mat_data["lons"][i],
            z_sites[i],
            mat_data["eastings"][i],
            mat_data["northings"][i],
        ):

            profile_type = None
            for j, is_good in enumerate([1] + mat_data["isgood"][i]):

                # Assume the first profile is always good and is the prestorm profile
                if j == 0:
                    profile_type = "prestorm"
                    z = z_site[j]
                    land_lim = np.nan

                # Skip bad profiles
                elif is_good == 0:
                    continue

                # Take the first isgood profile as the post-storm profile
                else:
                    profile_type = "poststorm"
                    z = z_site[j]
                    land_lim = mat_data["landlims"][i][j]

                survey_datetime = matlab_datenum_to_datetime(
                    mat_data["surveydates"][i][j]
                )

                # Keep a record of where the center of the profile is located, and the locations of the land
                # and sea

                # TODO: This code isn't very transferable. What if we don't have lat/lons at 200 m? Re-look at this
                if x[0] == 200:
                    x_200_lat = lat[0]
                    x_200_lon = lon[0]
                elif x[0] == 0:
                    orientation["land_easting"] = easting[0]
                    orientation["land_northing"] = northing[0]
                elif x[0] == 400:
                    orientation["sea_easting"] = easting[0]
                    orientation["sea_northing"] = northing[0]

                profile_rows.append(
                    {
                        "site_id": site_id,
                        "lon": lon[0],
                        "lat": lat[0],
                        "profile_type": profile_type,
                        "x": x[0],
                        "z": z,
                        "land_lim": land_lim,
                        "survey_datetime": survey_datetime,
                    }
                )

                # Stop looking at profiles once we've got our post-storm profile
                if profile_type == "poststorm":
                    break

        orientation = math.degrees(
            math.atan2(
                orientation["land_northing"] - orientation["sea_northing"],
                orientation["land_easting"] - orientation["sea_easting"],
            )
        )
        site_rows.append(
            {
                "site_id": site_id,
                "site_no": i + 1,
                "beach": site,
                "lat": x_200_lat,
                "lon": x_200_lon,
                "orientation": orientation,
                "profile_x_lat_lon": 200,
            }
        )

    df_profiles = pd.DataFrame(profile_rows)
    df_sites = pd.DataFrame(site_rows)

    logger.info("Parsed profiles and sites")
    return df_profiles, df_sites


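The site `orientation` computed above reduces to a single `atan2` call on the landward (x=0 m) and seaward (x=400 m) survey points; a self-contained sketch with made-up eastings and northings for a profile running due east-west, land to the west:

```python
import math

# Hypothetical survey points: the landward (x=0 m) and seaward (x=400 m) ends of a profile
land_easting, land_northing = 342000.0, 6266400.0
sea_easting, sea_northing = 342400.0, 6266400.0

# Angle of the land direction, anticlockwise positive from east, in degrees
orientation = math.degrees(
    math.atan2(land_northing - sea_northing, land_easting - sea_easting)
)
```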
def remove_zeros(df_profiles):
    """
    When parsing the pre/post storm profiles, the ends of some profiles have constant values of zero. Change
    these to NaNs for consistency. We don't use pandas fillna here because 0 may still be a valid elevation.
    :param df_profiles:
    :return:
    """

    logger.info("Removing zeros from end of profiles")
    df_profiles = df_profiles.sort_index()
    groups = df_profiles.groupby(level=["site_id", "profile_type"])
    for key, _ in groups:
        logger.debug("Removing zeros from {} profile at {}".format(key[1], key[0]))
        idx_site = (df_profiles.index.get_level_values("site_id") == key[0]) & (
            df_profiles.index.get_level_values("profile_type") == key[1]
        )
        df_profile = df_profiles[idx_site]
        x_last_ele = df_profile[df_profile.z == 0].index.get_level_values("x")[0]
        df_profiles.loc[
            idx_site & (df_profiles.index.get_level_values("x") > x_last_ele), "z"
        ] = np.nan
    logger.info("Removed zeros from end of profiles")

    return df_profiles


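The core of `remove_zeros` (find the first zero elevation, then NaN everything seaward of it) can be sketched on a single flat profile; the chainages and elevations below are made up for illustration:

```python
import numpy as np
import pandas as pd

# Hypothetical profile: elevations flat-line at zero from x=3 onwards
df = pd.DataFrame({"x": [0, 1, 2, 3, 4, 5], "z": [3.2, 2.1, 1.0, 0.0, 0.0, 0.0]})

# Find the first chainage where z == 0, then NaN all points seaward of it
x_first_zero = df.loc[df.z == 0, "x"].iloc[0]
df.loc[df.x > x_first_zero, "z"] = np.nan
```

Note the first zero itself is kept, matching the `>` comparison in `remove_zeros`.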
def matlab_datenum_to_datetime(matlab_datenum):
    """
    Converts a MATLAB datenum to a python datetime. Adapted from https://stackoverflow.com/a/13965852
    :param matlab_datenum:
    :return:
    """
    return (
        datetime.fromordinal(int(matlab_datenum))
        + timedelta(days=matlab_datenum % 1)
        - timedelta(days=366)
    )


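The 366-day offset accounts for MATLAB's datenum epoch (day 1 is 1 January of year 0) versus Python's proleptic ordinal (day 1 is 1 January of year 1); a standalone sanity check, where the datenum below corresponds to midday on 5 June 2016, during the storm studied here:

```python
from datetime import datetime, timedelta


def matlab_datenum_to_datetime(matlab_datenum):
    # Same conversion as above: integer part -> ordinal date, fractional part -> time of day
    return (
        datetime.fromordinal(int(matlab_datenum))
        + timedelta(days=matlab_datenum % 1)
        - timedelta(days=366)
    )


converted = matlab_datenum_to_datetime(736486.5)
```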
def replace_unique_sites(df, df_sites):
    """
    Replaces the beach/lat/lon columns with the site_id of the closest unique site
    :param df:
    :param df_sites:
    :return:
    """
    # Make the sites index a column, so it can be merged into df
    df_sites["site_id"] = df_sites.index.get_level_values("site_id")

    # Create eastings and northings so we can calculate distances
    site_points = [
        convert_coord_systems(Point(lon, lat)).xy
        for lon, lat in zip(df_sites["lon"], df_sites["lat"])
    ]
    df_sites["easting"] = [x[0][0] for x in site_points]
    df_sites["northing"] = [x[1][0] for x in site_points]

    # Process each unique lat/lon combination as a group
    groups = df.groupby(["lat", "lon"])
    for (lat, lon), df_group in groups:

        # Calculate the distance from this point to each site and determine the closest site
        easting, northing = [x[0] for x in convert_coord_systems(Point(lon, lat)).xy]
        distances_to_sites = np.sqrt(
            (df_sites["easting"] - easting) ** 2
            + (df_sites["northing"] - northing) ** 2
        )
        min_distance = distances_to_sites.min()
        closest_site = distances_to_sites.idxmin()

        # Do some logging so we can check later. Warn if the closest site is more than 1 m away.
        if min_distance > 1:
            logger.warning(
                "Closest site to (%.4f,%.4f) is %s (%.2f m away)",
                lat,
                lon,
                closest_site,
                min_distance,
            )
        else:
            logger.info(
                "Closest site to (%.4f,%.4f) is %s (%.2f m away)",
                lat,
                lon,
                closest_site,
                min_distance,
            )

        # Assign site_id based on the closest site
        df.loc[df_group.index, "site_id"] = closest_site

    nan_count = df.site_id.isna().sum()
    if nan_count > 0:
        logger.warning(
            "Not all records matched with a unique site (%d of %d unmatched)",
            nan_count,
            len(df),
        )

    df = df.drop(columns=["lat", "lon", "beach"])

    return df


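The nearest-site matching above is a plain Euclidean `idxmin` over projected coordinates; a minimal sketch with invented site IDs and coordinates:

```python
import numpy as np
import pandas as pd

# Hypothetical site table in projected coordinates (metres)
df_sites = pd.DataFrame(
    {"easting": [340000.0, 340100.0], "northing": [6260000.0, 6260200.0]},
    index=["NARRA0001", "NARRA0002"],
)

# A record located 3 m east and 4 m north of the first site
easting, northing = 340003.0, 6260004.0
distances_to_sites = np.sqrt(
    (df_sites["easting"] - easting) ** 2 + (df_sites["northing"] - northing) ** 2
)
closest_site = distances_to_sites.idxmin()
min_distance = distances_to_sites.min()
```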
def split_site_wave_params(df_waves):
    """
    When we parse the waves.mat file, the cumulative wave energy and power properties are repeated for every time
    step. This is unnecessary, so extract them from the waves dataframe and put them in their own separate
    per-site dataframe.
    :param df_waves:
    :return:
    """
    cols_to_extract = ["Ecum", "Exscum", "Pcum", "Pxscum"]

    df_sites_waves = df_waves.loc[:, cols_to_extract].groupby(["site_id"]).first()
    df_waves = df_waves.drop(columns=cols_to_extract, errors="ignore")
    return df_waves, df_sites_waves


@click.command(short_help="create waves.csv")
|
||||
@click.option("--waves-mat", required=True, help=".mat file containing wave records")
|
||||
@click.option(
|
||||
"--sites-csv", required=True, help=".csv file description of cross section sites"
|
||||
)
|
||||
@click.option("--waves-output-file", required=True, help="where to save waves.csv")
|
||||
@click.option(
|
||||
"--sites-waves-output-file", required=True, help="where to save sites_waves.csv"
|
||||
)
|
||||
def create_waves_csv(waves_mat, sites_csv, waves_output_file, sites_waves_output_file):
|
||||
logger.info("Creating %s", waves_output_file)
|
||||
df_waves = parse_waves(waves_mat=waves_mat)
|
||||
df_sites = pd.read_csv(sites_csv, index_col=[0])
|
||||
df_waves = replace_unique_sites(df_waves, df_sites)
|
||||
df_waves.set_index(["site_id", "datetime"], inplace=True)
|
||||
df_waves.sort_index(inplace=True)
|
||||
df_waves, df_sites_waves = split_site_wave_params(df_waves)
|
||||
df_waves.to_csv(waves_output_file, float_format="%.4f")
|
||||
df_sites_waves.to_csv(sites_waves_output_file, float_format="%.4f")
|
||||
logger.info("Created %s", waves_output_file)
|
||||
logger.info("Created %s", sites_waves_output_file)
|
||||
|
||||
|
||||
# @click.command(short_help="create profile_features.csv")
|
||||
# @click.option("--crest-mat", required=True, help=".mat file containing wave records")
|
||||
# @click.option("--toe-mat", required=True, help=".mat file containing wave records")
|
||||
# @click.option("--sites-csv", required=True, help=".csv file description of cross section sites")
|
||||
# @click.option("--output-file", required=True, help="where to save waves.csv")
|
||||
# def create_profile_features(crest_mat, toe_mat, sites_csv, output_file):
|
||||
# logger.info("Creating %s", output_file)
|
||||
# df_sites = pd.read_csv(sites_csv, index_col=[0])
|
||||
# df_profile_features = parse_dune_crest_toes(df_sites, crest_mat, toe_mat)
|
||||
# df_profile_features.to_csv(output_file)
|
||||
# logger.info("Created %s", output_file)
|
||||
|
||||
|
||||
@click.command(short_help="create profile_features.csv")
|
||||
@click.option(
|
||||
"--profile-features-csv", required=True, help=".mat file containing wave records"
|
||||
)
|
||||
@click.option("--profiles-csv", required=True, help=".mat file containing wave records")
|
||||
@click.option("--output-file", required=True, help="where to save waves.csv")
|
||||
def create_crest_toes(profile_features_csv, profiles_csv, output_file):
|
||||
logger.info("Creating %s", output_file)
|
||||
|
||||
df_raw_features = pd.read_csv(profile_features_csv, index_col=[0])
|
||||
df_profiles = pd.read_csv(profiles_csv, index_col=[0, 1, 2])
|
||||
df_crest_toes = parse_crest_toes(df_raw_features, df_profiles)
|
||||
|
||||
df_crest_toes.to_csv(output_file, float_format="%.3f")
|
||||
logger.info("Created %s", output_file)
|
||||
|
||||
|
||||
@click.command(short_help="create profiles.csv")
|
||||
@click.option(
|
||||
"--profiles-mat", required=True, help=".mat file containing beach profiles"
|
||||
)
|
||||
@click.option(
|
||||
"--profiles-output-file", required=True, help="where to save profiles.csv"
|
||||
)
|
||||
@click.option("--sites-output-file", required=True, help="where to save sites.csv")
|
||||
def create_sites_and_profiles_csv(
|
||||
profiles_mat, profiles_output_file, sites_output_file
|
||||
):
|
||||
logger.info("Creating sites and profiles csvs")
|
||||
df_profiles, df_sites = parse_profiles_and_sites(profiles_mat=profiles_mat)
|
||||
df_profiles.set_index(["site_id", "profile_type", "x"], inplace=True)
|
||||
df_profiles.sort_index(inplace=True)
|
||||
df_profiles = remove_zeros(df_profiles)
|
||||
|
||||
df_sites.set_index(["site_id"], inplace=True)
|
||||
df_sites.sort_index(inplace=True)
|
||||
|
||||
df_profiles.to_csv(profiles_output_file, float_format="%.8f")
|
||||
logger.info("Created %s", profiles_output_file)
|
||||
df_sites.to_csv(sites_output_file, float_format="%.8f")
|
||||
logger.info("Created %s", sites_output_file)
|
||||
|
||||
|
||||
@click.command(short_help="create profiles.csv")
|
||||
@click.option("--tides-mat", required=True, help=".mat file containing tides")
|
||||
@click.option(
|
||||
"--sites-csv", required=True, help=".csv file description of cross section sites"
|
||||
)
|
||||
@click.option("--output-file", required=True, help="where to save tides.csv")
|
||||
def create_tides_csv(tides_mat, sites_csv, output_file):
|
||||
logger.info("Creating %s", output_file)
|
||||
df_tides = parse_tides(tides_mat=tides_mat)
|
||||
df_sites = pd.read_csv(sites_csv, index_col=[0])
|
||||
df_tides = replace_unique_sites(df_tides, df_sites)
|
||||
df_tides.set_index(["site_id", "datetime"], inplace=True)
|
||||
df_tides.sort_index(inplace=True)
|
||||
df_tides.to_csv(output_file, float_format="%.4f")
|
||||
logger.info("Created %s", output_file)
|
||||
|
||||
|
||||
@click.command(short_help="create sites_grain_size.csv")
|
||||
@click.option(
|
||||
"--grain-size-csv",
|
||||
required=True,
|
||||
help=".csv file description of cross section sites",
|
||||
)
|
||||
@click.option("--output-file", required=True, help="where to save sites_grain_size.csv")
|
||||
def create_grain_size_csv(grain_size_csv, output_file):
|
||||
logger.info("Creating %s", output_file)
|
||||
df_sites_grain_size = pd.read_csv(grain_size_csv, index_col=[0])
|
||||
|
||||
# Calculate roughness, refer to Power et al (2018)
|
||||
df_sites_grain_size["r"] = 2.5 * df_sites_grain_size["d_50"]
|
||||
df_sites_grain_size.to_csv(output_file)
|
||||
logger.info("Created %s", output_file)
|
||||
|
||||
|
||||
@click.group()
def cli():
    pass


if __name__ == "__main__":
    cli.add_command(create_waves_csv)
    cli.add_command(create_sites_and_profiles_csv)
    cli.add_command(create_tides_csv)
    cli.add_command(create_crest_toes)
    cli.add_command(create_grain_size_csv)
    cli()
@ -1,168 +0,0 @@
"""
This file can probably be removed since we are not reading from shp files any more
"""

import click
import fiona
import numpy as np
import pandas as pd
from shapely.geometry import LineString, Point
from shapely.geometry import shape

from logs import setup_logging
from utils import convert_coord_systems

logger = setup_logging()


def shapes_from_shp(shp_file):
    """
    Parses a shape file and returns a list of shapely shapes, ids and properties
    :param shp_file:
    :return:
    """
    shapes = []
    ids = []
    properties = []
    for feat in fiona.open(shp_file, "r"):
        shapes.append(shape(feat["geometry"]))
        ids.append(feat["id"])
        properties.append(feat["properties"])
    return shapes, ids, properties


def distance_to_intersection(
    lat, lon, landward_orientation, beach, line_strings, line_properties
):
    """
    Returns the distance at which a line drawn from a lat/lon at a given orientation intersects a line string
    :param lat:
    :param lon:
    :param landward_orientation: Angle towards the land, anticlockwise positive from east in degrees
    :param beach:
    :param line_strings:
    :param line_properties:
    :return:
    """
    start_point = Point(lon, lat)
    start_point = convert_coord_systems(start_point)

    distance = 1000  # m, look up to 1000 m for an intersection
    landward_point = Point(
        start_point.coords.xy[0] + distance * np.cos(np.deg2rad(landward_orientation)),
        start_point.coords.xy[1] + distance * np.sin(np.deg2rad(landward_orientation)),
    )
    landward_line = LineString([start_point, landward_point])
    seaward_point = Point(
        start_point.coords.xy[0] - distance * np.cos(np.deg2rad(landward_orientation)),
        start_point.coords.xy[1] - distance * np.sin(np.deg2rad(landward_orientation)),
    )
    seaward_line = LineString([start_point, seaward_point])

    # Only look at line_strings which have the same beach property, to reduce computation time
    line_strings = [
        s for s, p in zip(line_strings, line_properties) if p["beach"] == beach
    ]

    # Check whether the profile line intersects with any lines in line_strings. If the intersection point is
    # landwards, consider the distance negative; seawards is positive.
    for line_string in line_strings:
        land_intersect_points = landward_line.intersection(line_string)
        if not land_intersect_points.is_empty:
            return -land_intersect_points.distance(start_point)

        sea_intersect_points = seaward_line.intersection(line_string)
        if not sea_intersect_points.is_empty:
            return sea_intersect_points.distance(start_point)

    # If no intersections are found, return nothing.
    return None


def beach_profile_elevation(x_coord, df_profiles, profile_type, site_id):
    """
    Returns the beach profile elevation at a particular x-coordinate
    :param x_coord:
    :param df_profiles:
    :param profile_type: "prestorm" or "poststorm"
    :param site_id:
    :return:
    """

    if np.isnan(x_coord):
        return None

    # Get the profile for this site
    df_profile = df_profiles.query(
        'profile_type == "{}" and site_id =="{}"'.format(profile_type, site_id)
    )
    return np.interp(x_coord, df_profile.index.get_level_values("x"), df_profile["z"])


def parse_profile_features(df_sites, df_profiles, dune_crest_shp, dune_toe_shp):
    """
    Reads dune crest and toe files and creates a pandas dataframe with crest/toe locations at each site
    :return:
    """

    # Get site information. Base our profile features on each site
    df_profile_features = df_sites

    features = {
        "dune_crest": {"file": dune_crest_shp},
        "dune_toe": {"file": dune_toe_shp},
    }

    # Import our dune crests and toes
    for feat in features.keys():
        shapes, _, properties = shapes_from_shp(features[feat]["file"])
        shapes = [convert_coord_systems(x) for x in shapes]

        # Figure out the x coordinates of our crests and toes, by looking at where our beach sections intersect
        # our shape files.
        col_name = "{}_x".format(feat)
        df_profile_features[col_name] = df_profile_features[
            "profile_x_lat_lon"
        ] + df_profile_features.apply(
            lambda row: distance_to_intersection(
                row["lat"],
                row["lon"],
                row["orientation"],
                row["beach"],
                shapes,
                properties,
            ),
            axis=1,
        )
        # Get the elevations of the crest and toe
        col_name = "{}_z".format(feat)
        df_profile_features[col_name] = df_profile_features.apply(
            lambda row: beach_profile_elevation(
                row["{}_x".format(feat)], df_profiles, "prestorm", row.name
            ),
            axis=1,
        )

    df_profile_features = df_profile_features.drop(
        columns=["beach", "lat", "lon", "orientation", "profile_x_lat_lon"]
    )
    return df_profile_features


@click.command(short_help="create .csv of dune toe and crest positions")
|
||||
@click.option("--dune-crest-shp", required=True, help=".csv file to convert")
|
||||
@click.option("--dune-toe-shp", required=True, help="where to store .shp file")
|
||||
@click.option("--sites-csv", required=True, help="where to store .shp file")
|
||||
@click.option("--profiles-csv", required=True, help="where to store .shp file")
|
||||
@click.option("--output-csv", required=True, help="where to store .shp file")
|
||||
def create_profile_features(
|
||||
dune_crest_shp, dune_toe_shp, sites_csv, profiles_csv, output_csv
|
||||
):
|
||||
logger.info("Creating .csv of dune crests and toes")
|
||||
df_sites = pd.read_csv(sites_csv, index_col=[0])
|
||||
df_profiles = pd.read_csv(profiles_csv, index_col=[0, 1, 2])
|
||||
df_profile_features = parse_profile_features(
|
||||
df_sites, df_profiles, dune_crest_shp, dune_toe_shp
|
||||
)
|
||||
df_profile_features.to_csv(output_csv)
|
||||
logger.info("Done!")
|
@ -0,0 +1,161 @@
import os
from functools import partial

import fiona
import numpy as np
import pandas as pd
import pyproj
from shapely.geometry import LineString, Point
from shapely.geometry import shape
from shapely.ops import transform


def shapes_from_shp(shp_file):
    """
    Parses a shape file and returns a list of shapely shapes, ids and properties
    :param shp_file:
    :return:
    """
    shapes = []
    ids = []
    properties = []
    for feat in fiona.open(shp_file, 'r'):
        shapes.append(shape(feat['geometry']))
        ids.append(feat['id'])
        properties.append(feat['properties'])
    return shapes, ids, properties


def convert_coord_systems(g1, in_coord_system='EPSG:4326', out_coord_system='EPSG:28356'):
    """
    Converts a geometry from one coordinate system to another. Needed because shapefiles are usually defined in
    lat/lon but should be converted to a projected system to calculate distances in metres.
    https://gis.stackexchange.com/a/127432
    :param g1: shapely geometry to convert
    :param in_coord_system: Default is lat/lon WGS84
    :param out_coord_system: Default is GDA94 / MGA Zone 56 for the NSW coastline
    :return:
    """
    project = partial(
        pyproj.transform,
        pyproj.Proj(init=in_coord_system),  # source coordinate system
        pyproj.Proj(init=out_coord_system))  # destination coordinate system

    g2 = transform(project, g1)  # apply projection
    return g2


def distance_to_intersection(lat, lon, landward_orientation, beach, line_strings, line_properties):
    """
    Returns the distance at which a line drawn from a lat/lon at a given orientation intersects a line string
    :param lat:
    :param lon:
    :param landward_orientation: Angle towards the land, anticlockwise positive from east in degrees
    :param beach:
    :param line_strings:
    :param line_properties:
    :return:
    """
    start_point = Point(lon, lat)
    start_point = convert_coord_systems(start_point)

    distance = 1000  # m, look up to 1000 m for an intersection
    landward_point = Point(start_point.coords.xy[0] + distance * np.cos(np.deg2rad(landward_orientation)),
                           start_point.coords.xy[1] + distance * np.sin(np.deg2rad(landward_orientation)))
    landward_line = LineString([start_point, landward_point])
    seaward_point = Point(start_point.coords.xy[0] - distance * np.cos(np.deg2rad(landward_orientation)),
                          start_point.coords.xy[1] - distance * np.sin(np.deg2rad(landward_orientation)))
    seaward_line = LineString([start_point, seaward_point])

    # Only look at line_strings which have the same beach property, to reduce computation time
    line_strings = [s for s, p in zip(line_strings, line_properties) if p['beach'] == beach]

    # Check whether the profile line intersects with any lines in line_strings. If the intersection point is
    # landwards, consider the distance negative; seawards is positive.
    for line_string in line_strings:
        land_intersect_points = landward_line.intersection(line_string)
        if not land_intersect_points.is_empty:
            return -land_intersect_points.distance(start_point)

        sea_intersect_points = seaward_line.intersection(line_string)
        if not sea_intersect_points.is_empty:
            return sea_intersect_points.distance(start_point)

    # If no intersections are found, return nothing.
    return None


def beach_profile_elevation(x_coord, df_profiles, profile_type, site_id):
    """
    Returns the beach profile elevation at a particular x-coordinate
    :param x_coord:
    :param df_profiles:
    :param profile_type: "prestorm" or "poststorm"
    :param site_id:
    :return:
    """

    if np.isnan(x_coord):
        return None

    # Get the profile for this site
    df_profile = df_profiles.query('profile_type == "{}" and site_id =="{}"'.format(profile_type, site_id))
    return np.interp(x_coord, df_profile.index.get_level_values('x'), df_profile['z'])


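`beach_profile_elevation` is essentially a thin wrapper around `np.interp`; a self-contained sketch with made-up chainages and elevations:

```python
import numpy as np

# Hypothetical cross-shore chainages (m) and elevations (m AHD)
x = [0.0, 10.0, 20.0]
z = [5.0, 3.0, 1.0]

# Linearly interpolate the elevation at x = 15 m
z_15 = np.interp(15.0, x, z)
```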
def parse_profile_features(df_sites, df_profiles, dune_crest_shp, dune_toe_shp):
    """
    Reads dune crest and toe files and creates a pandas dataframe with crest/toe locations at each site
    :return:
    """

    # Get site information. Base our profile features on each site
    df_profile_features = df_sites

    features = {
        'dune_crest': {'file': dune_crest_shp},
        'dune_toe': {'file': dune_toe_shp},
    }

    # Import our dune crests and toes
    for feat in features.keys():
        shapes, _, properties = shapes_from_shp(features[feat]['file'])
        shapes = [convert_coord_systems(x) for x in shapes]

        # Figure out the x coordinates of our crests and toes, by looking at where our beach sections intersect
        # our shape files.
        col_name = '{}_x'.format(feat)
        df_profile_features[col_name] = df_profile_features['profile_x_lat_lon'] + \
            df_profile_features.apply(
                lambda row: distance_to_intersection(
                    row['lat'], row['lon'], row['orientation'],
                    row['beach'], shapes, properties),
                axis=1)

        # Get the elevations of the crest and toe
        col_name = '{}_z'.format(feat)
        df_profile_features[col_name] = df_profile_features.apply(
            lambda row: beach_profile_elevation(
                row['{}_x'.format(feat)], df_profiles, 'prestorm', row.name),
            axis=1)

    df_profile_features = df_profile_features.drop(columns=['beach', 'lat', 'lon', 'orientation'])
    return df_profile_features


if __name__ == '__main__':
|
||||
data_folder = './data/interim'
|
||||
df_sites = pd.read_csv(os.path.join(data_folder, 'sites.csv'), index_col=[0])
|
||||
df_profiles = pd.read_csv(os.path.join(data_folder, 'profiles.csv'), index_col=[0, 1, 2])
|
||||
|
||||
dune_crest_shp = './data/raw/profile_features/dune_crests.shp'
|
||||
dune_toe_shp = './data/raw/profile_features/dune_toes.shp'
|
||||
df_profile_features = parse_profile_features(df_sites, df_profiles, dune_crest_shp, dune_toe_shp)
|
||||
df_profile_features.to_csv('./data/interim/profile_features.csv')
|
@ -0,0 +1,27 @@
[loggers]
keys=root, matplotlib

[handlers]
keys=consoleHandler

[formatters]
keys=simpleFormatter

[logger_root]
level=DEBUG
handlers=consoleHandler

[logger_matplotlib]
level=WARNING
handlers=consoleHandler
qualname=matplotlib

[handler_consoleHandler]
class=StreamHandler
level=DEBUG
formatter=simpleFormatter
args=(sys.stdout,)

[formatter_simpleFormatter]
format=%(asctime)s %(name)-17s %(levelname)-8s %(message)s
datefmt=%a, %d %b %Y %H:%M:%S
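This INI-style config can be loaded with the standard library's `logging.config.fileConfig`. A self-contained sketch that writes the same config to a temporary file first (the file name `logging.cfg` is assumed, not stated in the repo):

```python
import logging
import logging.config
import os
import tempfile
import textwrap

CFG = textwrap.dedent("""\
    [loggers]
    keys=root, matplotlib

    [handlers]
    keys=consoleHandler

    [formatters]
    keys=simpleFormatter

    [logger_root]
    level=DEBUG
    handlers=consoleHandler

    [logger_matplotlib]
    level=WARNING
    handlers=consoleHandler
    qualname=matplotlib

    [handler_consoleHandler]
    class=StreamHandler
    level=DEBUG
    formatter=simpleFormatter
    args=(sys.stdout,)

    [formatter_simpleFormatter]
    format=%(asctime)s %(name)-17s %(levelname)-8s %(message)s
    datefmt=%a, %d %b %Y %H:%M:%S
    """)

path = os.path.join(tempfile.mkdtemp(), "logging.cfg")
with open(path, "w") as f:
    f.write(CFG)

logging.config.fileConfig(path, disable_existing_loggers=False)

# matplotlib chatter is silenced below WARNING; everything else logs at DEBUG.
logging.getLogger("matplotlib").debug("this is suppressed")
logging.getLogger(__name__).info("processing profiles")
```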
@ -1,51 +0,0 @@
---
version: 1
disable_existing_loggers: False
formatters:
  simple:
    format: "[%(asctime)s] [%(filename)15.15s:%(lineno)4.4s %(funcName)15.15s] [%(levelname)-4.4s] %(message)s"
    datefmt: "%Y-%m-%d %H:%M:%S"

handlers:
  console:
    class: logging.StreamHandler
    level: INFO
    formatter: simple
    stream: ext://sys.stdout

  info_file_handler:
    class: logging.handlers.RotatingFileHandler
    level: INFO
    formatter: simple
    filename: info.log
    maxBytes: 10485760 # 10MB
    backupCount: 3
    encoding: utf8

  warning_file_handler:
    class: logging.handlers.RotatingFileHandler
    level: WARNING
    formatter: simple
    filename: warning.log
    maxBytes: 10485760 # 10MB
    backupCount: 3
    encoding: utf8

  error_file_handler:
    class: logging.handlers.RotatingFileHandler
    level: ERROR
    formatter: simple
    filename: error.log
    maxBytes: 10485760 # 10MB
    backupCount: 3
    encoding: utf8

loggers:
  my_module:
    level: ERROR
    handlers: [console]
    propagate: no

root:
  level: DEBUG
  handlers: [console, info_file_handler, error_file_handler, warning_file_handler]
@ -1,17 +0,0 @@
import logging.config
import os

import yaml


def setup_logging(path="./src/logging.yaml", default_level=logging.INFO):
    """
    Set up logging configuration from a YAML file, falling back to basicConfig if the file is missing.
    """
    if os.path.exists(path):
        with open(path, "rt") as f:
            config = yaml.safe_load(f.read())
        logging.config.dictConfig(config)
    else:
        logging.basicConfig(level=default_level)
    return logging.getLogger(__name__)
@ -1,80 +0,0 @@
from functools import partial

import numpy as np
import pyproj
from numpy import ma as ma
from shapely.ops import transform


def crossings(profile_x, profile_z, constant_z):
    """
    Finds the x-coordinates at which a beach profile crosses a given z elevation. Much faster than using shapely
    to calculate intersections, since we are only interested in intersections with a constant value. Will return
    multiple intersections if found. Used in calculating beach slope.
    Adapted from https://stackoverflow.com/a/34745789
    :param profile_x: List of x-coordinates for the beach profile section
    :param profile_z: List of z-coordinates for the beach profile section
    :param constant_z: Elevation (float) for which to find the corresponding x-coordinates
    :return: Sorted list of x-coordinates which correspond to constant_z
    """

    # Remove nans to suppress warning messages
    valid = ~ma.masked_invalid(profile_z).mask
    profile_z = np.array(profile_z)[valid]
    profile_x = np.array(profile_x)[valid]

    # Return empty list if mask removes all values
    if profile_x.size == 0:
        return []

    # Normalize the 'signal' to zero.
    # Use np.subtract rather than a list comprehension for performance reasons
    z = np.subtract(profile_z, constant_z)

    # Find all indices right before any crossing.
    # TODO Sometimes this can give a runtime warning https://stackoverflow.com/a/36489085
    indices = np.where(z[:-1] * z[1:] < 0)[0]

    # Use linear interpolation to find intersample crossings.
    x_crossings = [
        profile_x[i] - (profile_x[i] - profile_x[i + 1]) / (z[i] - z[i + 1]) * (z[i])
        for i in indices
    ]

    # Also check the end points, as the above will not include them if they lie exactly on constant_z
    x_crossings += [
        profile_x[i]
        for i in [0, len(profile_z) - 1]
        if np.isclose(constant_z, profile_z[i])
    ]

    return sorted(x_crossings)
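A worked example of the interpolation step inside `crossings`, with made-up profile values: the profile drops linearly from z = 2 m to z = -2 m over 40 m, so it should cross z = 0.5 m at exactly x = 15.

```python
import numpy as np

# Made-up profile: z falls from 2.0 at x=0 to -2.0 at x=40.
profile_x = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
profile_z = np.array([2.0, 1.0, 0.0, -1.0, -2.0])
constant_z = 0.5

# Normalise the profile to the target elevation, as crossings() does.
z = profile_z - constant_z

# Indices just before a sign change, i.e. just before each crossing.
indices = np.where(z[:-1] * z[1:] < 0)[0]

# Linear interpolation between the bracketing samples.
x_crossings = [
    float(profile_x[i] - (profile_x[i] - profile_x[i + 1]) / (z[i] - z[i + 1]) * z[i])
    for i in indices
]
print(x_crossings)  # [15.0]
```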


# TODO Think of a better way to do this than having to manually specify the coordinate systems
def convert_coord_systems(
    g1, in_coord_system="EPSG:4326", out_coord_system="EPSG:28356"
):
    """
    Converts a geometry from one coordinate system to another. Needed because shapefiles are usually defined in
    lat/lon but should be converted to a projected system to calculate distances in metres.
    https://gis.stackexchange.com/a/127432
    :param g1: Shapely geometry to convert
    :param in_coord_system: Default is lat/lon WGS84
    :param out_coord_system: Default is GDA94 / MGA zone 56 for the NSW coastline
    :return: The geometry transformed into the output coordinate system
    """
    project = partial(
        pyproj.transform,
        pyproj.Proj(init=in_coord_system),  # source coordinate system
        pyproj.Proj(init=out_coord_system),  # destination coordinate system
    )

    g2 = transform(project, g1)  # apply projection
    return g2


def get_i_or_default(l, i, default=None):
    """
    Returns the i-th element of a list, or a default value if the index is out of range.
    """
    try:
        return l[i]
    except IndexError:
        return default
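A quick usage sketch of the helper (restated here so the snippet runs standalone), e.g. for safely indexing a possibly-empty list of crossing results; the chainage values are made up:

```python
def get_i_or_default(l, i, default=None):
    try:
        return l[i]
    except IndexError:
        return default

x_crossings = [102.5, 140.8]  # made-up crossing chainages
first = get_i_or_default(x_crossings, 0)
third = get_i_or_default(x_crossings, 2, default=None)
print(first, third)  # 102.5 None
```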