{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Hackathon Q1 - 2024" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Context" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Overall goal" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Setting up a workflow to run RA2CE efficiently with multiple flood scenarios on a larger geographical scale, and to post-process the outcomes into meaningful results." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### User questions" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- [UQ1](#user-question-1) Which roads are most likely to get hit by flooding from this hurricane given its projected flood maps?\n", "- [UQ2](#user-question-2) Where are the flood impacts most disruptive in terms of accessibility? (detour length) = multi-link redundancy\n", "- [UQ3] Which areas/locations are most likely to be unreachable because of this hurricane given its possible tracks? (OD analysis) (to be refined)\n", "- [UQ4] Optional, if the damages module works: What is the range of minimum and maximum damages that might occur because of this hurricane given its possible tracks?"
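 ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The detour length behind UQ2 compares route lengths before and after a disrupted link is removed. The following cell is a toy `networkx` illustration of that concept only (made-up nodes and weights), not the actual RA2CE multi-link redundancy analysis:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import networkx as nx\n", "\n", "# Toy road network: two routes between nodes 1 and 4 (weights are lengths in km)\n", "g = nx.Graph()\n", "g.add_weighted_edges_from([(1, 2, 5.0), (2, 4, 5.0), (1, 3, 8.0), (3, 4, 9.0)])\n", "base_length = nx.shortest_path_length(g, 1, 4, weight='weight')  # 10.0 km via node 2\n", "\n", "# Simulate a flooded link by removing it, then measure the detour\n", "g.remove_edge(2, 4)\n", "disrupted_length = nx.shortest_path_length(g, 1, 4, weight='weight')  # 17.0 km via node 3\n", "detour = disrupted_length - base_length\n", "print(f'Detour length: {detour} km')"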
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Results" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can inspect the generated results directly by running the \"visualize\" snippets of each user question:\n", "\n", "- [Results UQ.1](#uq1-results-visualization)\n", "- [Results UQ.2](#uq2-results-visualization)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Hackathon requirements" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- Being able to run RA2CE on a large scale with a complex network\n", " - This should be coded generically so that we could do this ‘anywhere’\n", " - But computational time increases a lot with the scale and complexity of the network – how to handle this?\n", " - How to determine the extent of the network with different flood maps?\n", " - Splitting up into smaller subsets? How would that change the workflow and results?\n", "- Running RA2CE in the cloud with multiple flood maps (100+) in different formats and storing the results\n", " - Being able to handle NetCDF / DataArray / Zarr data formats?\n", " - Storing the different RA2CE runs and data efficiently\n", " - Skipping the second hazard overlay with the segmented graph as it increases computational time?\n", " - Running multiple flood maps that represent a time series and then adding a time dimension to the RA2CE results / analysis\n", "- Having a script that handles and post-processes multiple results\n", " - Processing and storing results for all scenario runs and consolidating/merging them\n", " - Determining what the most interesting information is and how to communicate/visualize it\n", " - Visualization outputs such as statistics or ranges" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Generic workflow" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The workflow to be run (with slight modifications) to solve all the [user questions](#user-questions) is described in the following diagram:\n", "\n", "| ![ra2ce_cloud.drawio.png](./user_question_1/ra2ce_cloud.drawio.png)| \n", "|:--:| \n", "| *Generic hackathon workflow* |" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## User Question 1" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### UQ1 Description" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "UQ1: Which roads are most likely to get hit by flooding from this hurricane given its projected flood maps?\n", "\n", "- __Input__: road extent (OSM), 100+ flood maps from SFINCS in the Beira/Qualimane area and a RA2CE folder setup with `.ini` files.\n", "- __Proposed workflow__: multiple flood maps – for every flood map a separate RA2CE run – for every scenario separate results – save the results in an efficient way – post-process into meaningful results.\n", "- __Expected output__: flood depths on edges for every scenario.\n", "- __Expected post-processed results__: per edge an indication of the ‘likelihood’ of flooding (e.g. in 90/100 scenarios this edge gets hit (% hits)).\n", "- __Acceptance level__: Tackle user question 1 on Tuesday.\n", "- Which questions from user questions 2, 3 and 4 are also relevant for question 1?"
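 ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The expected post-processed result (the per-edge % hits) can be sketched in a few lines of `pandas`. The scenario columns and depth values below are made up for illustration; in the real workflow these columns come from the merged hazard-overlay outputs:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import pandas as pd\n", "\n", "# Toy per-edge maximum water depths for three scenarios (made-up values)\n", "edges = pd.DataFrame({\n", "    'rfid': [1, 2, 3],\n", "    'TC_0001_ma': [0.0, 0.4, 1.2],\n", "    'TC_0002_ma': [0.0, 0.0, 0.8],\n", "    'TC_0003_ma': [0.3, 0.0, 0.5],\n", "})\n", "\n", "# An edge counts as 'hit' in a scenario when its water depth exceeds a threshold\n", "THRESHOLD = 0.0\n", "depth_columns = [c for c in edges.columns if c.endswith('_ma')]\n", "edges['pct_hits'] = (edges[depth_columns] > THRESHOLD).sum(axis=1) / len(depth_columns) * 100\n", "print(edges[['rfid', 'pct_hits']])"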
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Input" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- Collection of hazard files in `.tif` format.\n", "- Bounding box (coordinates) of the network extent.\n", "- RA2CE network configuration file (`.ini`)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### UQ1 workflow" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Based on the [generic workflow](#generic-workflow) we create our own [Argo](https://argoproj.github.io/) workflow. This configuration file can also be found at `user_question_1\\hackathon_workflow.yaml`." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "vscode": { "languageId": "yaml" } }, "outputs": [], "source": [ "# Content of `user_question_1\\hackathon_workflow.yaml`\n", "apiVersion: argoproj.io/v1alpha1\n", "kind: Workflow\n", "metadata:\n", " generateName: ra2ce-hackathon-uq1-\n", "spec:\n", " entrypoint: scenario-workflow\n", " templates:\n", " - name: scenario-workflow\n", " steps:\n", " - - name: define-subdirs\n", " template: read-members\n", " - - name: run-scenario\n", " template: run-scenario\n", " arguments:\n", " parameters:\n", " - name: member\n", " value: \"{{item}}\"\n", " withParam: \"{{steps.define-subdirs.outputs.result}}\"\n", " - - name: post-processing\n", " template: post-processing\n", "\n", " - name: read-members\n", " script:\n", " image: 798877697266.dkr.ecr.eu-west-1.amazonaws.com/boto3:latest\n", " workingDir: /data\n", " command: [python]\n", " source: |\n", " import boto3\n", " import json\n", "\n", " bucket = 'ra2ce-data'\n", " prefix = 'sfincs_floodmaps_sub/'\n", " \n", " client = boto3.client('s3')\n", " result = client.list_objects(Bucket=bucket, Prefix=prefix, Delimiter='/')\n", "\n", " members = []\n", " for o in result.get('Contents'):\n", " mem = o.get('Key').split('/')[1].split('.')[0]\n", " if mem != \"\":\n", " members.append(mem)\n", " print(json.dumps(members))\n", "\n", 
" - name: run-scenario\n", " container:\n", " image: containers.deltares.nl/ra2ce/ra2ce:latest\n", " command: [\"python\", \"/scripts/user_question_1/hazard_overlay_cloud_run.py\"]\n", " nodeSelector:\n", " beta.kubernetes.io/instance-type: \"m5.xlarge\"\n", " inputs:\n", " parameters:\n", " - name: member\n", " artifacts:\n", " - name: input\n", " path: /input/{{inputs.parameters.member}}.tif\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: sfincs_floodmaps_sub/{{inputs.parameters.member}}.tif\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " - name: data\n", " path: /data\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/data\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " - name: scripts\n", " path: /scripts\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/scripts\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " outputs:\n", " artifacts:\n", " - name: ra2ce-output\n", " path: /data\n", " s3:\n", " bucket: ra2ce-data\n", " endpoint: s3.amazonaws.com\n", " region: eu-west-1\n", " key: beira_qualimane_sfincs_fm/output_q1/{{inputs.parameters.member}}/\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", "\n", " - name: post-processing\n", " container:\n", " image: containers.deltares.nl/ra2ce/ra2ce:latest\n", " command: [\"python\", 
\"/scripts/user_question_1/post_processing.py\"]\n", " nodeSelector:\n", " beta.kubernetes.io/instance-type: \"m5.xlarge\"\n", " inputs:\n", " artifacts:\n", " - name: output\n", " path: /data\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/output_q1\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " - name: scripts\n", " path: /scripts\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/scripts\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " outputs:\n", " artifacts:\n", " - name: pp_output\n", " path: /output\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/postprocessing_output_q1\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### UQ1 Pre-processing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Preprocessing has three steps:\n", "- [the creation of a network](#network-creation),\n", "- [creating the buckets](#creating-the-buckets) by splitting the floodmaps over different directories,\n", "- and the optional [reprojection](#re-projecting)."
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Network creation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Preparing initial data" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# This script contains the first two sections of `hazard_overlay_preprocessing.ipynb`\n", "# General imports\n", "import geopandas as gpd\n", "import pandas as pd\n", "from pathlib import Path\n", "from shapely import geometry, Polygon\n", "from osgeo import gdal, osr\n", "from shutil import copyfile\n", "\n", "# RA2CE imports and constants\n", "from ra2ce.network.network_config_data.enums.network_type_enum import NetworkTypeEnum\n", "from ra2ce.network.network_config_data.enums.road_type_enum import RoadTypeEnum\n", "from ra2ce.network.network_config_data.network_config_data import NetworkConfigData, NetworkSection\n", "from ra2ce.network.network_wrappers.osm_network_wrapper.osm_network_wrapper import OsmNetworkWrapper\n", "from ra2ce.network.exporters.geodataframe_network_exporter import GeoDataFrameNetworkExporter\n", "from ra2ce.network.exporters.multi_graph_network_exporter import MultiGraphNetworkExporter\n", "\n", "# Use raw strings for the Windows paths to avoid invalid escape sequences\n", "INPUT_FOLDER = Path(r'P:\\\\moonshot2-casestudy\\\\SFINCS\\\\models')\n", "INPUT_FLOODMAP_FOLDER = 'floodmaps_wgs84'\n", "OUTPUT_FOLDER = Path(r'P:\\\\moonshot2-casestudy\\\\RA2CE')\n", "OUTPUT_FLOODMAP_FOLDER = 'floodmaps'" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Determine the polygon of the total extent of the regions" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The extent of each region is read from its `region.geojson` file and appended to a GeoDataFrame.\n", "From this GeoDataFrame a Polygon spanning the total extent is extracted."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# This script contains the third section of `hazard_overlay_preprocessing.ipynb`\n", "# Loop over region folders and concatenate the extents\n", "_combined_gdf = gpd.GeoDataFrame()\n", "for _region in [f for f in INPUT_FOLDER.iterdir() if f.is_dir()]:\n", " _extent_file = _region.joinpath('gis', 'region.geojson')\n", " assert _extent_file.is_file()\n", " _gdf = gpd.read_file(_extent_file)\n", " _combined_gdf = pd.concat([_combined_gdf, _gdf.to_crs(4326)], ignore_index=True)\n", "\n", "# Extract polygon of the total extent spanning the concatenated regions\n", "_total_extent = _combined_gdf.total_bounds\n", "_polygon = Polygon(geometry.box(*_total_extent))\n", "\n", "# Write polygon (not required)\n", "_polygon_file = INPUT_FOLDER.joinpath('polygon.geojson')\n", "_polygon_gdf = gpd.GeoDataFrame(index=[0], geometry=[_polygon], crs='epsg:4326')\n", "_polygon_gdf.to_file(_polygon_file, driver='GeoJSON')\n", "\n", "# Write the combined extent to a new file (not required)\n", "_combined_extent_file = INPUT_FOLDER.joinpath('combined_extent.geojson')\n", "_combined_geojson = _combined_gdf.to_json()\n", "with open(_combined_extent_file, 'w') as f:\n", " f.write(_combined_geojson)\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Create network" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The network is downloaded from OSM based on the given polygon." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Read network from polygon\n", "\n", "_road_type_list = [RoadTypeEnum.MOTORWAY, RoadTypeEnum.MOTORWAY_LINK, RoadTypeEnum.TRUNK, RoadTypeEnum.TRUNK_LINK, RoadTypeEnum.PRIMARY, RoadTypeEnum.PRIMARY_LINK, RoadTypeEnum.SECONDARY, RoadTypeEnum.SECONDARY_LINK, RoadTypeEnum.TERTIARY, RoadTypeEnum.TERTIARY_LINK]\n", "_network_section = NetworkSection(network_type=NetworkTypeEnum.DRIVE, road_types=_road_type_list, save_gpkg=True)\n", "_config_data = NetworkConfigData(network=_network_section, static_path=OUTPUT_FOLDER.joinpath('static'))\n", "_network_wrapper = OsmNetworkWrapper.with_polygon(_config_data, _polygon)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Export the network to file" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "[_graph, _gdf] = _network_wrapper.get_network()\n", "\n", "# Export the graph\n", "_exporter = MultiGraphNetworkExporter(basename='base_graph', export_types=['gpkg', 'pickle'])\n", "_exporter.export(export_path=Path(OUTPUT_FOLDER).joinpath('static', 'output_graph'), export_data=_graph)\n", "\n", "# Export the network\n", "_exporter = GeoDataFrameNetworkExporter(basename='base_network', export_types=['gpkg', 'pickle'])\n", "_exporter.export(export_path=Path(OUTPUT_FOLDER).joinpath('static', 'output_graph'), export_data=_gdf)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Re-projecting" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The hazard files might need pre-processing when they use a projection other than WGS-84.\n", "\n", "This can be done either locally or \"in the cloud\". Note that this step is optional and does not always need to be run.\n", "\n", "The following snippet displays the content of the preprocessing Python script `user_question_1\\\\convert_floodmaps_to_wgs84.py`:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# user_question_1\\\\convert_floodmaps_to_wgs84.py\n", "\n", "import sys\n", "import gc\n", "import xarray as xr\n", "import hydromt  # noqa: F401 - registers the xarray '.raster' accessor used below\n", "from pathlib import Path\n", "\n", "\n", "# Input and output folder paths (first and second command-line arguments)\n", "_input_folder = Path(sys.argv[1])\n", "_output_folder = Path(sys.argv[2])\n", "\n", "# Iterate over each input GeoTIFF file\n", "for _input_tiff in list(_input_folder.glob(\"*.tif\")):\n", " _output_tiff = _output_folder.joinpath(_input_tiff.name)\n", "\n", " _input_dataarray = xr.open_dataarray(_input_tiff)\n", " _input_as_wgs84 = _input_dataarray.raster.reproject(dst_crs=4326)\n", "\n", " _input_as_wgs84.raster.to_raster(_output_tiff, driver=\"GTiff\", compress=\"lzw\")\n", "\n", " # Clean up\n", " del _input_dataarray, _input_as_wgs84\n", " gc.collect()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Creating the buckets" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We create buckets online, each containing our network configuration, the network extent and a single hazard file. This way the hazard overlay computations are spread out for better performance."
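 ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The same idea can be sketched locally: give every flood map its own run directory holding the shared network configuration plus that single hazard file. This is a minimal illustration with illustrative paths and helper names, not the cloud setup used during the hackathon:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import shutil\n", "from pathlib import Path\n", "\n", "def prepare_run_dirs(floodmap_dir: Path, network_ini: Path, runs_dir: Path) -> list:\n", "    \"\"\"Create one run directory per flood map, each holding the shared\n", "    network configuration and exactly one hazard file.\"\"\"\n", "    run_dirs = []\n", "    for floodmap in sorted(floodmap_dir.glob('*.tif')):\n", "        run_dir = runs_dir.joinpath(floodmap.stem)\n", "        run_dir.mkdir(parents=True, exist_ok=True)\n", "        shutil.copyfile(network_ini, run_dir.joinpath(network_ini.name))\n", "        shutil.copyfile(floodmap, run_dir.joinpath(floodmap.name))\n", "        run_dirs.append(run_dir)\n", "    return run_dirs"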
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Preparing the floodmaps" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# This script is the last section of `hazard_overlay_preprocessing.ipynb`\n", "# Check the projection of the floodmaps and copy them to the output folder\n", "_output_folder = OUTPUT_FOLDER.joinpath(OUTPUT_FLOODMAP_FOLDER)\n", "if not _output_folder.exists():\n", " _output_folder.mkdir(parents=True)\n", "\n", "def check_projection(file) -> bool:\n", " _input_ds = gdal.Open(str(file))\n", " _source_proj = _input_ds.GetProjection()\n", " _srs = osr.SpatialReference(wkt=_source_proj)\n", " # WGS 84 is a geographic CRS, so a projected CRS cannot be WGS 84\n", " if _srs.IsProjected():\n", " return False\n", " return _srs.GetAttrValue('geogcs') == 'WGS 84'\n", "\n", "for _region in [f for f in INPUT_FOLDER.iterdir() if f.is_dir()]:\n", " _models_dir = _region.joinpath(INPUT_FLOODMAP_FOLDER)\n", " for _floodmap in _models_dir.iterdir():\n", " _output_file = _output_folder.joinpath(_floodmap.name)\n", " if not check_projection(_floodmap):\n", " raise ValueError(f'Floodmap {_floodmap} is not in the right projection')\n", " copyfile(_floodmap, _output_file)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### UQ1 Processing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Running the hazard overlay" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In each bucket, we do a simple ra2ce run by modifying the `NetworkConfigData.hazard.hazard_map` property so that, instead of defining 'n' hazard files, it contains only the single hazard file available to the executing \"bucket\".\n", "\n", "The script to be run (`user_question_1\\hazard_overlay_cloud_run.py`) is as follows:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# user_question_1\\hazard_overlay_cloud_run.py\n", "\n", "from pathlib import Path\n", "\n", "from ra2ce.ra2ce_handler import Ra2ceHandler\n", "\n", "\"\"\"\n", " This script runs a 
network WITHOUT an analysis,\n", " as we are only interested in retrieving the HAZARD overlay.\n", "\"\"\"\n", "\n", "# Create one network configuration per provided hazard.\n", "# We assume the whole input directory will be mounted in `/data`\n", "_root_dir = Path(\"/data\")\n", "assert _root_dir.exists()\n", "\n", "_network_file = _root_dir.joinpath(\"network.ini\")\n", "assert _network_file.exists()\n", "\n", "_tif_data_directory = Path(\"/input\")\n", "assert _tif_data_directory.exists()\n", "\n", "# Configure the run with the single hazard file available in this \"bucket\"\n", "_handler = Ra2ceHandler(_network_file, analysis=None)\n", "_handler.input_config.network_config.config_data.hazard.hazard_map = [\n", " list(_tif_data_directory.glob(\"*.tif\"))[0]\n", "]\n", "\n", "# Try to get only RELEVANT info messages.\n", "import warnings\n", "\n", "warnings.filterwarnings(\"ignore\")\n", "\n", "_handler.configure()\n", "_handler.run_analysis()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "### UQ1 Post-processing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Unifying the outputs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Because we ran ra2ce with one container per hazard file, our output is spread over different containers. 
We then unify all the available outputs and export the combined content into both a `.json` (GeoJSON) and a `.feather` file.\n", "\n", "This script can be found as `user_question_1\\post_processing.py`:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# user_question_1\\post_processing.py\n", "\n", "from pathlib import Path\n", "import pandas as pd\n", "import geopandas as gpd\n", "\n", "# set the required parameters\n", "cloud_output_folder = Path('/data')\n", "\n", "event_names = [folder.stem for folder in cloud_output_folder.iterdir() if folder.is_dir()]\n", "aggregate_wl = \"max\" # this we should get from network_config or analysis_config\n", "\n", "aggregate_wls = {\n", " \"max\": \"ma\",\n", " \"mean\": \"me\",\n", " \"min\": \"mi\"\n", "}\n", "\n", "aggregate_wl_column = aggregate_wls[aggregate_wl]\n", "event_wl_column = \"EV1_\" + aggregate_wl_column\n", "event_fraction_column = \"EV1_fr\"\n", "fid_column = \"rfid\"\n", "\n", "for i, event_name in enumerate(event_names):\n", " overlay_output_path = cloud_output_folder / event_name / \"static\" / \"output_graph\" / \"base_graph_hazard_edges.gpkg\"\n", " overlay_output_gdf = gpd.read_file(overlay_output_path)\n", " \n", " # Map the generic EV1_ column names to this event's name\n", " column_mapping = {event_wl_column: f\"{event_name}_\" + aggregate_wl_column, event_fraction_column: f\"{event_name}_fr\"}\n", " overlay_output_gdf = overlay_output_gdf.rename(columns=column_mapping)\n", "\n", " if i == 0:\n", " # create the base gdf that aggregates all results\n", " result_gdf = overlay_output_gdf\n", " else:\n", " filtered_overlay_output_gdf = overlay_output_gdf[[fid_column, f\"{event_name}_\" + aggregate_wl_column, f\"{event_name}_fr\"]]\n", " result_gdf = pd.merge(\n", " result_gdf,\n", " filtered_overlay_output_gdf,\n", " on=fid_column,\n", " )\n", "\n", "gdf = result_gdf\n", "\n", 
"gdf.to_file('/output/result_gdf.geojson', driver=\"GeoJSON\")\n", "gdf.to_feather('/output/result_gdf.feather')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Expected result" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "During the hackathon the above steps were run through the cloud infrastructure, delivering us a GeoJSON file (`user_question_1\\result_gdf.json`) which, due to its large size, we summarize in the following snippet:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "{\n", "\"type\": \"FeatureCollection\",\n", "\"crs\": { \"type\": \"name\", \"properties\": { \"name\": \"urn:ogc:def:crs:OGC:1.3:CRS84\" } },\n", "\"features\": [\n", " { \"type\": \"Feature\", \"properties\": { \"u\": 2312756550, \"v\": 9435564285, \"key\": 0, \"ref\": \"N324\", \"highway\": \"secondary\", \"oneway\": false, \"reversed\": [ false, true ], \"length\": 32701.151, \"rfid_c\": [ 32, 34, 36, 38, 40, 44, 46, 48, 54, 56, 1092, 68, 70, 74, 76, 78, 80, 84, 90, 92, 94, 96, 98, 100, 104, 106, 108, 110, 112, 114, 120, 122, 124, 1661, 126, 1663, 128, 1665, 1666, 130, 1668, 132, 134, 1671, 136, 138, 140, 141, 144, 146, 148, 150, 152, 154, 156, 158, 1183, 160, 1185, 162, 1187, 164, 1189, 166, 168, 170, 172, 174, 176, 178, 180, 182, 3255, 184, 3257, 186, 3259, 3260, 189, 3262, 188, 3264, 192, 3266, 194, 3268, 196, 3270, 3271, 200, 198, 202, 204, 206, 208, 210, 212, 214, 216, 218, 220, 1767, 1769, 1805, 1807, 1809, 1811, 1875, 1878, 1880, 1882, 1884, 1885, 1888, 1890, 1892, 1894, 1896, 1898, 1900, 1902, 1904, 1906, 1908, 1910, 1912, 1914, 1916, 1918, 1920, 1922, 1924, 1926, 1928, 1930, 1932, 1934, 1936, 981 ], \"rfid\": 10, \"bridge\": \"yes\", \"avgspeed\": 59.0, \"time\": 1995.0, \"TC_0064_ma\": 0.0, \"TC_0064_fr\": 0.0, \"junction\": \"nan\", \"lanes\": \"nan\", \"name\": \"nan\", \"maxspeed\": \"nan\", \"access\": \"nan\", \"width\": \"nan\", \"tunnel\": \"nan\", \"osmid_original\": [ 1021406339, 
622806373, 622806374, 622806375, 622806376, 222294054, 617604301, 219115766, 219115775 ], \"TC_0098_ma\": 0.0, \"TC_0098_fr\": 0.0, \"TC_0120_ma\": 0.0, \"TC_0120_fr\": 0.0, \"TC_0142_ma\": 0.360931396484375, \"TC_0142_fr\": 0.0094489253302677528, \"TC_0145_ma\": 0.0, \"TC_0145_fr\": 0.0, \"TC_0147_ma\": 0.2004547119140625, \"TC_0147_fr\": 0.0017436712422498476, \"TC_0166_ma\": 0.0, \"TC_0166_fr\": 0.0, \"TC_0184_ma\": 0.0, \"TC_0184_fr\": 0.0, \"TC_0209_ma\": 0.0, \"TC_0209_fr\": 0.0, \"TC_0228_ma\": 0.0, \"TC_0228_fr\": 0.0, \"TC_0229_ma\": 0.0, \"TC_0229_fr\": 0.0, \"TC_0240_ma\": 0.0, \"TC_0240_fr\": 0.0, \"TC_0329_ma\": 0.0, \"TC_0329_fr\": 0.0, \"TC_0333_ma\": 0.0, \"TC_0333_fr\": 0.0, \"TC_0356_ma\": 0.0, \"TC_0356_fr\": 0.0, \"TC_0379_ma\": 0.51285552978515625, \"TC_0379_fr\": 0.023181703594162224, \"TC_0388_ma\": 0.0, \"TC_0388_fr\": 0.0, \"TC_0400_ma\": 0.0, \"TC_0400_fr\": 0.0, \"TC_0420_ma\": 0.0, \"TC_0420_fr\": 0.0, \"TC_0434_ma\": 0.0, \"TC_0434_fr\": 0.0, \"TC_0438_ma\": 0.0, \"TC_0438_fr\": 0.0, \"TC_0444_ma\": 0.0, \"TC_0444_fr\": 0.0, \"TC_0451_ma\": 0.0, \"TC_0451_fr\": 0.0, \"TC_0467_ma\": 0.0, \"TC_0467_fr\": 0.0, \"TC_0505_ma\": 0.0, \"TC_0505_fr\": 0.0, \"TC_0527_ma\": 0, \"TC_0527_fr\": 0.0, \"TC_0535_ma\": 0.0, \"TC_0535_fr\": 0.0, \"TC_0541_ma\": 0.0, \"TC_0541_fr\": 0.0, \"TC_0554_ma\": 0.0, \"TC_0554_fr\": 0.0, \"TC_0558_ma\": 0.0, \"TC_0558_fr\": 0.0, \"TC_0587_ma\": 0.0, \"TC_0587_fr\": 0.0, \"TC_0625_ma\": 0.0, \"TC_0625_fr\": 0.0, \"TC_0631_ma\": 0.0, \"TC_0631_fr\": 0.0, \"TC_0637_ma\": 0.16949844360351562, \"TC_0637_fr\": 0.0046076563923134204, \"TC_0642_ma\": 0.0, \"TC_0642_fr\": 0.0, \"TC_0653_ma\": 0.0, \"TC_0653_fr\": 0.0, \"TC_0673_ma\": 0.25518035888671875, \"TC_0673_fr\": 0.0066475256222551056, \"TC_0689_ma\": 0.0, \"TC_0689_fr\": 0.0, \"TC_0729_ma\": 0.0, \"TC_0729_fr\": 0.0, \"TC_0731_ma\": 0.0, \"TC_0731_fr\": 0.0, \"TC_0750_ma\": 0.0, \"TC_0750_fr\": 0.0, \"TC_0803_ma\": 0.0, \"TC_0803_fr\": 0.0, \"TC_0814_ma\": 0, 
\"TC_0814_fr\": 0.0, \"TC_0815_ma\": 0.16817092895507812, \"TC_0815_fr\": 0.0046076563923134204, \"TC_0821_ma\": 0.0, \"TC_0821_fr\": 0.0, \"TC_0862_ma\": 0.0, \"TC_0862_fr\": 0.0, \"TC_0866_ma\": 0.26370620727539062, \"TC_0866_fr\": 0.0054298714754633859, \"TC_0875_ma\": 0.0, \"TC_0875_fr\": 0.0, \"TC_0920_ma\": 0.0, \"TC_0920_fr\": 0.0 }, \"geometry\": { \"type\": \"LineString\", \"coordinates\": [ [ 39.137194, -16.832636 ], [ 39.1371437, -16.8326518 ], [ 39.1371132, -16.8326794 ], [ 39.1369741, -16.8328745 ], [ 39.1366267, -16.8334123 ], [ 39.136506, -16.8336999 ], [ 39.1364311, -16.8338932 ], [ 39.1361949, -16.8348038 ], [ 39.1357362, -16.8364776 ], [ 39.1355056, -16.8368987 ], [ 39.1353023, -16.8370004 ], [ 39.1350738, -16.8369684 ], [ 39.1345319, -16.8370039 ], [ 39.1337648, -16.8368524 ], [ 39.1334268, -16.8367523 ], [ 39.1330829, -16.8367434 ], [ 39.1327085, -16.8368687 ], [ 39.1310482, -16.8374011 ], [ 39.1274561, -16.8386137 ], [ 39.1208716, -16.8407742 ], [ 39.1182101, -16.8416181 ], [ 39.1152827, -16.8425892 ], [ 39.1115917, -16.8437746 ], [ 39.1096912, -16.8444536 ], [ 39.1022113, -16.8469119 ], [ 39.0944196, -16.8494248 ], [ 39.0943056, -16.8494689 ], [ 39.0941483, -16.8495183 ], [ 39.0939665, -16.8495819 ], [ 39.0937766, -16.8495923 ], [ 39.0935568, -16.8495637 ], [ 39.0932692, -16.8493897 ], [ 39.0902279, -16.847071 ], [ 39.0886028, -16.8459545 ], [ 39.087968, -16.8456507 ], [ 39.0855569, -16.8450996 ], [ 39.0845616, -16.8449492 ], [ 39.0838926, -16.8449646 ], [ 39.0834292, -16.8450533 ], [ 39.0824097, -16.8453503 ], [ 39.0788635, -16.8465999 ], [ 39.0776802, -16.8470814 ], [ 39.0759981, -16.8476734 ], [ 39.0745005, -16.8482498 ], [ 39.0734424, -16.8485277 ], [ 39.0689341, -16.8492033 ], [ 39.0649995, -16.8499947 ], [ 39.0628454, -16.8503453 ], [ 39.0610358, -16.8506179 ], [ 39.0606126, -16.8507383 ], [ 39.0554768, -16.8535026 ], [ 39.0525413, -16.8551644 ], [ 39.0500941, -16.8565197 ], [ 39.047061, -16.8582126 ], [ 39.046247, -16.8587215 ], [ 
39.0354329, -16.8686812 ], [ 39.0339373, -16.8701001 ], [ 39.0320823, -16.8717448 ], [ 39.0314904, -16.872137 ], [ 39.0311382, -16.8723004 ], [ 39.0305902, -16.8724172 ], [ 39.0300749, -16.8724359 ], [ 39.0296894, -16.8723315 ], [ 39.028943, -16.8719533 ], [ 39.0267135, -16.8708386 ], [ 39.025173, -16.8697148 ], [ 39.0225867, -16.8667573 ], [ 39.0200935, -16.863839 ], [ 39.0178661, -16.8611622 ], [ 39.0164661, -16.8597913 ], [ 39.0148383, -16.8584645 ], [ 39.0137281, -16.8574891 ], [ 39.0126177, -16.856607 ], [ 39.0112606, -16.8552943 ], [ 39.0102452, -16.8542297 ], [ 39.0088561, -16.8524017 ], [ 39.0080964, -16.8512867 ], [ 39.0056434, -16.8485431 ], [ 39.0047725, -16.8476679 ], [ 38.9970797, -16.8423751 ], [ 38.9952291, -16.8411456 ], [ 38.9915802, -16.8389086 ], [ 38.9874645, -16.8359321 ], [ 38.9837594, -16.8331127 ], [ 38.9797207, -16.8295614 ], [ 38.9760362, -16.8261918 ], [ 38.9749893, -16.8253027 ], [ 38.9747789, -16.8251602 ], [ 38.9713304, -16.8239175 ], [ 38.9692777, -16.8229252 ], [ 38.9650132, -16.8205871 ], [ 38.9619701, -16.8183815 ], [ 38.961108, -16.8176299 ], [ 38.9581572, -16.8145155 ], [ 38.9543546, -16.81034 ], [ 38.9475448, -16.8035362 ], [ 38.9454228, -16.8017265 ], [ 38.9417331, -16.7983839 ], [ 38.9414238, -16.7981761 ], [ 38.9411688, -16.7980696 ], [ 38.9370629, -16.7966188 ], [ 38.9342252, -16.7954016 ], [ 38.9322771, -16.7946545 ], [ 38.9306564, -16.7941008 ], [ 38.9282553, -16.792988 ], [ 38.9273822, -16.7925439 ], [ 38.9257778, -16.79152 ], [ 38.9235896, -16.790031 ], [ 38.9217342, -16.7889287 ], [ 38.9212267, -16.7887249 ], [ 38.9207083, -16.7885995 ], [ 38.919395, -16.7884001 ], [ 38.9159268, -16.7879089 ], [ 38.9138784, -16.7876128 ], [ 38.9127607, -16.7873219 ], [ 38.9123591, -16.7871686 ], [ 38.9118002, -16.7869244 ], [ 38.9111979, -16.786649 ], [ 38.9103129, -16.7861005 ], [ 38.9097174, -16.785755 ], [ 38.9083823, -16.7852157 ], [ 38.907728, -16.7850387 ], [ 38.907199, -16.7849816 ], [ 38.9057204, -16.7849712 ], [ 38.9049688, 
-16.7850751 ], [ 38.9035499, -16.7853452 ], [ 38.9018461, -16.785714 ], [ 38.9010051, -16.7858959 ], [ 38.9002726, -16.7862024 ], [ 38.8989676, -16.7868206 ], [ 38.8981347, -16.7871167 ], [ 38.8967348, -16.7875712 ], [ 38.8930587, -16.7888231 ], [ 38.8926951, -16.7888725 ], [ 38.8922287, -16.7888543 ], [ 38.8896919, -16.7886747 ], [ 38.886702, -16.7884608 ], [ 38.8853816, -16.7884456 ], [ 38.880539, -16.7886388 ], [ 38.8796807, -16.7886803 ] ] } },\n", " { \"type\": \"Feature\", \"properties\": { \"u\": 2312756550, \"v\": 2283157832, \"key\": 0, \"ref\": \"N324\", \"highway\": \"secondary\", \"oneway\": false, \"reversed\": \"False\", \"length\": 35108.418, \"rfid_c\": [ 3, 5, 7, 11, 13, 15, 19, 21, 24, 25, 27, 29, 41, 49, 51, 57, 59, 62, 63, 65, 71, 81, 85, 87, 101, 9836, 115, 117, 230, 1672, 1674, 1676, 1678, 1680, 1682, 1684, 1686, 1688, 1178, 1690, 1180, 1692, 1694, 1696, 1698, 1700, 1702, 1704, 1706, 1708, 1710, 1712, 1714, 1716, 1718, 1720, 1722, 1724, 1726, 1728, 1730, 1732, 1734, 1736, 1738, 9420, 1740, 9422, 1742, 9424, 1744, 9426, 1746, 9428, 1748, 9430, 1750, 9432, 1752, 9434, 1754, 9436, 1756, 9438, 1758, 9440, 1760, 9442, 1762, 9444, 1764, 9446, 10982, 9448, 10984, 10986, 9450, 10988, 9452, 10990, 9454, 10992, 9456, 10994, 9458, 10996, 9460, 10998, 9462, 11000, 9464, 11002, 9466, 9468, 11004, 11006, 9470, 11008, 9472, 11010, 9474, 11012, 9476, 11014, 9478, 11016, 9480, 11018, 9482, 11020, 9484, 11022, 9486, 11024, 9488, 11026, 9490, 9492, 9494, 9500, 9502, 9504, 9510, 1843, 1845, 9526, 1847, 9528, 1849, 1851, 9532, 1853, 9535, 1855, 1857, 9538, 1859, 9540, 1861, 9542, 1863, 9545, 1865, 9547, 1867, 9549, 1869, 9551, 1871, 9553, 1873, 9555, 9557, 9559, 9561, 9563, 9565, 9567, 9571, 9573, 9579, 9581, 9583, 9585, 9591, 11128, 9593, 11130, 9595, 9597, 9599, 9603, 9609, 1937, 1939, 9621, 1941, 1943, 9625, 1945, 9629, 9631, 9634, 9635, 9637, 9639, 9645, 9649, 9651, 9653, 9655, 982, 984, 986, 988, 990, 992, 994, 996 ], \"rfid\": 63, \"bridge\": \"yes\", 
\"avgspeed\": 59.0, \"time\": 2142.0, \"TC_0064_ma\": 0.0, \"TC_0064_fr\": 0.0, \"junction\": \"nan\", \"lanes\": \"nan\", \"name\": \"nan\", \"maxspeed\": \"nan\", \"access\": \"nan\", \"width\": \"nan\", \"tunnel\": \"nan\", \"osmid_original\": [ 622773825, 622773827, 222294053, 222294054, 219115757, 219115763, 219115770, 219115772, 219115773 ], \"TC_0098_ma\": 0.0, \"TC_0098_fr\": 0.0, \"TC_0120_ma\": 0.0, \"TC_0120_fr\": 0.0, \"TC_0142_ma\": 0.23963165283203125, \"TC_0142_fr\": 0.0069957230103332236, \"TC_0145_ma\": 0.0, \"TC_0145_fr\": 0.0, \"TC_0147_ma\": 0.11718368530273438, \"TC_0147_fr\": 0.00087339052119900717, \"TC_0166_ma\": 0.0, \"TC_0166_fr\": 0.0, \"TC_0184_ma\": 0.0, \"TC_0184_fr\": 0.0, \"TC_0209_ma\": 0.0, \"TC_0209_fr\": 0.0, \"TC_0228_ma\": 0.0, \"TC_0228_fr\": 0.0, \"TC_0229_ma\": 0.0, \"TC_0229_fr\": 0.0, \"TC_0240_ma\": 0.0, \"TC_0240_fr\": 0.0, \"TC_0329_ma\": 0.0, \"TC_0329_fr\": 0.0, \"TC_0333_ma\": 0.0, \"TC_0333_fr\": 0.0, \"TC_0356_ma\": 0.0, \"TC_0356_fr\": 0.0, \"TC_0379_ma\": 0.88681221008300781, \"TC_0379_fr\": 0.021559251267715285, \"TC_0388_ma\": 0.0, \"TC_0388_fr\": 0.0, \"TC_0400_ma\": 0.0, \"TC_0400_fr\": 0.0, \"TC_0420_ma\": 0.0, \"TC_0420_fr\": 0.0, \"TC_0434_ma\": 0.0, \"TC_0434_fr\": 0.0, \"TC_0438_ma\": 0.0, \"TC_0438_fr\": 0.0, \"TC_0444_ma\": 0.0, \"TC_0444_fr\": 0.0, \"TC_0451_ma\": 0.0, \"TC_0451_fr\": 0.0, \"TC_0467_ma\": 0.0, \"TC_0467_fr\": 0.0, \"TC_0505_ma\": 0.0, \"TC_0505_fr\": 0.0, \"TC_0527_ma\": 0, \"TC_0527_fr\": 0.0, \"TC_0535_ma\": 0.0, \"TC_0535_fr\": 0.0, \"TC_0541_ma\": 0.0, \"TC_0541_fr\": 0.0, \"TC_0554_ma\": 0.0, \"TC_0554_fr\": 0.0, \"TC_0558_ma\": 0.0, \"TC_0558_fr\": 0.0, \"TC_0587_ma\": 0.0, \"TC_0587_fr\": 0.0, \"TC_0625_ma\": 0.0, \"TC_0625_fr\": 0.0, \"TC_0631_ma\": 0.0, \"TC_0631_fr\": 0.0, \"TC_0637_ma\": 0.12184333801269531, \"TC_0637_fr\": 0.0040787747774758479, \"TC_0642_ma\": 0.0, \"TC_0642_fr\": 0.0, \"TC_0653_ma\": 0.0, \"TC_0653_fr\": 0.0, \"TC_0673_ma\": 0.19067573547363281, 
\"TC_0673_fr\": 0.0058483074494670257, \"TC_0689_ma\": 0.0, \"TC_0689_fr\": 0.0, \"TC_0729_ma\": 0.0, \"TC_0729_fr\": 0.0, \"TC_0731_ma\": 0.0, \"TC_0731_fr\": 0.0, \"TC_0750_ma\": 0.0, \"TC_0750_fr\": 0.0, \"TC_0803_ma\": 0.0, \"TC_0803_fr\": 0.0, \"TC_0814_ma\": 0, \"TC_0814_fr\": 0.0, \"TC_0815_ma\": 0.11027336120605469, \"TC_0815_fr\": 0.0012757311553735769, \"TC_0821_ma\": 0.0, \"TC_0821_fr\": 0.0, \"TC_0862_ma\": 0.0, \"TC_0862_fr\": 0.0, \"TC_0866_ma\": 0.11719322204589844, \"TC_0866_fr\": 0.0049521652986748554, \"TC_0875_ma\": 0.0, \"TC_0875_fr\": 0.0, \"TC_0920_ma\": 0.0, \"TC_0920_fr\": 0.0 }, \"geometry\": { \"type\": \"LineString\", \"coordinates\": [ [ 38.5808761, -16.8126744 ], [ 38.5817182, -16.8128379 ], [ 38.5824952, -16.8128752 ], [ 38.5841725, -16.8128918 ], [ 38.5857502, -16.8129104 ], [ 38.5865055, -16.8128462 ], [ 38.5900851, -16.8123842 ], [ 38.5933228, -16.8119409 ], [ 38.5949719, -16.8117586 ], [ 38.595645, -16.811626 ], [ 38.5968937, -16.8113069 ], [ 38.5991618, -16.8106523 ], [ 38.5998024, -16.8105114 ], [ 38.6015836, -16.8102504 ], [ 38.6020684, -16.8102255 ], [ 38.6025207, -16.8103312 ], [ 38.6029514, -16.8105176 ], [ 38.6032717, -16.8107331 ], [ 38.6037453, -16.8111861 ], [ 38.6045616, -16.8118787 ], [ 38.6050983, -16.8122247 ], [ 38.6056869, -16.8125189 ], [ 38.6076542, -16.8134532 ], [ 38.6084095, -16.8136998 ], [ 38.6089701, -16.8138344 ], [ 38.6096669, -16.8139712 ], [ 38.6103703, -16.8140147 ], [ 38.6110542, -16.8139836 ], [ 38.6129192, -16.8139009 ], [ 38.613898, -16.8139836 ], [ 38.6144499, -16.8141307 ], [ 38.6150723, -16.8144033 ], [ 38.6177348, -16.8159768 ], [ 38.6188129, -16.8166561 ], [ 38.6192912, -16.8170642 ], [ 38.6197915, -16.817554 ], [ 38.6201272, -16.8178827 ], [ 38.6207754, -16.8184995 ], [ 38.6212518, -16.8189457 ], [ 38.6217801, -16.8193078 ], [ 38.6244247, -16.8205839 ], [ 38.6274481, -16.8219139 ], [ 38.627749, -16.8219491 ], [ 38.6281645, -16.8218952 ], [ 38.6291254, -16.8215865 ], [ 38.6296145, -16.8213048 
], [ 38.6329474, -16.8192249 ], [ 38.6361158, -16.8174474 ], [ 38.6372066, -16.8167555 ], [ 38.6381892, -16.8160263 ], [ 38.6385571, -16.8158647 ], [ 38.6421448, -16.8144328 ], [ 38.6435301, -16.8140745 ], [ 38.6471264, -16.8136867 ], [ 38.6474891, -16.8136091 ], [ 38.6502017, -16.8127891 ], [ 38.6512744, -16.8124197 ], [ 38.6538212, -16.8111232 ], [ 38.6544077, -16.8109274 ], [ 38.6569081, -16.8108055 ], [ 38.6584169, -16.8106356 ], [ 38.6590009, -16.8105278 ], [ 38.6616644, -16.8100618 ], [ 38.6622293, -16.8099023 ], [ 38.6632746, -16.8094527 ], [ 38.6657137, -16.80816 ], [ 38.6676636, -16.8068216 ], [ 38.6688366, -16.8061566 ], [ 38.6709359, -16.8048679 ], [ 38.6719293, -16.8042402 ], [ 38.672215, -16.8040806 ], [ 38.6733988, -16.80366 ], [ 38.6742775, -16.8034591 ], [ 38.6753466, -16.8032312 ], [ 38.6759547, -16.8031607 ], [ 38.677777, -16.8029971 ], [ 38.679372, -16.8028645 ], [ 38.6798503, -16.8027961 ], [ 38.6807074, -16.8025537 ], [ 38.6819691, -16.8021787 ], [ 38.6823673, -16.8021062 ], [ 38.6848454, -16.8019218 ], [ 38.685328, -16.8019404 ], [ 38.6864988, -16.8021497 ], [ 38.6871719, -16.8023797 ], [ 38.6912276, -16.8041179 ], [ 38.6917947, -16.8042733 ], [ 38.6929265, -16.8045364 ], [ 38.6937857, -16.8046524 ], [ 38.6945735, -16.8047084 ], [ 38.6956491, -16.8046794 ], [ 38.6989128, -16.8044743 ], [ 38.6995577, -16.8045074 ], [ 38.7015607, -16.8047735 ], [ 38.7025854, -16.8050171 ], [ 38.703159, -16.8052222 ], [ 38.7051868, -16.806256 ], [ 38.7064998, -16.8071007 ], [ 38.7077732, -16.8080241 ], [ 38.7083482, -16.808279 ], [ 38.7120366, -16.8094817 ], [ 38.7125885, -16.8097076 ], [ 38.7141359, -16.8104741 ], [ 38.717235, -16.8125893 ], [ 38.7178259, -16.8129167 ], [ 38.7183712, -16.8131404 ], [ 38.7187954, -16.8132709 ], [ 38.7192824, -16.8133745 ], [ 38.7199619, -16.8134077 ], [ 38.7204294, -16.8133621 ], [ 38.7257923, -16.8121108 ], [ 38.7303991, -16.8111074 ], [ 38.7316403, -16.8108665 ], [ 38.7318105, -16.8108328 ], [ 38.7326309, -16.8106394 ], [ 
38.7360512, -16.8099197 ], [ 38.7388149, -16.8093492 ], [ 38.7433945, -16.808235 ], [ 38.7458606, -16.8076325 ], [ 38.7479388, -16.8070715 ], [ 38.7501038, -16.8064092 ], [ 38.7529656, -16.8055187 ], [ 38.7544295, -16.8050698 ], [ 38.7558239, -16.8046246 ], [ 38.7616694, -16.8026719 ], [ 38.7648898, -16.8015785 ], [ 38.7685334, -16.8002539 ], [ 38.7713197, -16.7993059 ], [ 38.7804653, -16.7958139 ], [ 38.7837399, -16.794449 ], [ 38.7859076, -16.7935997 ], [ 38.7891714, -16.792166 ], [ 38.7923945, -16.7905608 ], [ 38.7989817, -16.7872102 ], [ 38.8000316, -16.7865167 ], [ 38.8026009, -16.7845452 ], [ 38.804112, -16.7833426 ], [ 38.8043345, -16.7832049 ], [ 38.8048391, -16.7829815 ], [ 38.8096086, -16.7811062 ], [ 38.8119857, -16.7800968 ], [ 38.8158649, -16.778545 ], [ 38.8202152, -16.7767034 ], [ 38.8212095, -16.7762981 ], [ 38.8213547, -16.7761877 ], [ 38.8213832, -16.7760618 ], [ 38.821375, -16.7759643 ], [ 38.8212204, -16.7754137 ], [ 38.8208989, -16.7743655 ], [ 38.8208433, -16.7741058 ], [ 38.8208284, -16.7738187 ], [ 38.8208948, -16.7734096 ], [ 38.8211797, -16.7724238 ], [ 38.8215025, -16.7714744 ], [ 38.821687, -16.7711244 ], [ 38.821862, -16.7708692 ], [ 38.8220709, -16.7707159 ], [ 38.8224128, -16.7704613 ], [ 38.8227709, -16.7702626 ], [ 38.823011, -16.7700782 ], [ 38.823205, -16.7699353 ], [ 38.8233976, -16.7698743 ], [ 38.823829, -16.7698535 ], [ 38.8242766, -16.7698483 ], [ 38.824472, -16.7698717 ], [ 38.8245723, -16.7699249 ], [ 38.8247107, -16.7700951 ], [ 38.8250471, -16.770773 ], [ 38.8254569, -16.7713794 ], [ 38.8256478, -16.7715985 ], [ 38.8258261, -16.7719744 ], [ 38.826106, -16.7724773 ], [ 38.8266166, -16.7733265 ], [ 38.8273301, -16.7744019 ], [ 38.82771, -16.7748149 ], [ 38.8285646, -16.7752877 ], [ 38.8292591, -16.775589 ], [ 38.8301083, -16.775854 ], [ 38.8309574, -16.7761215 ], [ 38.831983, -16.7765579 ], [ 38.8339174, -16.7777112 ], [ 38.8356049, -16.778719 ], [ 38.8366412, -16.7792645 ], [ 38.8373331, -16.7795451 ], [ 38.8381334, 
-16.7797555 ], [ 38.8387357, -16.7798516 ], [ 38.8405154, -16.7800853 ], [ 38.8432258, -16.7803607 ], [ 38.8460093, -16.7806568 ], [ 38.8481363, -16.7808256 ], [ 38.8489909, -16.780862 ], [ 38.8497696, -16.7809711 ], [ 38.8529167, -16.7820594 ], [ 38.8538798, -16.7824023 ], [ 38.854578, -16.7826157 ], [ 38.8557925, -16.7830465 ], [ 38.8575397, -16.783779 ], [ 38.8598837, -16.7846491 ], [ 38.8625344, -16.7856361 ], [ 38.8643494, -16.7863011 ], [ 38.8645909, -16.7864154 ], [ 38.8647048, -16.7865193 ], [ 38.8648622, -16.7866725 ], [ 38.8650683, -16.7869244 ], [ 38.8652732, -16.7870881 ], [ 38.8653966, -16.7871296 ], [ 38.865467, -16.7871285 ], [ 38.8656942, -16.7871025 ], [ 38.8659946, -16.787042 ], [ 38.8661807, -16.7870231 ], [ 38.8663535, -16.787042 ], [ 38.8667423, -16.7871582 ], [ 38.8693115, -16.7879582 ], [ 38.8704591, -16.7883089 ], [ 38.8716257, -16.7886465 ], [ 38.8721087, -16.78874 ], [ 38.8724803, -16.7887608 ], [ 38.8753914, -16.788818 ], [ 38.8765879, -16.7887894 ], [ 38.8796807, -16.7886803 ] ] } },\n", " { ... 
},\n", " { \"type\": \"Feature\", \"properties\": { \"u\": 4791770241, \"v\": 4791770247, \"key\": 0, \"ref\": \"nan\", \"highway\": \"secondary_link\", \"oneway\": true, \"reversed\": \"False\", \"length\": 50.769, \"rfid_c\": [ 641472, 641473, 641793, 641475, 641796, 641469, 641470, 641471 ], \"rfid\": 6268, \"bridge\": \"nan\", \"avgspeed\": 59.0, \"time\": 3.0, \"TC_0064_ma\": 0.0, \"TC_0064_fr\": 0.0, \"junction\": \"nan\", \"lanes\": \"nan\", \"name\": \"nan\", \"maxspeed\": \"nan\", \"access\": \"nan\", \"width\": \"nan\", \"tunnel\": \"nan\", \"osmid_original\": \"486589275\", \"TC_0098_ma\": 0.0, \"TC_0098_fr\": 0.0, \"TC_0120_ma\": 0.0, \"TC_0120_fr\": 0.0, \"TC_0142_ma\": 0.0, \"TC_0142_fr\": 0.0, \"TC_0145_ma\": 0.0, \"TC_0145_fr\": 0.0, \"TC_0147_ma\": 0.0, \"TC_0147_fr\": 0.0, \"TC_0166_ma\": 0.0, \"TC_0166_fr\": 0.0, \"TC_0184_ma\": 0.0, \"TC_0184_fr\": 0.0, \"TC_0209_ma\": 0.0, \"TC_0209_fr\": 0.0, \"TC_0228_ma\": 0.0, \"TC_0228_fr\": 0.0, \"TC_0229_ma\": 0.0, \"TC_0229_fr\": 0.0, \"TC_0240_ma\": 0.0, \"TC_0240_fr\": 0.0, \"TC_0329_ma\": 0.0, \"TC_0329_fr\": 0.0, \"TC_0333_ma\": 0.0, \"TC_0333_fr\": 0.0, \"TC_0356_ma\": 0.0, \"TC_0356_fr\": 0.0, \"TC_0379_ma\": 0.0, \"TC_0379_fr\": 0.0, \"TC_0388_ma\": 0.0, \"TC_0388_fr\": 0.0, \"TC_0400_ma\": 0.0, \"TC_0400_fr\": 0.0, \"TC_0420_ma\": 0.0, \"TC_0420_fr\": 0.0, \"TC_0434_ma\": 0.0, \"TC_0434_fr\": 0.0, \"TC_0438_ma\": 0.0, \"TC_0438_fr\": 0.0, \"TC_0444_ma\": 0.0, \"TC_0444_fr\": 0.0, \"TC_0451_ma\": 0.0, \"TC_0451_fr\": 0.0, \"TC_0467_ma\": 0.0, \"TC_0467_fr\": 0.0, \"TC_0505_ma\": 0.0, \"TC_0505_fr\": 0.0, \"TC_0527_ma\": 0, \"TC_0527_fr\": 0.0, \"TC_0535_ma\": 0.0, \"TC_0535_fr\": 0.0, \"TC_0541_ma\": 0.0, \"TC_0541_fr\": 0.0, \"TC_0554_ma\": 0.0, \"TC_0554_fr\": 0.0, \"TC_0558_ma\": 0.0, \"TC_0558_fr\": 0.0, \"TC_0587_ma\": 0.0, \"TC_0587_fr\": 0.0, \"TC_0625_ma\": 0.0, \"TC_0625_fr\": 0.0, \"TC_0631_ma\": 0.0, \"TC_0631_fr\": 0.0, \"TC_0637_ma\": 0.0, \"TC_0637_fr\": 0.0, \"TC_0642_ma\": 0.0, 
\"TC_0642_fr\": 0.0, \"TC_0653_ma\": 0.0, \"TC_0653_fr\": 0.0, \"TC_0673_ma\": 0.0, \"TC_0673_fr\": 0.0, \"TC_0689_ma\": 0.0, \"TC_0689_fr\": 0.0, \"TC_0729_ma\": 0.0, \"TC_0729_fr\": 0.0, \"TC_0731_ma\": 0.0, \"TC_0731_fr\": 0.0, \"TC_0750_ma\": 0.0, \"TC_0750_fr\": 0.0, \"TC_0803_ma\": 0.0, \"TC_0803_fr\": 0.0, \"TC_0814_ma\": 0, \"TC_0814_fr\": 0.0, \"TC_0815_ma\": 0.0, \"TC_0815_fr\": 0.0, \"TC_0821_ma\": 0.0, \"TC_0821_fr\": 0.0, \"TC_0862_ma\": 0.0, \"TC_0862_fr\": 0.0, \"TC_0866_ma\": 0.0, \"TC_0866_fr\": 0.0, \"TC_0875_ma\": 0.0, \"TC_0875_fr\": 0.0, \"TC_0920_ma\": 0.0, \"TC_0920_fr\": 0.0 }, \"geometry\": { \"type\": \"LineString\", \"coordinates\": [ [ 32.7750115, -15.6132508 ], [ 32.7749806, -15.6131661 ], [ 32.7749761, -15.6131362 ], [ 32.7749799, -15.6131034 ], [ 32.775004, -15.6130498 ], [ 32.7750349, -15.6130117 ], [ 32.7750821, -15.6129876 ], [ 32.7751442, -15.6129826 ], [ 32.7752301, -15.6130036 ] ] } },\n", " { \"type\": \"Feature\", \"properties\": { \"u\": 4791770247, \"v\": 4791770248, \"key\": 0, \"ref\": \"N301\", \"highway\": \"secondary\", \"oneway\": true, \"reversed\": \"False\", \"length\": 36.976, \"rfid_c\": [ 641474, 639862 ], \"rfid\": 6267, \"bridge\": \"nan\", \"avgspeed\": 59.0, \"time\": 2.0, \"TC_0064_ma\": 0.0, \"TC_0064_fr\": 0.0, \"junction\": \"nan\", \"lanes\": \"nan\", \"name\": \"nan\", \"maxspeed\": \"nan\", \"access\": \"nan\", \"width\": \"nan\", \"tunnel\": \"nan\", \"osmid_original\": \"171468493\", \"TC_0098_ma\": 0.0, \"TC_0098_fr\": 0.0, \"TC_0120_ma\": 0.0, \"TC_0120_fr\": 0.0, \"TC_0142_ma\": 0.0, \"TC_0142_fr\": 0.0, \"TC_0145_ma\": 0.0, \"TC_0145_fr\": 0.0, \"TC_0147_ma\": 0.0, \"TC_0147_fr\": 0.0, \"TC_0166_ma\": 0.0, \"TC_0166_fr\": 0.0, \"TC_0184_ma\": 0.0, \"TC_0184_fr\": 0.0, \"TC_0209_ma\": 0.0, \"TC_0209_fr\": 0.0, \"TC_0228_ma\": 0.0, \"TC_0228_fr\": 0.0, \"TC_0229_ma\": 0.0, \"TC_0229_fr\": 0.0, \"TC_0240_ma\": 0.0, \"TC_0240_fr\": 0.0, \"TC_0329_ma\": 0.0, \"TC_0329_fr\": 0.0, \"TC_0333_ma\": 0.0, 
\"TC_0333_fr\": 0.0, \"TC_0356_ma\": 0.0, \"TC_0356_fr\": 0.0, \"TC_0379_ma\": 0.0, \"TC_0379_fr\": 0.0, \"TC_0388_ma\": 0.0, \"TC_0388_fr\": 0.0, \"TC_0400_ma\": 0.0, \"TC_0400_fr\": 0.0, \"TC_0420_ma\": 0.0, \"TC_0420_fr\": 0.0, \"TC_0434_ma\": 0.0, \"TC_0434_fr\": 0.0, \"TC_0438_ma\": 0.0, \"TC_0438_fr\": 0.0, \"TC_0444_ma\": 0.0, \"TC_0444_fr\": 0.0, \"TC_0451_ma\": 0.0, \"TC_0451_fr\": 0.0, \"TC_0467_ma\": 0.0, \"TC_0467_fr\": 0.0, \"TC_0505_ma\": 0.0, \"TC_0505_fr\": 0.0, \"TC_0527_ma\": 0, \"TC_0527_fr\": 0.0, \"TC_0535_ma\": 0.0, \"TC_0535_fr\": 0.0, \"TC_0541_ma\": 0.0, \"TC_0541_fr\": 0.0, \"TC_0554_ma\": 0.0, \"TC_0554_fr\": 0.0, \"TC_0558_ma\": 0.0, \"TC_0558_fr\": 0.0, \"TC_0587_ma\": 0.0, \"TC_0587_fr\": 0.0, \"TC_0625_ma\": 0.0, \"TC_0625_fr\": 0.0, \"TC_0631_ma\": 0.0, \"TC_0631_fr\": 0.0, \"TC_0637_ma\": 0.0, \"TC_0637_fr\": 0.0, \"TC_0642_ma\": 0.0, \"TC_0642_fr\": 0.0, \"TC_0653_ma\": 0.0, \"TC_0653_fr\": 0.0, \"TC_0673_ma\": 0.0, \"TC_0673_fr\": 0.0, \"TC_0689_ma\": 0.0, \"TC_0689_fr\": 0.0, \"TC_0729_ma\": 0.0, \"TC_0729_fr\": 0.0, \"TC_0731_ma\": 0.0, \"TC_0731_fr\": 0.0, \"TC_0750_ma\": 0.0, \"TC_0750_fr\": 0.0, \"TC_0803_ma\": 0.0, \"TC_0803_fr\": 0.0, \"TC_0814_ma\": 0, \"TC_0814_fr\": 0.0, \"TC_0815_ma\": 0.0, \"TC_0815_fr\": 0.0, \"TC_0821_ma\": 0.0, \"TC_0821_fr\": 0.0, \"TC_0862_ma\": 0.0, \"TC_0862_fr\": 0.0, \"TC_0866_ma\": 0.0, \"TC_0866_fr\": 0.0, \"TC_0875_ma\": 0.0, \"TC_0875_fr\": 0.0, \"TC_0920_ma\": 0.0, \"TC_0920_fr\": 0.0 }, \"geometry\": { \"type\": \"LineString\", \"coordinates\": [ [ 32.7750115, -15.6132508 ], [ 32.7748811, -15.6131478 ], [ 32.7747438, -15.6130408 ] ] } },\n", " { \"type\": \"Feature\", \"properties\": { \"u\": 4791770254, \"v\": 4791770260, \"key\": 0, \"ref\": \"nan\", \"highway\": \"secondary_link\", \"oneway\": true, \"reversed\": \"False\", \"length\": 41.625, \"rfid_c\": [ 641483, 641484, 641485, 641486, 641487, 641489, 641787, 641788 ], \"rfid\": 6272, \"bridge\": \"nan\", \"avgspeed\": 59.0, 
\"time\": 3.0, \"TC_0064_ma\": 0.0, \"TC_0064_fr\": 0.0, \"junction\": \"nan\", \"lanes\": \"nan\", \"name\": \"nan\", \"maxspeed\": \"nan\", \"access\": \"nan\", \"width\": \"nan\", \"tunnel\": \"nan\", \"osmid_original\": \"486589277\", \"TC_0098_ma\": 0.0, \"TC_0098_fr\": 0.0, \"TC_0120_ma\": 0.0, \"TC_0120_fr\": 0.0, \"TC_0142_ma\": 0.0, \"TC_0142_fr\": 0.0, \"TC_0145_ma\": 0.0, \"TC_0145_fr\": 0.0, \"TC_0147_ma\": 0.0, \"TC_0147_fr\": 0.0, \"TC_0166_ma\": 0.0, \"TC_0166_fr\": 0.0, \"TC_0184_ma\": 0.0, \"TC_0184_fr\": 0.0, \"TC_0209_ma\": 0.0, \"TC_0209_fr\": 0.0, \"TC_0228_ma\": 0.0, \"TC_0228_fr\": 0.0, \"TC_0229_ma\": 0.0, \"TC_0229_fr\": 0.0, \"TC_0240_ma\": 0.0, \"TC_0240_fr\": 0.0, \"TC_0329_ma\": 0.0, \"TC_0329_fr\": 0.0, \"TC_0333_ma\": 0.0, \"TC_0333_fr\": 0.0, \"TC_0356_ma\": 0.0, \"TC_0356_fr\": 0.0, \"TC_0379_ma\": 0.0, \"TC_0379_fr\": 0.0, \"TC_0388_ma\": 0.0, \"TC_0388_fr\": 0.0, \"TC_0400_ma\": 0.0, \"TC_0400_fr\": 0.0, \"TC_0420_ma\": 0.0, \"TC_0420_fr\": 0.0, \"TC_0434_ma\": 0.0, \"TC_0434_fr\": 0.0, \"TC_0438_ma\": 0.0, \"TC_0438_fr\": 0.0, \"TC_0444_ma\": 0.0, \"TC_0444_fr\": 0.0, \"TC_0451_ma\": 0.0, \"TC_0451_fr\": 0.0, \"TC_0467_ma\": 0.0, \"TC_0467_fr\": 0.0, \"TC_0505_ma\": 0.0, \"TC_0505_fr\": 0.0, \"TC_0527_ma\": 0, \"TC_0527_fr\": 0.0, \"TC_0535_ma\": 0.0, \"TC_0535_fr\": 0.0, \"TC_0541_ma\": 0.0, \"TC_0541_fr\": 0.0, \"TC_0554_ma\": 0.0, \"TC_0554_fr\": 0.0, \"TC_0558_ma\": 0.0, \"TC_0558_fr\": 0.0, \"TC_0587_ma\": 0.0, \"TC_0587_fr\": 0.0, \"TC_0625_ma\": 0.0, \"TC_0625_fr\": 0.0, \"TC_0631_ma\": 0.0, \"TC_0631_fr\": 0.0, \"TC_0637_ma\": 0.0, \"TC_0637_fr\": 0.0, \"TC_0642_ma\": 0.0, \"TC_0642_fr\": 0.0, \"TC_0653_ma\": 0.0, \"TC_0653_fr\": 0.0, \"TC_0673_ma\": 0.0, \"TC_0673_fr\": 0.0, \"TC_0689_ma\": 0.0, \"TC_0689_fr\": 0.0, \"TC_0729_ma\": 0.0, \"TC_0729_fr\": 0.0, \"TC_0731_ma\": 0.0, \"TC_0731_fr\": 0.0, \"TC_0750_ma\": 0.0, \"TC_0750_fr\": 0.0, \"TC_0803_ma\": 0.0, \"TC_0803_fr\": 0.0, \"TC_0814_ma\": 0, \"TC_0814_fr\": 0.0, 
\"TC_0815_ma\": 0.0, \"TC_0815_fr\": 0.0, \"TC_0821_ma\": 0.0, \"TC_0821_fr\": 0.0, \"TC_0862_ma\": 0.0, \"TC_0862_fr\": 0.0, \"TC_0866_ma\": 0.0, \"TC_0866_fr\": 0.0, \"TC_0875_ma\": 0.0, \"TC_0875_fr\": 0.0, \"TC_0920_ma\": 0.0, \"TC_0920_fr\": 0.0 }, \"geometry\": { \"type\": \"LineString\", \"coordinates\": [ [ 32.7757113, -15.6138013 ], [ 32.7757155, -15.613735 ], [ 32.7757248, -15.6136963 ], [ 32.775747, -15.613651 ], [ 32.7757671, -15.6136232 ], [ 32.775802, -15.6135968 ], [ 32.7758445, -15.6135798 ], [ 32.7758854, -15.6135747 ], [ 32.775945, -15.6135767 ] ] } }\n", "]}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### UQ1 Results visualization" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Last, we can now manually download the results ( `.json` and `.feather`) and visualize them locally. We will break down here the snippets within the jupyter notebook at `user_question_1\\visualise_unified_cloud_results.ipynb`." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### UQ1 Results - Show the base graph network" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import geopandas as gpd\n", "import matplotlib.pyplot as plt\n", "from pathlib import Path\n", "import folium\n", "import pandas as pd\n", "\n", "root_dir = Path(\"user_question_1\")\n", "assert root_dir.exists()\n", "path_csv = root_dir.joinpath(\"event_weights.csv\")\n", "\n", "file_path_result = root_dir.joinpath(\"result_gdf.json\")\n", "gdf = gpd.read_file(file_path_result, driver=\"JSON\")\n", "gdf" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### UQ1 Results - Import scenario names and their weights" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# import the scenario names and their respective weights:\n", "scenarios_weights_df = pd.read_csv(path_csv)\n", "number_of_scenarios = len(scenarios_weights_df)\n", "\n", "\n", "aggregate_wl = \"max\" 
# this we should get from network_config or analysis_config\n", "\n", "aggregate_wls = {\n", " \"max\": \"ma\",\n", " \"mean\": \"me\",\n", " \"min\": \"mi\"\n", "}\n", "\n", "aggregate_wl_column = aggregate_wls[aggregate_wl]\n", "\n", "columns_to_count = []\n", "columns_to_count_fraction = []\n", "for name in scenarios_weights_df['name'].to_list():\n", " actual = name + f\"_{aggregate_wl_column}\"\n", " actual_fraction = name + \"_fr\"\n", " if actual in gdf.columns:\n", " columns_to_count.append(actual)\n", " columns_to_count_fraction.append(actual_fraction)\n", "\n", "\n", "weights_dict = {}\n", "for _, row in scenarios_weights_df.iterrows():\n", " weights_dict[row['name'] + f\"_{aggregate_wl_column}\"] = row['weight']" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### UQ1 Results - Modify the dataframe" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Parametrization\n", "threshold = 0.5 # would be nice to make this threshold into a slider.\n", "fraction_threshold = 0.0\n", "color_on_probability = False\n", "\n", "# Add a new column with the count of non-zero values across specified columns\n", "gdf['count'] = 0\n", "for i, row in gdf.iterrows():\n", " row_count = 0\n", " for hazard_depth, hazard_fraction in zip(columns_to_count, columns_to_count_fraction):\n", " if row[hazard_fraction] > fraction_threshold:\n", " if row[hazard_depth] > threshold:\n", " row_count += 1\n", " gdf.at[i, 'count'] = row_count\n", "\n", "\n", "for index, row in gdf.iterrows():\n", " total_weight = 0\n", " for col in columns_to_count:\n", " if row[col] > threshold:\n", " # Find the corresponding weight and add to total\n", " weight = weights_dict[col]\n", " total_weight += weight\n", " gdf.at[index, 'cum_pf'] = total_weight\n", "\n", "temp = gdf[gdf['count'] > 0]\n", "temp" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### UQ1 Results - Create a color map" ] }, { "cell_type": "code", "execution_count": 
null, "metadata": {}, "outputs": [], "source": [ "from branca.colormap import LinearColormap\n", " \n", "# Create a Folium pop_diff_map centered around the mean of the geometry\n", "center = [\n", " gdf['geometry'].centroid.y.mean(),\n", " gdf['geometry'].centroid.x.mean()\n", "]\n", "pop_diff_map = folium.Map(location=center, zoom_start=5, control_scale=True, tiles=\"cartodbpositron\")\n", " \n", " \n", "if color_on_probability:\n", " var = 'cum_pf'\n", "else:\n", " var = \"count\"\n", "\n", "vmin=(gdf[var]).min()\n", "vmax=(gdf[var]).max()\n", "# Classify values into ranges\n", "ranges = [vmin, vmin+((vmax-vmin)/4), vmin+2*((vmax-vmin)/4), vmin+3*((vmax-vmin)/4) , vmax]\n", "\n", "ranges=sorted(ranges)\n", "# Create a colormap from green to green to red using the overall min and max values\n", "colormap = LinearColormap(colors=['lightgreen', 'orange', 'darkred'],\n", " vmin=0,\n", " vmax=vmax\n", " )\n", " \n", "colormap.caption = 'Affected people-Optimal routes length difference (people x km)'\n", " \n", "\n", "\n", " \n", "\n", " \n", " # Add target_optimal_routes_with_hazard_gdf with created Colormap to feature_group\n", "for idx, row in gdf.iterrows():\n", " value = row['count']\n", " cum_pf = row['cum_pf']\n", " # color = 'blue' if (row['count'] > 0) else 'lightgrey' \n", "\n", " if row[var] == 0:\n", " color = 'lightgrey'\n", " else:\n", " color = colormap(row[var])\n", "\n", " # Extracting coordinates from MultiLineString\n", " coordinates = []\n", " if row['geometry'].geom_type == 'MultiLineString':\n", " for line in row['geometry']:\n", " coords = [(coord[1], coord[0]) for coord in line.coords]\n", " coordinates.extend(coords)\n", " else:\n", " coordinates = [(coord[1], coord[0]) for coord in row['geometry'].coords]\n", " # Create a popup with data\n", " popup_content = f\"Nb scenario disrupting {value}/{len(columns_to_count)}, cumulative pf: {cum_pf}\"\n", " popup = folium.Popup(popup_content, max_width=300)\n", "\n", " folium.PolyLine(\n", " 
locations=coordinates,\n", "        color=color,\n", "        weight=3,\n", "        opacity=1,\n", "        popup=popup\n", "    ).add_to(pop_diff_map)\n", "\n", "colormap = colormap.to_step(index=ranges)\n", "colormap.caption = 'Number of scenarios for which link is disrupted'\n", "colormap.add_to(pop_diff_map)\n", "\n", "# pop_diff_map.save(root_dir / \"map_figure.html\")\n", "\n", "pop_diff_map" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## User Question 2" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### UQ2 Description" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "> __Disclaimer:__ This user question requires further work to be fully complete. Take the following sections as an initial approach." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "UQ2: Where are the flood impacts most disruptive in terms of accessibility? (detour length) = multi-link redundancy.\n", "\n", "The difference between this and [UQ.1](#user-question-1) lies mostly in the post-processing (and therefore the interpretation) of the results. The handling of the input (pre-processing) and the processing of the data (cloud workflow) are described in the aforementioned section.\n", "\n", "- __Input__: tbd\n", "- __Expected output__: tbd\n", "- __Expected post-processed results__: tbd\n", "- __Acceptance level__: tbd\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### UQ2 workflow" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For this user question we apply a slight modification of the [UQ.1 workflow](#uq1-workflow). The file can also be found at `user_question_2\hackathon_workflow.yaml`." 
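] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `read-members` step in the workflow below lists the flood maps under the `sfincs_floodmaps_sub/` prefix in S3 and derives a scenario (member) name from each object key. A small local sketch of that key parsing (hypothetical keys, no AWS access needed):" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Local sketch of the `read-members` key parsing; the keys are illustrative.\n", "def member_from_key(key: str) -> str:\n", "    # \"sfincs_floodmaps_sub/TC_0064.tif\" -> \"TC_0064\"\n", "    return key.split(\"/\")[1].split(\".\")[0]\n", "\n", "keys = [\n", "    \"sfincs_floodmaps_sub/TC_0064.tif\",\n", "    \"sfincs_floodmaps_sub/TC_0098.tif\",\n", "    \"sfincs_floodmaps_sub/\",  # the prefix itself yields an empty name and is skipped\n", "]\n", "members = [m for m in (member_from_key(k) for k in keys) if m != \"\"]\n", "print(members)  # ['TC_0064', 'TC_0098']" 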
] }, { "cell_type": "code", "execution_count": null, "metadata": { "vscode": { "languageId": "yaml" } }, "outputs": [], "source": [ "# Content of `user_question_2\\hackathon_workflow.yaml`\n", "apiVersion: argoproj.io/v1alpha1\n", "kind: Workflow\n", "metadata:\n", " generateName: ra2ce-hackathon-uq2-\n", "spec:\n", " entrypoint: scenario-workflow\n", " templates:\n", " - name: scenario-workflow\n", " steps:\n", " - - name: define-subdirs\n", " template: read-members\n", " - - name: run-scenario\n", " template: run-scenario\n", " arguments:\n", " parameters:\n", " - name: member\n", " value: \"{{item}}\"\n", " withParam: \"{{steps.define-subdirs.outputs.result}}\"\n", " - - name: post-processing\n", " template: post-processing\n", "\n", " - name: read-members\n", " script:\n", " image: 798877697266.dkr.ecr.eu-west-1.amazonaws.com/boto3:latest\n", " workingDir: /data\n", " command: [python]\n", " source: |\n", " import boto3\n", " import json\n", "\n", " bucket = 'ra2ce-data'\n", " prefix = 'sfincs_floodmaps_sub/'\n", " \n", " client = boto3.client('s3')\n", " result = client.list_objects(Bucket=bucket, Prefix=prefix, Delimiter='/')\n", "\n", " members = []\n", " for o in result.get('Contents'):\n", " mem = o.get('Key').split('/')[1].split('.')[0]\n", " if mem != \"\":\n", " members.append(mem)\n", " print(json.dumps(members))\n", "\n", " - name: run-scenario\n", " container:\n", " image: containers.deltares.nl/ra2ce/ra2ce:latest\n", " command: [\"python\", \"/scripts/user_question_2/hazard_overlay_cloud_run_analysis.py\"]\n", " nodeSelector:\n", " beta.kubernetes.io/instance-type: \"m5.xlarge\"\n", " inputs:\n", " parameters:\n", " - name: member\n", " artifacts:\n", " - name: input\n", " path: /input/{{inputs.parameters.member}}.tif\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: sfincs_floodmaps_sub/{{inputs.parameters.member}}.tif\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", 
" secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " - name: data\n", " path: /data\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/data\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " - name: scripts\n", " path: /scripts\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/scripts\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " outputs:\n", " artifacts:\n", " - name: ra2ce-output\n", " path: /data\n", " s3:\n", " bucket: ra2ce-data\n", " endpoint: s3.amazonaws.com\n", " region: eu-west-1\n", " key: beira_qualimane_sfincs_fm/output_q2/{{inputs.parameters.member}}/\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", "\n", " - name: post-processing\n", " container:\n", " image: containers.deltares.nl/ra2ce/ra2ce:latest\n", " command: [\"python\", \"/scripts/user_question_2/post_processing.py\"]\n", " #command: [sh, \"-c\", \"for I in $(seq 1 1000) ; do echo $I ; sleep 1s; done\"]\n", " nodeSelector:\n", " beta.kubernetes.io/instance-type: \"m5.xlarge\"\n", " inputs:\n", " artifacts:\n", " - name: output\n", " path: /data\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/output_q2\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " - name: scripts\n", " path: 
/scripts\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/scripts\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}\n", " outputs:\n", " artifacts:\n", " - name: pp_output\n", " path: /output\n", " s3:\n", " endpoint: s3.amazonaws.com\n", " bucket: ra2ce-data\n", " key: beira_qualimane_sfincs_fm/postprocessing_output_q2\n", " region: eu-west-1\n", " accessKeySecret:\n", " name: my-s3-credentials\n", " key: accessKey\n", " secretKeySecret:\n", " name: my-s3-credentials\n", " key: secretKey\n", " archive:\n", " none: {}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### UQ2 Pre-processing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Same as in [UQ1 preprocessing](#uq1-pre-processing)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### UQ2 Processing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Yet another slight modification of the step in the [preprocessing of UQ.1](#uq1-pre-processing) that can be found at `user_question_2\\hazard_overlay_cloud_run_analysis.py`." 
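] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The script below simply takes the first `.tif` it finds under `/input`. A slightly more defensive selection helper (the function name is illustrative, not part of RA2CE) could sort for determinism and fail fast when the artifact was not mounted:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from pathlib import Path\n", "\n", "\n", "def select_hazard_map(input_dir: Path) -> Path:\n", "    # Sort for determinism and fail fast if no hazard map was mounted.\n", "    tifs = sorted(input_dir.glob(\"*.tif\"))\n", "    if not tifs:\n", "        raise FileNotFoundError(f\"No .tif hazard map found in {input_dir}\")\n", "    return tifs[0]" 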
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# user_question_2\\hazard_overlay_cloud_run_analysis.py\n", "\n", "import logging\n", "from pathlib import Path\n", "\n", "from ra2ce.ra2ce_handler import Ra2ceHandler\n", "\n", "\"\"\"\n", " This script runs a network WITHOUT an analysis,\n", " as we are only interested to retrieve the HAZARD overlay.\n", "\"\"\"\n", "\n", "# Create one network configuration per provided hazard.\n", "# We assume the whole input directory will be mounted in `/data`\n", "_root_dir = Path(\"/data\")\n", "assert _root_dir.exists()\n", "\n", "_network_file = _root_dir.joinpath(\"network.ini\")\n", "assert _network_file.exists()\n", "\n", "_analysis_file = _root_dir.joinpath(\"analysis.ini\")\n", "assert _analysis_file.exists()\n", "\n", "_tif_data_directory = Path(\"/input\")\n", "assert _tif_data_directory.exists()\n", "\n", "# Run analysis\n", "_handler = Ra2ceHandler(_network_file, _analysis_file)\n", "_handler.input_config.network_config.config_data.hazard.hazard_map = [\n", " list(_tif_data_directory.glob(\"*.tif\"))[0]\n", "]\n", "\n", "# Try to get only RELEVANT info messages.\n", "import warnings\n", "\n", "warnings.filterwarnings(\"ignore\")\n", "\n", "_handler.configure()\n", "_handler.run_analysis()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### UQ2 Post-processing" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As seen in the [workflow](#uq2-workflow), the results are still stored in the cloud. In particular, each of the hazard file combinations with our network has been stored at a different directory.\n", "\n", "In this [user question 2](#uq2-description), we will generate a unified overlay result and unified redundancy result. 
The two of them will be represented in both `.geojson` and `feather` formats.\n", "\n", "Check the contents of the `user_question_2\\post_processing.py` file:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# user_question_2\\post_processing.py\n", "\n", "from pathlib import Path\n", "import pandas as pd\n", "import geopandas as gpd\n", "\n", "# set the required parameters\n", "cloud_output_folder = Path(\"/data\")\n", "\n", "event_names = [\n", " folder.stem for folder in cloud_output_folder.iterdir() if folder.is_dir()\n", "]\n", "# save as a .py script\n", "aggregate_wl = \"max\" # this we should get from network_config or analysis_config\n", "\n", "aggregate_wls = {\"max\": \"ma\", \"mean\": \"me\", \"min\": \"mi\"}\n", "\n", "aggregate_wl_column = aggregate_wls[aggregate_wl]\n", "event_wl_column = \"EV1_\" + aggregate_wl_column\n", "event_fraction_column = \"EV1_fr\"\n", "fid_column = \"rfid\"\n", "\n", "for i, event_name in enumerate(event_names):\n", " overlay_output_path = (\n", " cloud_output_folder\n", " / event_name\n", " / \"static\"\n", " / \"output_graph\"\n", " / \"base_graph_hazard_edges.gpkg\"\n", " )\n", " overlay_output_gdf = gpd.read_file(overlay_output_path)\n", " output_path = (\n", " cloud_output_folder\n", " / event_name\n", " / \"output\"\n", " / \"multi_link_redundancy\"\n", " / \"example_redundancy_multi.gpkg\"\n", " )\n", " output_gdf = gpd.read_file(output_path)\n", "\n", " # Column mapping\n", " column_mapping_overlay = {\n", " event_wl_column: f\"{event_name}_\" + aggregate_wl_column,\n", " event_fraction_column: f\"{event_name}_fr\",\n", " }\n", " overlay_output_gdf = overlay_output_gdf.rename(columns=column_mapping_overlay)\n", "\n", " column_mapping = {\n", " \"diff_length\": f\"{event_name}_diff_length\",\n", " \"connected\": f\"{event_name}_connected\",\n", " }\n", " output_gdf = output_gdf.rename(columns=column_mapping)\n", " output_gdf.fillna(0, inplace=True)\n", "\n", " if i == 
0:\n", " # create the base gdf that aggregate all results\n", " overlay_result_gdf = overlay_output_gdf\n", " redundancy_result_gdf = output_gdf\n", " else:\n", " filtered_overlay_output_gdf = overlay_output_gdf[\n", " [fid_column, f\"{event_name}_\" + aggregate_wl_column, f\"{event_name}_fr\"]\n", " ]\n", " overlay_result_gdf = pd.merge(\n", " overlay_result_gdf,\n", " filtered_overlay_output_gdf,\n", " left_on=fid_column,\n", " right_on=fid_column,\n", " )\n", "\n", " filtered_output_gdf = output_gdf[\n", " [fid_column, f\"{event_name}_diff_length\", f\"{event_name}_connected\"]\n", " ]\n", " redundancy_result_gdf = pd.merge(\n", " redundancy_result_gdf,\n", " filtered_output_gdf,\n", " left_on=fid_column,\n", " right_on=fid_column,\n", " )\n", "\n", "\n", "# Ensure output directory exists\n", "_output_path = Path(\"/output\")\n", "_output_path.mkdir()\n", "\n", "overlay_result_gdf.to_file(\"/output/overlay_result.geojson\", driver=\"GeoJSON\")\n", "overlay_result_gdf.to_feather(\"/output/overlay_result.feather\")\n", "\n", "redundancy_result_gdf.to_file(\"/output/redundancy_result.geojson\", driver=\"GeoJSON\")\n", "redundancy_result_gdf.to_feather(\"/output/redundancy_result.feather\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### UQ2 Results visualization" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "__Disclaimer!:__ This part is not fully validated and it might require future work." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import geopandas as gpd\n", "import matplotlib.pyplot as plt\n", "from pathlib import Path\n", "import folium\n", "import pandas as pd\n", "from branca.colormap import LinearColormap\n", "\n", "root_dir = Path(\"user_question_2\")\n", "path_result = root_dir.joinpath(\"example_redundancy_multi.gpkg\")\n", "path_csv = root_dir.joinpath(\"event_weights.csv\")\n", "\n", "# read the GeoPackage (no driver argument needed; it is inferred from the extension)\n", "gdf = gpd.read_file(path_result)\n", "gdf = gdf.rename(columns={\"diff_length\": \"TC_0002_diff_length\"})\n", "gdf.fillna(0, inplace=True)\n", "\n", "gdf.head()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### UQ2 Results - import scenario names and their weights" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# import the scenario names and their respective weights\n", "scenarios_weights_df = pd.read_csv(path_csv)\n", "number_of_scenarios = len(scenarios_weights_df)\n", "\n", "# keep only the event columns that are actually present in the results\n", "columns_to_count = []\n", "for name in scenarios_weights_df['name'].to_list():\n", " actual = name + \"_diff_length\"\n", " if actual in gdf.columns:\n", " columns_to_count.append(actual)\n", "\n", "# map each event column to its weight and track the most likely event\n", "weights_dict = {}\n", "highest_weight = 0\n", "for _, row in scenarios_weights_df.iterrows():\n", " weights_dict[row['name'] + \"_diff_length\"] = row['weight']\n", " if row['weight'] > highest_weight:\n", " highest_weight = row['weight']\n", " most_likely_event = row['name'] + \"_diff_length\"\n", "\n", "print(scenarios_weights_df['name'].to_list())\n", "print(columns_to_count)\n", "print(weights_dict)\n", "print(most_likely_event, highest_weight)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### UQ2 Results - modify the dataframe" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Parametrization:\n", "result_type = \"max_detour\" # One of 
\"max_detour\" , \"most_likely_event\" , \"most_likely_detour\" , \"expected_detour\"\n", "\n", "# Add a new column with the count of non-zero values across specified columns\n", "gdf['total_risk'] = 0\n", "for index, row in gdf.iterrows():\n", " total_risk = 0\n", " for col in columns_to_count:\n", " weight = weights_dict[col]\n", " risk = weight * row[col]\n", " total_risk += risk\n", " gdf.at[index, 'total_risk'] = total_risk" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### UQ2 Results - create a color map" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "### RESULT map for total risk \n", "# Create a Folium pop_diff_map centered around the mean of the geometry\n", "center = [\n", " gdf['geometry'].centroid.y.mean(),\n", " gdf['geometry'].centroid.x.mean()\n", "]\n", "pop_diff_map = folium.Map(location=center, zoom_start=5, control_scale=True, tiles=\"cartodbpositron\")\n", " \n", " \n", "var = 'total_risk'\n", "\n", "vmin=(gdf[var]).min()\n", "vmax=(gdf[var]).max()\n", "# Classify values into ranges\n", "ranges = [vmin, vmin+((vmax-vmin)/4), vmin+2*((vmax-vmin)/4), vmin+3*((vmax-vmin)/4) , vmax]\n", "\n", "ranges=sorted(ranges)\n", "# Create a colormap from green to green to red using the overall min and max values\n", "colormap = LinearColormap(colors=['lightgreen', 'orange', 'darkred'],\n", " vmin=0,\n", " vmax=vmax\n", " )\n", " \n", " \n", "\n", "# Add target_optimal_routes_with_hazard_gdf with created Colormap to feature_group\n", "for idx, row in gdf.iterrows():\n", " value = row[var]\n", " # color = 'blue' if (row['count'] > 0) else 'lightgrey' \n", "\n", " if row[var] == 0:\n", " color = 'lightgrey'\n", " else:\n", " color = colormap(row[var])\n", "\n", " # Extracting coordinates from MultiLineString\n", " coordinates = []\n", " if row['geometry'].geom_type == 'MultiLineString':\n", " for line in row['geometry']:\n", " coords = [(coord[1], coord[0]) for coord in line.coords]\n", " 
coordinates.extend(coords)\n", " else:\n", " coordinates = [(coord[1], coord[0]) for coord in row['geometry'].coords]\n", " # Create a popup with data\n", " popup_content = f\"total risk: {value} meters\"\n", " popup = folium.Popup(popup_content, max_width=300)\n", "\n", " folium.PolyLine(\n", " locations=coordinates,\n", " color=color,\n", " weight=3,\n", " opacity=1,\n", " popup=popup\n", " ).add_to(pop_diff_map)\n", "\n", "# Discretize the colormap into the computed ranges and add a legend\n", "colormap = colormap.to_step(index=ranges)\n", "colormap.caption = 'Total risk (m)'\n", "colormap.add_to(pop_diff_map)\n", "\n", "# pop_diff_map.save(root_dir / \"map_figure.html\")\n", "\n", "pop_diff_map" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Map for the most likely event\n", "from branca.colormap import LinearColormap\n", "\n", "# Create a Folium map centered around the mean of the geometry\n", "center = [\n", " gdf['geometry'].centroid.y.mean(),\n", " gdf['geometry'].centroid.x.mean()\n", "]\n", "pop_diff_map = folium.Map(location=center, zoom_start=5, control_scale=True, tiles=\"cartodbpositron\")\n", "\n", "var = most_likely_event\n", "\n", "vmin = gdf[var].min()\n", "vmax = gdf[var].max()\n", "# Classify values into four equal-width ranges\n", "ranges = [vmin, vmin + (vmax - vmin) / 4, vmin + 2 * (vmax - vmin) / 4, vmin + 3 * (vmax - vmin) / 4, vmax]\n", "ranges = sorted(ranges)\n", "# Create a colormap from light green through orange to dark red using the overall min and max values\n", "colormap = LinearColormap(colors=['lightgreen', 'orange', 'darkred'],\n", " vmin=0,\n", " vmax=vmax\n", " )\n", "\n", "# Draw each edge, colored by its diff length for the most likely event\n", "for idx, row in gdf.iterrows():\n", " value = row[var]\n", " if row[var] == 0:\n", " color = 
'lightgrey'\n", " else:\n", " color = colormap(row[var])\n", "\n", " # Extract coordinates (folium expects (lat, lon) pairs)\n", " coordinates = []\n", " if row['geometry'].geom_type == 'MultiLineString':\n", " for line in row['geometry'].geoms:\n", " coords = [(coord[1], coord[0]) for coord in line.coords]\n", " coordinates.extend(coords)\n", " else:\n", " coordinates = [(coord[1], coord[0]) for coord in row['geometry'].coords]\n", " # Create a popup with data\n", " popup_content = f\"Diff length = {value} m (event {var})\"\n", " popup = folium.Popup(popup_content, max_width=300)\n", "\n", " folium.PolyLine(\n", " locations=coordinates,\n", " color=color,\n", " weight=3,\n", " opacity=1,\n", " popup=popup\n", " ).add_to(pop_diff_map)\n", "\n", "# Discretize the colormap into the computed ranges and add a legend\n", "colormap = colormap.to_step(index=ranges)\n", "colormap.caption = 'Diff length (m)'\n", "colormap.add_to(pop_diff_map)\n", "\n", "# pop_diff_map.save(root_dir / \"map_figure.html\")\n", "\n", "pop_diff_map" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.10.13" } }, "nbformat": 4, "nbformat_minor": 2 }