A few weeks ago, NOAA (the US National Oceanic and Atmospheric Administration) issued a spring outlook predicting elevated flood risk through May for much of the continental United States.
The flood risk is highest in the Midwest, in the Mississippi and Missouri River basins, which have already seen heavy flooding from early spring storms on top of snowmelt.
The forecast reflects several factors, including above-average rainfall and a deeper-than-usual spring snowpack, which have left soil moisture and river levels higher than normal. We have already begun to see the devastation these conditions can cause: the Nebraska Emergency Management Agency estimated flood damage at over $1.3 billion in late March 2019.
Sadly, many flood models are outdated and are of little use in seasons like this one, when antecedent conditions are critically high. One study found no correlation between flood damage claim payouts and designated flood plains (CNT, 2014). This is a tragic statistic for our industry. Even when an accurate 2D model has been created for an area, it can be difficult to keep updated as new data arrives, and any request for real-time flood estimates may be out of the question given the time needed for telemetry processing, model updates, and run times with proper boundary conditions.
One of the main problems in flood modeling is the tradeoff among accuracy, speed, and manageability. If you try to solve everything with 1D representations, assembling every flow path water can take, with the correct data, becomes very tedious. In many cases a coupled 1D-2D model is essential, but with a gridded 2D mesh you need a resolution fine enough to capture surface features. That can result in prohibitively slow run times unless you leverage GPU processing (which we offer in both XPSWMM and InfoWorks ICM).
Traditional modeling approaches suffer from a lack of connection between models and sensor data. Our industry, like nearly all others, is being revolutionized by growth in sensor and data availability. One city I visited recently plans to increase its in-network sensors from 70 to around 3,000 over the next couple of years. That volume of data was unimaginable in our industry only a few years ago. With sensor data streaming in continuously, it makes sense to combine the sunk investment in calibrated models with that data to build a smart water network system. Without a fixed, living connection to incoming data, modelers are stuck in an endless and expensive cycle of pulling data, processing it, and recalibrating the model to a selected point in time.
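To make the idea concrete, here is a minimal sketch (hypothetical code, not ICMLive's API) of what a living model-to-sensor connection enables: continuously scoring modeled values against streaming gauge readings and flagging when the mismatch warrants recalibration. The stage values and the 0.1 m threshold are invented for illustration.

```python
import math

def rmse(modeled, measured):
    """Root-mean-square error between paired modeled and measured values."""
    pairs = list(zip(modeled, measured))
    return math.sqrt(sum((m - s) ** 2 for m, s in pairs) / len(pairs))

def needs_recalibration(modeled, measured, threshold_m=0.1):
    """Flag a gauge whose model-vs-sensor error exceeds the chosen threshold."""
    return rmse(modeled, measured) > threshold_m

# Hypothetical hourly stage readings (meters) at one gauge
modeled  = [1.02, 1.10, 1.25, 1.40, 1.52]
measured = [1.00, 1.12, 1.31, 1.55, 1.80]

print(round(rmse(modeled, measured), 3))          # growing divergence
print(needs_recalibration(modeled, measured))     # time to revisit inputs
```

With a live data feed, a check like this can run after every simulation cycle instead of once per multi-year recalibration project.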
During my time on the Innovyze team, I’ve had the privilege of working with several communities to implement real-time modeling systems using ICMLive. ICMLive is an off-the-shelf solution that connects an ICM model with telemetry data in an automated environment to produce forecasts and customized alerts. It can retrieve rainfall from multiple sources, including NEXRAD and NOAA forecast data, and stitch together simulations that track antecedent conditions and boundary conditions such as USGS stream gauges. Most users I have spoken with then use ICMLive to export results and alerts to online dashboards, so no one necessarily even needs to open the software. Modelers no longer have to retrieve or process timeseries data; every day offers a fresh comparison of model and measurements that can be used to tune model inputs for accuracy. It is definitely time for us to rethink how we approach modeling and data access.
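As a rough sketch of the kind of glue such workflows rely on, the snippet below parses a gauge payload into a timeseries usable as a model boundary condition. The JSON shape is a simplified stand-in loosely modeled on the USGS Instantaneous Values service, and the site number and readings are illustrative, not live data.

```python
import json
from datetime import datetime

# Simplified sample payload; real USGS Instantaneous Values responses
# (waterservices.usgs.gov) are more deeply nested than this.
sample = json.loads("""
{
  "site": "06805500",
  "parameter": "gage height, ft",
  "values": [
    {"dateTime": "2019-03-15T06:00:00", "value": "24.1"},
    {"dateTime": "2019-03-15T12:00:00", "value": "26.7"},
    {"dateTime": "2019-03-15T18:00:00", "value": "29.3"}
  ]
}
""")

def to_timeseries(payload):
    """Convert the gauge payload into (datetime, float) pairs for a boundary condition."""
    return [
        (datetime.fromisoformat(v["dateTime"]), float(v["value"]))
        for v in payload["values"]
    ]

series = to_timeseries(sample)
peak_time, peak_stage = max(series, key=lambda p: p[1])
print(peak_time, peak_stage)
```

In an automated setup, a script like this would run on a schedule, with the resulting timeseries fed into the model as an updated boundary condition rather than printed.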
In summary, if you have been disappointed by your current flood readiness analyses or your ability to offer warning or proactive protection measures, consider improving your models! Deploying an adaptive, accurate real-time flood forecasting and decision support system does take effort. However, tools are available to turn any model into an ICMLive pilot project with relative ease, using available scripts for radar, forecasts, and boundary conditions.