Transforming network operations: From a hydraulic model to an automated, operational digital twin in six easy steps
Hosted by David Fortune, Vice President of Innovation (Innovyze) and Rebecca Willey, Solutions Engineer (Innovyze)
The topic was introduced by David Fortune, who outlined his view of the elements needed to create a digital twin and why they are so powerful. Digital twins matter because our networks affect people, so being able to manage operations efficiently in real time is essential to reduce the impact on customers. Ultimately, the assets are out of sight: they are below ground, and much of the time they cannot be inspected directly. The digital twin aims to unite this real-world asset data, whether for storm, sewer or water networks, with operational control inside a hydraulic model. This enables improved visibility and foresight, delivering the power to see how the network will perform in the hours and days ahead, and acts as the catalyst for moving from reactive to proactive management.
We are constantly told there isn't enough data, but often there is; the real challenge is using that data more intelligently. In discussing this further we acknowledged that the amount of data required varies with the solution and the purpose of network operations. When calibrating a hydraulic model prior to incorporating it within a digital twin, the level of calibration and the live data should be consistent with each other so that the required calibration levels can be achieved. This helps you answer your questions about the network and understand why it is behaving the way it is. Standard calibration techniques and tolerances were discussed, and the CIWEM UDG guidelines used within the UK and in Europe were outlined as a best-practice process.
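A calibration check of this kind can be sketched as a comparison of simulated against observed event statistics. This is a minimal illustration in the spirit of the CIWEM UDG guidance; the tolerance bands below are illustrative assumptions, not quoted from the Code of Practice, so substitute the values required by your own calibration specification.

```python
def within_band(simulated: float, observed: float, low: float, high: float) -> bool:
    """True if the simulated value's relative error against the observed
    value falls within [low, high] (e.g. -0.15 to +0.25)."""
    error = (simulated - observed) / observed
    return low <= error <= high


def verify_event(sim_peak, obs_peak, sim_volume, obs_volume):
    """Check one monitored storm event against illustrative tolerance
    bands (assumed here: peak flow -15%/+25%, volume -10%/+20%)."""
    return {
        "peak_flow_ok": within_band(sim_peak, obs_peak, -0.15, 0.25),
        "volume_ok": within_band(sim_volume, obs_volume, -0.10, 0.20),
    }


result = verify_event(sim_peak=105.0, obs_peak=100.0,
                      sim_volume=88.0, obs_volume=100.0)
print(result)  # peak at +5% passes; volume at -12% fails the -10% band
```

Running such checks per event and per monitor makes the calibration level explicit, so you can judge whether the model is consistent enough with the live data to sit inside the twin.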
Numerous participants questioned whether a hydraulic model is required for a digital twin if the other corporate systems are already connected. The consensus was that it depends very much on the purpose and the need for a hydraulic model. This will vary, but ultimately some network questions, incident responses, and scenarios can only be assessed with hydraulics. For one participant, working with a digital twin enabled the use of electrical conductivity instrumentation to further calibrate those models in real time.
By using a hydraulic model as part of your digital twin, alarms can be generated from comparisons of live data with model predictions, letting you see changes and divergence from tolerance before fixed telemetry alarm thresholds are reached. In this way the data is used more intelligently: alerts are raised only when the divergence tolerance is actually exceeded, allowing the digital twin to deliver proactive alarm management.
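The comparison described above can be sketched as a per-sensor deviation check. The sensor names and the 10% deviation tolerance are illustrative assumptions; a production twin would run this continuously against live SCADA feeds rather than static dictionaries.

```python
def deviation_alerts(predicted, observed, tolerance=0.10):
    """Compare model predictions with live readings per sensor and
    return the sensors whose relative deviation exceeds the tolerance,
    flagging divergence before a fixed telemetry alarm threshold is hit."""
    alerts = []
    for sensor, pred in predicted.items():
        obs = observed.get(sensor)
        if obs is None:
            continue  # no live reading for this sensor
        deviation = abs(obs - pred) / max(abs(pred), 1e-9)
        if deviation > tolerance:
            alerts.append((sensor, round(deviation, 3)))
    return alerts


predicted = {"DMA01_flow": 12.0, "DMA02_pressure": 3.5}
observed = {"DMA01_flow": 14.0, "DMA02_pressure": 3.4}
print(deviation_alerts(predicted, observed))  # [('DMA01_flow', 0.167)]
```

Because the trigger is model-vs-data divergence rather than an absolute reading, the flow sensor raises an alert even though its value may still be well inside the telemetry alarm band.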
The amount of data, and when it is best to use specific data, was also discussed. More alarms can be more of a hindrance than a help: they can distract from larger problems, and it is hard to tell whether data points are connected or in isolation when there is no standardised way for operators to know which alarms to ignore and which to action. Since the purpose of the digital twin varies, so do the data and telemetry requirements; for example, someone concerned only with a water quality digital twin will need vastly different data frequency and quality from a user focusing on DMA pressure.
As well as live telemetry data, for a digital twin to be a true asset twin the network asset data needs to be mirrored into the live twin. It is therefore essential to have the GIS and asset registry teams embedded in the project from the outset, at every level. This lets you understand what changes will flow from each corporate data system, and how frequently. These projects often start in modelling, operations or networks, but there is no doubt that a 'big business picture' is required to see how this focus can drive improvement in other areas of the company, such as improving data access for all and centralising data in one place.
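Mirroring asset data into the twin ultimately comes down to detecting what has changed in the corporate registry since the twin was last refreshed. This is a minimal sketch under assumed record layouts and field names; a real integration would read from the GIS and asset registry systems themselves.

```python
def diff_assets(registry: dict, twin: dict) -> dict:
    """Compare the corporate asset registry with the twin's copy of the
    network and report which assets must be added, removed, or updated."""
    added = [asset_id for asset_id in registry if asset_id not in twin]
    removed = [asset_id for asset_id in twin if asset_id not in registry]
    changed = [asset_id for asset_id in registry
               if asset_id in twin and registry[asset_id] != twin[asset_id]]
    return {"add": added, "remove": removed, "update": changed}


registry = {"PIPE-001": {"diameter_mm": 300},
            "PIPE-002": {"diameter_mm": 150}}
twin = {"PIPE-001": {"diameter_mm": 225},
        "PIPE-003": {"diameter_mm": 100}}
print(diff_assets(registry, twin))
# {'add': ['PIPE-002'], 'remove': ['PIPE-003'], 'update': ['PIPE-001']}
```

Running a diff like this on each refresh cycle makes the frequency and scale of change from each corporate system visible, which is exactly the question the GIS and asset teams need embedded in the project to answer.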