
Where Did Real-Time Modeling and Digital Twins Come From?

Tom Walski, Ph.D., P.E., Senior Product Manager, Water



I recently returned from the 2022 AWWA Annual Conference. Two popular trends discussed at the conference were real-time modeling and digital twins, which made me want to go back in time and find out where those concepts originated.

The first paper I could find was presented by a guy named Tom Walski at the 1987 AWWA Annual Conference, called “Modeling and Planning Techniques—The Next Five Years.” It was part of a session organized by Walter Grayman, Ph.D., P.E. (retired), with a title along the lines of “Computers in the Water Industry—The Next Five Years.” It featured speakers like Walter, Bob Clark from the U.S. Environmental Protection Agency (EPA), and the late Rolf Deininger, professor emeritus of environmental health sciences, School of Public Health, University of Michigan.

Up to that time, there had been some work on optimizing pump scheduling by people like Uri Shamir, Professor Emeritus, Israel Institute of Technology; Bryan Coulbeck and C.H. Orr from the UK; Lindell Ormsbee, Executive Director, Tracy Farmer Institute for Sustainability and the Environment; Don Chase, Director of Undergraduate Studies, University of Dayton; and me in the U.S. I had been talking about integrating modeling into operations for some time, but this was the first paper I could find discussing operational modeling in general.

During my walk down memory lane, it became clear that some of the quotes from the paper sound as if they could have been written last month. For example:

  • “Very often water system operators would like to know the effect of a decision such as turning on a pump or resetting a pressure reducing valve before they carry out the action. While experienced operators generally have a “feel” for the effects of their decisions, someday they will have access to models which can simulate the effects of such a decision immediately before it is put into effect.”
  • “Another possible application of real-time control is analyzing a shutdown of a system for emergency or routine repairs. Using a model, an operator can identify potential problems and develop and analyze possible interim solutions.”
  • “Both simulation and optimization models can be used to identify the least costly combination of pumps to operate over time to minimize energy costs… A real-time control program, linked with telemetered data, can be used to decide which pumps should be running at any given time.”

Additionally, there were sections in the paper on topics like forecasting and databases.

Of note, the paper was not just about promoting the use of models in operations. It also contained some precautionary messages about the pitfalls that many have fallen into along the way. For example:

  • “… a computer model can solve problems that no one could several years ago. On the other hand, given poor information, computer models can generate large sets of ridiculous answers in a short time.”
  • “A utility, however, should not rush into developing computerized data bases. The data base must be developed by a computer programmer working, not in isolation, but in close coordination with those who will use the results and those responsible for data entry.”

There were also sections on increased use of transient models, reliability analysis, and water quality analysis. These predictions have pretty much all come true.

It’s important to note that, as with any 35-year forecast, the paper wasn’t perfect. I talked about “telemetry” instead of “SCADA,” “expert systems” (a subset of what we now call “artificial intelligence”) instead of “AI,” and “data bases” instead of “data lakes” and other 2022-ish terms. Notably, while I talked about tying these ideas together, the digital twin concept itself wasn’t introduced until 2002, by Michael Grieves, as he recounts in his paper “Origins of the Digital Twin Concept.”

In reflection, I’m amazed how generally accurate this 1987 paper was. In fact, the biggest inaccuracy was predicting that these changes would happen in five years, when the more accurate estimate would have been 35 years. Clearly, my optimism about the rate of progress was unjustified. But this is a very conservative industry.

If you’d like to see a copy of this paper, send me an email (tom.walski@bentley.com), and I’ll send it along.

As a side note to this blog, I was able to review the paper because it was published in a hardcopy conference proceedings book, a different mode of discovery. By contrast, if I try to find a paper that was published digitally in, say, the early 2000s, it is often on media that is not readily readable. Those documents are likely on CDs, and most of today’s computers don’t have CD drives. Another deterrent is that the media may have deteriorated to the point that it is unreadable.

The good news in this case: Walter Grayman actually gave me a recording of my talk. Try doing that for a paper presented in 2015.
