
The Evolution of Geotechnical Information Management


Bentley Expert


Why Cloud-Based Management is Growing Fast

All earth scientists know the geological time scale. From the Hadean eon of the Earth's formation billions of years ago, through the Paleozoic and Mesozoic eras of shifting land formations and bizarre primitive creatures, to the relatively recent Holocene epoch we live in now, the geological time scale describes the timing of and relationships between geological events throughout our planet's history. A similar time scale can help us understand the rapid evolution of geotechnical information management (GIM).

While studying the Earth and its geology has always generated large amounts of critical and interesting data, it was only in the past four decades that GIM became anything more than gathering piles of paper into filing cabinets and hoping nothing would be misplaced. Since then, GIM has evolved rapidly, allowing geoscientists to do their jobs better and faster and enabling them to find innovative new uses for the same data.

Now more than ever, geotechnical work relies on advanced information management that allows data to be collected easily, analyzed quickly, and stored securely. What's more, it must be reliable and accessible, allowing users to connect from anywhere (including directly in the field), streamline workflows, and ensure business continuity. For these reasons, it's unsurprising that a shift to cloud-based GIM is in motion. But how much do you know about what GIM used to look like? Understanding the challenges throughout GIM's evolution makes it clear why cloud-based management is growing fast.

The Pre-Computer Era

From the very beginning of geoscience through to the relatively ancient times of the early 1980s, geoscientists did not have easy access to computers. The few computers available were the size of entire rooms, incredibly expensive, short on reliable storage, and located far from the field. Salvatore Caronna, founder of gINT, said nearly every geotechnical activity had to be done manually. Taking measurements effectively sometimes required personal sacrifices. "Today, a lot of data reading is automated," he said. "Back then, the tests could last several days. I had to record at specific intervals the whole time. I had a cot at the site, I would wake up, I would take a reading, and I would go back to sleep."

Salvatore Caronna

In this era, everything, from borehole logs to lab test results to section diagrams, was recorded by hand. Back at the office, a typist had to transcribe the information into reports, which introduced errors. Caronna said field workers and typists had to continually proofread and revise, which sometimes meant retyping the entire report. Additionally, typists had to carefully align the paper to type information into specific fields on forms, which was time-consuming and costly.

Information management barely existed in this era. For the most part, it meant organizing overflowing folders in filing cabinets. Comparing data from previous projects or nearby sites meant diving into reams of paper. Making matters worse, these papers had a habit of multiplying, as the same information had to be copied and recopied by different people anytime they needed a report that displayed it in a different way. As a result, errors frequently crept in during copying. "QA and QC was a nightmare," Caronna said. "There wasn't a single source of truth. The same data was represented in different ways by different people."

The DOS and Floppy Disk Era

In August 1981, the IBM Personal Computer was born. No longer did computers have to be the size of an elephant and cost more than a lavish sports car. Suddenly, computing became convenient, affordable, and readily available. Many geotechnical departments quickly embraced the PC and took advantage of its ability to store up to 360 kilobytes of data on a 5.25-inch floppy disk. Entire groups of data and reports could be gathered into one place for easy access.

Caronna said the floppy disk also made accessing older data much easier. "When we started a new project, we had to write a proposal," he said. "We would grab as much data as we could from projects nearby. I spent many an hour going through paper reports to try to find information that was relevant to the proposal. With information on floppy disks, I could more easily pull data."

Though the technology was groundbreaking, change didn't happen overnight. Caronna said many organizations resisted PCs, believing that condensing data onto disks would make it easier to steal (realistically, large projects could require hundreds of disks, hardly easy to pocket). And while PCs made data entry easier and improved collaboration, they didn't immediately solve the problem of having to manually input the same data into every form that used it. In some cases, convenient access to PCs made data entry errors worse. "All of a sudden, executives, engineers, and management were typing their own stuff," he said. "And that led to a lot of complaints."

Many geotechnical organizations tried to build in-house solutions for geotechnical information management, but most were quickly dropped. In their place arose numerous companies that specialized in geotechnical software. Caronna said the desire to eliminate the tedium and inaccuracy of repeated data entry inspired him to leave geotechnical work and found gINT. By 1986, gINT had eliminated the data entry issue by allowing data to be entered once and reused wherever it was needed. "You could put one point in the database, and it's sent to log, graph, section, and exports," he said. "That replaced the work of five or six people."
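To make the single-entry idea concrete, here is a minimal sketch in Python (hypothetical names throughout, not gINT's actual code or data model) of one record feeding several outputs at once:

```python
# Illustrative sketch only: one record in a database feeds every output,
# so nothing is ever retyped. Not gINT's actual implementation.
from dataclasses import dataclass

@dataclass
class Sample:
    borehole: str
    depth_m: float
    description: str

def render_log(samples):
    # Borehole log view: one formatted line per sample.
    return "\n".join(f"{s.borehole} @ {s.depth_m:.1f} m: {s.description}"
                     for s in samples)

def export_csv(samples):
    # Export view: the same records, reformatted, with no copy errors.
    header = "borehole,depth_m,description"
    rows = (f"{s.borehole},{s.depth_m},{s.description}" for s in samples)
    return "\n".join([header, *rows])

samples = [Sample("BH-01", 2.5, "Stiff grey clay")]
print(render_log(samples))  # log output
print(export_csv(samples))  # export output, both from one source of truth
```

Because the log and the export both read from the same record, a correction made once is reflected everywhere, which is exactly the single source of truth that hand-copied paper could never provide.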

The Windows and Network Drive Era

Through the 1980s and early 1990s, geotechnical software continued to advance. Data collection became automated and capable of recording at faster intervals, eliminating the need for geoscientists to manually record data at the expense of sleep. PCs advanced as well. Not only did floppy disks evolve into a 3.5-inch format with a capacity of 1.44 megabytes, but computers also started to store data locally on hard drives. Eventually, organizations gained the ability to store data on a centralized network of connected PCs. These networks allowed data to be recorded and accessed by anyone without digging through a filing cabinet for the right folder or floppy disk. "Having a network allowed you to take a single project and access it remotely at a decent speed," Caronna said. "You were able to share and collaborate among offices and personnel."

Network drives changed the way geoscientists worked. Previously, data had to be processed relatively close to project sites, and geoscientists were forced to move to where the projects were. By accessing data remotely, increasing numbers of geoscientists could process data in the comfort of their home base. Clients could also access data in real time over the rapidly developing internet without getting their shoes dirty.

At the same time, Microsoft Windows made computers increasingly accessible. Windows NT and Windows 95 eventually eliminated the need to work within DOS entirely and enabled software to become more intuitive. The new interface helped GIM applications advance and extend their capabilities. "The database in the DOS version of gINT could be extended, but it was crude and limited," Caronna said. "In the Windows version, you could customize everything yourself. Smart reports allowed you to write the rule to make the report be anything you want." For example, if moisture content tests were not run on a borehole, a smart report could automatically suppress the moisture column and adjust the remaining columns appropriately, eliminating the need to build a new report by hand.

Geologists discovered numerous new ways to mine value out of data. However, the divide between the site and the office remained. Rough, dirty conditions would overwhelm most electronics, and the early field devices were awkward to use, Caronna said. Despite the great leap in data management and analysis, field workers were still stuck with pen and paper.
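The smart-report rule described above is easy to picture in code. The following sketch (hypothetical, not the gINT implementation) drops any column for which no borehole has a recorded value and lets the remaining columns fill the table:

```python
# Illustrative "smart report" rule: suppress columns with no data,
# then lay out the remaining columns. Not gINT's actual report engine.
def smart_report(rows, columns):
    # Keep only columns that contain at least one recorded value.
    present = [c for c in columns if any(r.get(c) is not None for r in rows)]
    # Compute a width per surviving column (header vs. longest value).
    widths = {c: max([len(c)] + [len(str(r.get(c, ""))) for r in rows])
              for c in present}
    lines = ["  ".join(c.ljust(widths[c]) for c in present)]
    for r in rows:
        lines.append("  ".join(str(r.get(c, "")).ljust(widths[c])
                               for c in present))
    return "\n".join(lines)

rows = [
    {"depth_m": 1.0, "soil_type": "Clay", "moisture_pct": None},
    {"depth_m": 2.0, "soil_type": "Sand", "moisture_pct": None},
]
# No moisture tests were run, so that column is suppressed automatically
# instead of someone designing a second report template.
print(smart_report(rows, ["depth_m", "soil_type", "moisture_pct"]))
```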

The On-Premises Application Server Age

By the mid-1990s, many segments of geoscience had developed the ability to share data among teams and clients across the world. But the process often wasn't easy at first. Roger Chandler, Co-Founder of Keynetix and Business Development Director at Bentley Systems, said his team had a particularly rough experience working onsite in the Philippines, supported by an engineering team in the United Kingdom, and providing materials to a client in the Netherlands. "Getting information off the vessel via satellite was a nightmare," he said.

Roger Chandler

Specialized yet primitive on-site devices evolved into ruggedized versions of the same devices found back in the office. Though some could run versions of the same applications that off-site researchers used, many were still too bulky and unwieldy to bring into project areas, and they didn't easily connect with devices back at the office.

Worldwide data sharing via online servers improved as technology advanced, though more data to share meant more demand for that data. Contractors began requesting the raw data behind the charts and reports. As they received increasing amounts of data from different organizations, they pushed for a single spreadsheet standard for everyone to fill in, rather than having to interpret a different format from every team. Data organization became standardized across geotechnical projects, as sketched below.

At first, online servers were just repositories of data: virtual filing cabinets. Even as applications began to be hosted centrally to unify standards, users treated them as just another place to store information, Chandler said. Users could access a project if they wanted to, but at first there was no way to fully collaborate on a project with knowledge of how the data was entered or changed. "Up through 2010, GIM was still thought of as an efficient way to create pieces of paper," Chandler said. "Paper sent through email was thought of as efficient."

Gradually, geotechnical desktop solutions started to enable limited collaboration. For example, HoleBASE began to allow more than one person to work on a project in the same office. Soon, all project data could be stored, accessed, and changed in the same place and in the same way. The ability to do even more with the same data gave rise to specialist data management roles. Close to three decades after the PC became widely available, nearly everything outside of the project sites themselves had been digitized. "We were able to take things much further than paper," Chandler said.
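The value of a shared standard is that every supplier's data can be checked the same way on arrival. This sketch uses invented field names (not an actual industry schema) to show the idea of validating incoming rows against one agreed format instead of hand-interpreting a different spreadsheet per team:

```python
# Hypothetical schema check: every supplier fills in the same fields,
# so conformance can be verified automatically. Field names are invented.
REQUIRED_FIELDS = {"borehole_id", "depth_top_m", "depth_base_m", "legend_code"}

def validate(row: dict) -> list:
    """Return a list of problems; an empty list means the row conforms."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - row.keys()]
    if "depth_top_m" in row and "depth_base_m" in row:
        # Basic sanity rule: a stratum's base must lie below its top.
        if row["depth_base_m"] <= row["depth_top_m"]:
            problems.append("depth_base_m must be greater than depth_top_m")
    return problems

row = {"borehole_id": "BH-01", "depth_top_m": 0.0,
       "depth_base_m": 1.2, "legend_code": "101"}
assert validate(row) == []  # this row conforms to the shared standard
```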

The Cloud Era

Eventually, both data and digital processes moved online, forming what we know as the cloud. Geotechnical processes became digital from the start, transforming the way work was carried out. Storage cabinets finally vanished from offices. But the transformation wasn't painless, as version control was still an issue at first. "Every week, clients would take information from the application and export it as the latest version of the data, then every company imported it," Chandler said. "This drove everyone nuts. No one knew what the latest info was, or what the differences between the data sets were."

Thankfully, applications evolved to eliminate these headaches, and cloud-based computing became the final piece of the digital transformation puzzle. Applications such as OpenGround were no longer bound to local computers and specific locations. Instead of having to retrieve data stored online for use within local applications, geoscientists could access data and carry out their work in the same online location, complete with version tracking that clearly showed who changed what. With no need to work on data offline, version control problems diminished significantly.

The process of transferring data also improved. Before applications migrated to the cloud, sharing data meant sending files back and forth. Now, team members can find and access the data within each application without having to decipher code. "You've taken the data transfer and you've put it behind the scenes of the applications, where it belongs, in the same way HTML makes your web browser work without having to email HTML files and import them into your browser," Chandler said.

The rise of smartphones and tablets provided the breakthrough on-site geoscientists were waiting for. Just add a ruggedized case, and they can bring convenient and familiar technology with them. Not only can they work with mobile versions of the geotechnical applications used throughout the team, they can collaborate with far-flung team members via the cloud. The last vestiges of pen and paper are starting to fade away. "These devices work in the rain, unlike pen and paper," Chandler said.
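The version tracking described here boils down to keeping an audit trail alongside the data itself. This minimal sketch (a hypothetical structure, not the OpenGround API) shows the principle: every change records who made it, when, and what the value was before:

```python
# Minimal sketch of cloud-era version tracking: each edit is logged with
# its author and timestamp, so "who changed what" is always answerable.
# Hypothetical structure, not any vendor's actual API.
import datetime

class AuditedRecord:
    def __init__(self, data: dict):
        self.data = dict(data)
        self.history = []  # ordered trail: (timestamp, user, field, old, new)

    def update(self, user: str, field: str, value):
        old = self.data.get(field)
        self.data[field] = value
        self.history.append((datetime.datetime.now(datetime.timezone.utc),
                             user, field, old, value))

record = AuditedRecord({"depth_m": 2.5})
record.update("s.caronna", "depth_m", 2.7)
for ts, user, field, old, new in record.history:
    print(f"{ts:%Y-%m-%d %H:%M} {user}: {field} {old} -> {new}")
```

With the edit history living in the same place as the data, there is no weekly export to reconcile and no question about which copy is the latest.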

Making the Move: Migration Paths for Desktop GIM Users

While the benefits of cloud-based GIM (and the limitations of legacy software) are clear, planning the move from a trusted and time-tested desktop application can be daunting. "Most organizations are worried about the downtime and risk associated with a migration, making them hesitant to rock the boat," Chandler explained. "The irony is that, once the migration is complete, a cloud-based solution ultimately reduces downtime and risk." As with any major technological shift, there eventually comes a time when a legacy product can no longer keep up with the needs of its users, who are often left without a path forward. It was for this reason that OpenGround was designed to provide a smooth migration plan for users of desktop GIM products. "Generally, migration from HoleBASE to OpenGround takes only a weekend," said Chandler.

Looking Ahead to the Future of GIM

Though the jump to the cloud was a huge evolutionary step, GIM continues (and will continue) to change and improve. For example, geotechnical companies are making great use of sensors and data collection devices, which make it easier than ever to monitor sites and gather information in near real time throughout a project. Digital twins can produce a detailed yet intuitive model of any project, improving both visibility and decision-making. And GIM will continue to evolve in new directions, some already in development and others we might not be able to imagine today. The new eras certain to emerge could make today's advancements look ancient.
