Why Cloud-Based Management is Growing Fast
All earth scientists know the geological time scale. From the Archean eon shortly after the Earth’s formation billions of years ago, through the Paleozoic and Mesozoic eras of shifting land formations and bizarre primitive creatures, to the relatively recent Holocene epoch we live in now, the geological time scale describes the timing of and relationships between geological events throughout our planet’s history.

A similar time scale can help us understand the rapid evolution of geotechnical information management (GIM). While studying the Earth and its geology has always generated large amounts of critical and interesting data, it was only in the past four decades that GIM became anything more than gathering piles of paper into filing cabinets and hoping nothing would be misplaced. Since then, GIM has evolved rapidly, allowing geoscientists to do their jobs better and faster and enabling them to find innovative new uses for the same data.

Now more than ever, geotechnical work relies on advanced information management that allows data to be collected easily, analyzed quickly, and stored securely. It must also be reliable and accessible, allowing users to connect from anywhere (including directly in the field), streamline workflows, and ensure business continuity. For these reasons, it’s unsurprising that a shift to cloud-based GIM is underway. But how much do you know about what GIM used to look like? Understanding the challenges throughout GIM’s evolution makes it clear why cloud-based management is growing fast.

The Pre-Computer Era
From the very beginning of geoscience through to the relatively ancient times of the early 1980s, geoscientists did not have easy access to computers. The few pieces of computing equipment available were the size of entire rooms, incredibly expensive, lacked reliable storage, and resided far from the field. Salvatore Caronna, founder of gINT, said nearly every geotechnical activity had to be done manually. Effectively taking measurements sometimes required making personal sacrifices. “Today, a lot of data reading is automated,” he said. “Back then, the tests could last several days. I had to record at specific intervals the whole time. I had a cot at the site, I would wake up, I would take a reading, and I would go back to sleep.”
The DOS and Floppy Disk Era
In August 1981, the IBM Personal Computer was born. No longer did computers have to be the size of an elephant and cost more than a lavish sports car. Suddenly, computing became convenient, affordable, and readily available. Many geotechnical departments quickly embraced the PC and took advantage of its ability to store up to 360 kilobytes of data on a 5.25-inch floppy disk. Entire sets of data and reports could be gathered into one place for easy access.

Caronna said the floppy disk also made accessing older data much easier. “When we started a new project, we had to write a proposal,” he said. “We would grab as much data as we could from projects nearby. I spent many an hour going through paper reports to try to find information that was relevant to the proposal. With information on floppy disks, I could more easily pull data.”

Though the technology was groundbreaking, change didn’t happen overnight. Caronna said many organizations resisted PCs, believing that condensing data onto disks would make it easier to steal (realistically, large projects could require hundreds of disks – hardly easy to pocket). And while PCs made data entry easier and improved collaboration, they didn’t immediately solve the problem of having to manually input the same data into every form that used it. In some cases, convenient access to PCs made data entry errors worse. “All of a sudden, executives, engineers, and management were typing their own stuff,” he said. “And that led to a lot of complaints.”

Many geotechnical organizations tried to come up with in-house solutions for geotechnical information management, but most were quickly dropped. In their place arose numerous companies that specialized in geotechnical software. Caronna said the desire to eliminate the tedium and inaccuracy of repeated data entry inspired him to move on from geotechnical work and found gINT. By 1986, gINT had eliminated the data entry issue by allowing data to be reused wherever it was needed. “You could put one point in the database, and it’s sent to log, graph, section, and exports,” he said. “That replaced the work of five or six people.”

The Windows and Network Drive Era
Through the 1980s and early 1990s, geotechnical software continued to advance. Data collection became automated and capable of recording at shorter intervals, eliminating the need for geoscientists to manually record data at the expense of sleep. PCs advanced as well. Not only did floppy disks evolve into a 3.5-inch format with a capacity of 1.44 megabytes, but computers also started to store data locally on hard drives. Eventually, organizations gained the ability to store data on a centralized network of numerous connected PCs. These networks allowed data to be easily recorded and accessed by anyone without having to dig through a filing cabinet for the right folder or floppy disk. “Having a network allowed you to take a single project and access it remotely at a decent speed,” Caronna said. “You were able to share and collaborate among offices and personnel.”

Network drives changed the way geoscientists worked. Previously, data had to be processed relatively close to project sites, and geoscientists were forced to move to where the projects were. But by accessing data remotely, increasing numbers of geoscientists could process data in the comfort of their home base. Clients could also access data in real time over the rapidly developing internet without getting their shoes dirty.

At the same time, Microsoft Windows made computers increasingly accessible. Windows NT and Windows 95 eventually eliminated the need to work within DOS entirely and enabled software to become more intuitive. The new interface helped GIM applications advance and extend their capabilities. “The database in the DOS version of gINT could be extended, but it was crude and limited,” Caronna said. “In the Windows version, you could customize everything yourself. Smart reports allowed you to write a rule to make the report anything you want.” For example, if moisture content tests were not run on a borehole, a smart report could automatically suppress the column for moisture data and adjust the remaining columns accordingly, eliminating the need to generate a new report.

Geologists discovered numerous new ways to mine value from their data. However, the divide between the site and the office remained. The rough, dirty conditions would overwhelm most electronics, and the early field devices were awkward to use, Caronna said. Despite the great leap in data management and analysis, field workers were still stuck with pen and paper.

The On-Premises Application Server Age
By the mid-1990s, many segments of geoscience had developed the ability to share data among teams and clients across the world. But the process often wasn’t easy at first. Roger Chandler, Co-Founder of Keynetix and Business Development Director at Bentley Systems, said his team had a particularly rough experience working onsite in the Philippines, supported by an engineering team in the United Kingdom, and providing materials to a client in the Netherlands. “Getting information off the vessel via satellite was a nightmare,” he said.