The future of instrumentation and monitoring is exciting but there is still a lot to learn from the past, according to Geotechnical Observations’ Andrew Ridley
Since GE was launched in 1968, the internet has revolutionised publishing, and the geotechnical instrumentation and monitoring sector has undergone a similar step change. But while the digital era has sped up and automated data acquisition, the challenge still lies in interpreting the data and turning it into useful information.
Despite the internet revolution and technological advances in instrumentation and monitoring, Geotechnical Observations managing director Andrew Ridley believes that much benefit can still be derived from sometimes forgotten techniques.
Ridley points to a number of technical papers published in GE over its 50 year history, which he says chart developments in instrumentation and demonstrate cutting edge monitoring techniques that could still benefit modern construction projects.
“While some of these papers are decades old, there is still a lot we can learn from them,” he says. “Take Littlejohn’s 1973 paper on introducing micrometer sticks – I remember using these during a research project on the Jubilee Line Extension in the 1990s and they are still used today to get high quality data for research purposes, but they are rarely used in industry.
“We can also learn a lot from the treatment of the data, which is still relevant today.”
Ridley also points to Jackson and Kirby’s 1974 paper on settlement plates that incorporates a deep datum as another example. “It’s a great technique that is not used in practice, but it could be,” he says. “West, Heath and McCaul’s 1981 paper on the use of photogrammetry is a primitive application of a very up to date technique and Whittle, Gutmanis and Shilston’s paper from 1983 used satellite images to search for faults – it’s a very good early presentation of the technology.
“There is a lot more to remote monitoring than has yet been embraced by the UK construction sector.”
Ridley believes that GE has been at the forefront of presenting innovative work involving instrumentation and monitoring in geotechnics and its papers are often as relevant today as they were when they were published.
“The industry could gain significantly from looking back at these techniques and seeing how they could be re-applied using modern technology,” he says.
“When GE launched in the 1960s, nearly everything was monitored manually by people who had to visit sites,” says Ridley. “Yet today many measurements are gathered automatically and some people rarely see where the instrument is.
“When I was an undergraduate, we did surveying field courses in North Wales and I remember that we had an early electronic distance measurement device, a forefather of the total stations that today are everywhere and work automatically.
“Some things come and go whereas others persist.”
Vibrating wire sensors started to develop in the 1950s and 1960s, but it was not until the 1980s or 1990s that they became accepted. Today the technology is one of the most popular in use.
In the 1980s and 1990s electrolevels were commonly used for monitoring tilt, but they have now largely been superseded by microelectromechanical systems (MEMS) sensors. Emerging in the late 1990s, MEMS sensors changed the speed at which measurements can be taken and reduced the cost of instruments and data collection.
Ridley says that automatic systems have become much more reliable in recent years, although he still questions the time it takes to analyse and make sense of some monitoring techniques.
“Monitoring needs to give results in a realistic timescale for a project itself, otherwise it is just being reviewed so that it benefits the next generation of projects.”
Ridley believes that we must also remember to learn from past mistakes, such as the Heathrow Express tunnel collapse in 1994 and Singapore’s Nicoll Highway collapse in 2004. “Systems are undoubtedly much better these days and can automatically issue alarms that can help to prevent disasters, but the data still needs to be properly understood by those transmitting and those receiving it,” he says.
While the equipment itself has evolved, Ridley says that it is the internet that really changed everything when it comes to instrumentation and monitoring.
Nonetheless, not all the changes have been wholly beneficial.
“There is a lot of focus on getting the data quickly, but the emphasis should be on getting the information from that data rather than just the data itself,” he says.
“Some clients are now expecting to get data less than a second after a reading is taken.”
Ridley believes that is a tall order, even in the long term.
“Servers can help speed up the data processing, but processing, checking and validating the data is only part of the system. The data also needs to be interpreted for it to be useful, and that can take time,” he says.
“Artificial intelligence could help make it possible to understand the data quicker but that may take years to develop. Until then, we, as in humans, still need to make sense of the data and people will need to be trained in how to do that.
“Systems that not only deliver the data but also present it in a way that is easy to understand and fits with the way we design our projects will help.
“Machine learning has potential to be used here to assist in interpreting the data.”
To get good data you not only need good instruments, they also have to be installed properly, so Ridley believes the industry needs to think about proper training at the grassroots level. The development of recognised standards will help.
“Crossrail had a tunnelling academy but I strongly believe that we need something similar for instrumentation and monitoring, particularly with HS2 on the horizon,” he suggests.
“HS2 estimates that it will spend four times as much as Crossrail on instrumentation and monitoring, which suggests an investment of over £650M, and we need skilled professionals to install, read and interpret the results.”
Sharing of data is also something that must be addressed, according to Ridley. “Where has all the data from Crossrail gone?” he asks. “By the time Crossrail 2 or other projects need it, maybe 20 years down the line, how will the data be found and accessed?
“The industry needs something similar to the British Geological Survey’s borehole library for collecting monitoring data in the way they collect borehole logs. We need an industry wide system.”
Ridley concludes: “When the big technology companies, such as Google and Amazon, realise they could become part of the construction industry through the data side of the sector, then we will see big change, and fast-paced change too.
“They currently host large amounts of data but they could be driving it.
When they realise the potential, it will be a game changer and will change the whole sector.”