Today, test and measurement is moving towards a software-defined approach, and geological T&M is no exception. This approach lets scientists make the best use of the time and effort they put into taking measurements, and gives engineers more flexibility and greater performance at lower cost.
Ashwin Gopinath
Thursday, June 27, 2013: Geological testing, or geotechnical investigation, is performed by geotechnical engineers or engineering geologists to unearth information about the physical properties of soil and rock around a site. The information is crucial when designing earthworks and foundations for proposed structures, and when repairing distress to earthworks and structures caused by subsurface conditions. Geotechnical investigations are also used to measure other parameters, such as the thermal resistivity of soil or backfill materials, which is needed for underground transmission lines, oil and gas pipelines, radioactive waste disposal and solar thermal storage facilities.
A geotechnical investigation is a process in which the physical properties of a site are assessed for the purpose of determining what kind of work can be done on the site safely. Before land can be developed or redeveloped, a geotechnical investigation of the concerned site is recommended as a safety procedure. This process is also required or recommended in the wake of incidents like earthquakes, the emergence of foundation cracks on land that was thought to be solid and so forth. The goal of such an investigation is to confirm that the land is safe to work on.
Sometimes geophysical methods are used to obtain data about sites. Sub-surface exploration usually involves soil sampling and laboratory tests of the soil samples retrieved. This can include complex processes of geological mapping, geophysical methods and photogrammetry, or it can be as simple as walking around the site to observe its physical conditions.
Need for geotechnical investigation
Geotechnical investigation is gaining prominence for a number of reasons. Chief among them is increased construction activity, not just in terms of volume but also complexity. Gone are the days when soil was the decisive factor determining the kind of construction possible at a site. Nowadays, the soil is made suitable for the project.
Geotechnical services encompass different functions, each as varied as the next, all of which involve measuring the effect of earth and soil on construction projects. Geotechnical laboratories may analyse soil samples at a proposed construction site to help assess whether the ground can support the needed amount of weight. Other types of inspections may also be conducted on a site to estimate the likelihood of potential damage from natural disasters such as earthquakes and landslides.
The majority of engineering construction projects begin with a thorough analysis of the proposed building site, which includes soil assessment. Geotechnical service professionals often run laboratories that conduct detailed analysis of the earth on location to provide a list of all the soil components and their proportions. The research also covers other soil characteristics, such as whether the ground is rocky, holds a lot of moisture or is heavily compacted. Engineers use this data to determine parameters such as weight-bearing capacity, permeability and shear strength, and design construction plans for the proposed building taking these parameters into account.
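To make the last point concrete, shear strength is commonly estimated with the Mohr-Coulomb relation τ = c + σ·tan(φ), where c is cohesion, σ is the effective normal stress and φ is the friction angle. The minimal Python sketch below illustrates the arithmetic; the soil values used are hypothetical examples, not data from any real site.

```python
# Illustrative sketch: estimating soil shear strength with the
# Mohr-Coulomb relation tau = c + sigma_n * tan(phi).
# The cohesion, friction angle and stress values below are
# made-up examples, not data from any real site.
import math

def shear_strength(cohesion_kpa: float, normal_stress_kpa: float,
                   friction_angle_deg: float) -> float:
    """Mohr-Coulomb shear strength in kPa."""
    return cohesion_kpa + normal_stress_kpa * math.tan(
        math.radians(friction_angle_deg))

# Example: a silty sand with c = 5 kPa and phi = 30 degrees,
# under 100 kPa of effective normal stress.
tau = shear_strength(cohesion_kpa=5.0, normal_stress_kpa=100.0,
                     friction_angle_deg=30.0)
print(f"Estimated shear strength: {tau:.1f} kPa")  # ~62.7 kPa
```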
Latest trends in geological T&M
The amount of data collected is increasing at a spectacular rate.
Satish Mohanram, technical marketing manager, National Instruments-India, elaborates, “Scientists and engineers doing test and measurement end up collecting huge amounts of data. To give some examples, seismic information taken at high sampling rates, structural studies done on historical monuments, property testing of objects, etc create data, and making sense out of this information is the objective of the exercise. This could become intimidating, considering the amount of data collected.”
Tools and technology used in geological testing
Dropping pipettes: Digital micropipettes provide a number of additional features over traditional pipettes, including a larger holding area for fatigue-free operation, digital counters and push-button sample extraction. The Nichipet EX autoclavable digital micropipette is one such tool; it also ensures that testing is not affected by the temperature of the technician's hand.
Magnetometers: Geophysical instruments like cryogenic, proton and spinner magnetometers are used to detect ferrous metals at very large depths. Newer quantum magnetometers take advantage of the spin of subatomic particles (nuclei and unpaired valence electrons). Through a process of polarisation, particles are caused to precess in the earth's ambient magnetic field, and the resulting precession frequency can be translated directly into magnetic field units. Quantum results are scalar (total field intensity) as opposed to vector (as from fluxgate geophysical instruments or GEM's suspended dIdD technology).
Water-level indicators: Rugged water-level indicators are now available, protected in Kevlar and featuring sturdy cables with multiple conductors that fit easily into standpipes and wells. Firms like Durham Geo also supply laser-marked cables for high-contrast measurements that do not wear off with use.
Downhole cameras: These heavy-duty, oceanographic-type cameras, built for high-pressure deployment, are used to inspect problems encountered in well drilling. The camera is suspended on a heavy-duty aluminium cable reel so that data can be carried to the technician without sacrificing the strength of the cable. Naeva Geophysics' Well-Vu WV-C1000 comes with a tri-legged stand and offers real-time video output to the field computer.
Software: Analytical and scientific packages like Landmark Graphics GeoGraphix, Parallel Geoscience SPW and Seismic Micro-Technology KINGDOM are available, and computer-aided design (CAD) software like Autodesk AutoCAD and Midland Valley 2DMove is also used by geological test technicians.
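The frequency-to-field translation mentioned in the box above follows from the proton Larmor relation f = (γp/2π)·B, where γp/2π is roughly 0.042576 Hz/nT for protons. A minimal sketch of the conversion (the example frequency is illustrative):

```python
# Sketch of the frequency-to-field conversion used by
# proton-precession magnetometers: the measured Larmor frequency
# f = (gamma_p / 2*pi) * B is solved for B, translating frequency
# directly into magnetic field units.
GAMMA_P_HZ_PER_NT = 0.042576  # proton gyromagnetic ratio / 2*pi, approx.

def field_from_precession(freq_hz: float) -> float:
    """Total magnetic field intensity (nT) from precession frequency (Hz)."""
    return freq_hz / GAMMA_P_HZ_PER_NT

# Example: a mid-latitude Earth field of ~50,000 nT precesses at ~2129 Hz.
print(f"{field_from_precession(2129.0):.0f} nT")  # ~50,005 nT
```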
Another emerging trend in geotechnical investigation is a greater emphasis on in-situ tests. In-situ literally means 'in position': these are tests conducted on the site itself, and they can be used to predict foundation behaviour with high reliability.
Products developed for regulated industries such as automotive, aerospace and medical must comply with rigorous development standards and certifications. Revisions to these standards place increased scrutiny on the quality and accuracy of test tools, creating an increased burden of proof to demonstrate that testers have been qualified for use. Though businesses outside these industries will not feel an immediate impact from these trends, they can benefit from detecting defects earlier in the life cycle, driving down product development cost.
Use of test automation software has increased rapidly over the last decade due to the need for highly customisable, flexible and capable measurement systems. Software-centric test solutions are the only viable approach for delivering complex technologies under aggressive timelines, limited resources and constant product churn.
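As a rough illustration of the software-centric approach, the hypothetical Python sketch below keeps the entire test plan (channels and pass/fail limits) in software, so changing it means editing data rather than rewiring hardware. The acquire() function is a stand-in for a real instrument driver, not any actual API.

```python
# Minimal sketch of a software-defined test sequence: measurement
# steps, pass/fail limits and reporting all live in software.
import random

def acquire(channel: str) -> float:
    """Placeholder for an instrument-driver read; returns a simulated value."""
    return random.gauss(1.0, 0.05)

# The test plan is plain data: (channel, lower limit, upper limit).
TEST_PLAN = [
    ("strain_gauge_1", 0.8, 1.2),   # hypothetical channel names
    ("pore_pressure", 0.9, 1.1),
]

for channel, lo, hi in TEST_PLAN:
    value = acquire(channel)
    status = "PASS" if lo <= value <= hi else "FAIL"
    print(f"{channel}: {value:.3f} -> {status}")
```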
Mohanram says, “Test and measurement, in today’s scenario, is moving more towards a software-defined approach. Geological T&M is no exception to it as it enables scientists to make the best use of the time and effort that they put in taking these measurements. This approach enables engineers to get more flexibility and greater performance at lower cost. The innovative platform-based Graphical System Design approach enables this new paradigm of test and measurement.”
Increasing product complexity and capability have a direct impact on the reliability, performance and accuracy of test systems. Consequently, “there is increased focus on ensuring quality and reliability of test software through life-cycle management and development practices formerly reserved for embedded systems. Some organisations are voluntarily applying these development practices to improve test software and build more feature-rich and defect-free test solutions, but a growing number of industries will be required to use similar practices to comply with regulatory standards,” explains a senior engineer from this field.
These standards set a high bar for process and quality, but best practices in software engineering ensure that test systems meet increasingly demanding feature and performance requirements.
Challenges in the field
As mentioned earlier, the amount of data recorded by geological test and measurement instruments can be enormous, as in seismic monitoring. Drawing accurate, meaningful conclusions and useful patterns from such large amounts of data (referred to as Big Data) is a growing problem. Big Data processing brings new challenges to data analysis, search, data integration, reporting and system maintenance.
In-situ tests
In-situ tests can greatly increase the volume of geomaterial investigated at a foundation site and save cost compared to sampling and lab testing. Historically, these tests were developed to evaluate specific parameters for geotechnical design. Some tests, such as the plate load test and the pile load test, measure the response to a particular type of load. These tests verify design assumptions and help determine soil or rock properties by inversion.
Standard penetration test (SPT): This in-situ dynamic penetration test is primarily designed to provide information on the geological engineering properties of the soil in question. The main information gained pertains to the relative density of granular deposits, such as sand and gravel, from which it is virtually impossible to obtain undisturbed samples. The main reasons for its widespread use are its relatively low cost and simplicity. The usefulness of SPT results depends on the soil type: fine-grained sands give the most useful results, while coarser sands and silty sands give reasonably useful results.
Cone penetration test (CPT): This in-situ test is performed using an instrumented probe with a conical tip, pushed into the soil hydraulically at a constant rate. Early applications of CPT mainly determined the bearing capacity of soil. The original cone penetrometers involved simple mechanical measurements of the total resistance to pushing a tool with a conical tip into the soil. The latest electronic CPT cones also employ a pressure transducer with a filter to gather pore water pressure data.
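Raw SPT blow counts are usually normalised before being fed into design correlations. Below is a minimal sketch of the common textbook N60 energy correction, N60 = N·(Em·Cb·Cs·Cr)/0.60; the correction-factor values are illustrative, not recommendations for any real test.

```python
# Sketch of the common textbook N60 correction for raw SPT blow counts,
# which normalises the field count to 60 per cent of the theoretical
# hammer energy. All factor values here are illustrative examples.
def spt_n60(n_field: int, energy_ratio: float = 0.55,
            c_borehole: float = 1.0, c_sampler: float = 1.0,
            c_rod: float = 0.85) -> float:
    """Energy-corrected SPT blow count N60."""
    return n_field * (energy_ratio * c_borehole * c_sampler * c_rod) / 0.60

print(f"N60 = {spt_n60(20):.1f}")  # raw N = 20 -> ~15.6
```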
Studies estimate that the amount of data being created is doubling every two years. A small data set often limits the accuracy of conclusions and predictions. A classical example is that of a gold mine where only 20 per cent of the gold is visible. Analysing Big Data is akin to finding the remaining 80 per cent which is in the dirt, hidden from view. This analogy leads to the term ‘digital dirt,’ which means that digitised data, more often than not, has concealed value. Hence, Big Data analytics, i.e., data mining, is required to achieve new insights.
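One practical answer to data that will not fit in memory is single-pass, chunked processing. The sketch below computes running statistics over an arbitrarily large raw sample file without ever loading it whole; the file name and float32 record layout are hypothetical.

```python
# Sketch of chunked, single-pass analysis for large measurement files:
# compute running statistics without holding the full dataset in memory.
import numpy as np

CHUNK = 1_000_000  # samples read per pass

def running_stats(path: str):
    """Single-pass mean and peak over an arbitrarily large float32 file."""
    count, total, peak = 0, 0.0, float("-inf")
    with open(path, "rb") as f:
        while True:
            block = np.fromfile(f, dtype=np.float32, count=CHUNK)
            if block.size == 0:
                break
            count += block.size
            total += float(block.sum())
            peak = max(peak, float(block.max()))
    return total / count, peak

# mean, peak = running_stats("seismic_run_01.f32")  # hypothetical file
```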
“In test and measurement field, data can be acquired at astronomical rates (as high as several terabytes per day). Big Analog Data issues are growing challenges for automated test and analysis systems. When there are many devices under test, distributed automated test nodes (DATNs) are needed, which are often connected to computer networks in parallel. Since DATNs are effectively computer systems with software drivers and images, the need arises for remote network-based systems management tools to automate their configurations, maintenance and upgrades,” explains a whitepaper on the issue.
The volume of test and measurement data is fuelling a growing need in global companies to offer access to this data to many more engineers than in the past. This requires network gear and data management systems that can accommodate multi-user access, which, in turn, drives the need to geographically distribute the data and its access. A growing approach to providing this distributed data access is the use of cloud technologies.
Big Analog Data applications create a strong dependency on IT equipment such as servers, networking and storage, along with the software needed to manage, organise and analyse the data. Thus traditional IT technologies are being seen as part of the total solution after data capture, ensuring efficient data movement, archiving, and execution of analytics and visualisation for both in-motion and at-rest data.
Several vendors, such as Averna, Virinco, National Instruments and OptimalTest, already have solutions to help manage Big Analog Data. To analyse, manage and organise billions of data points from millions of files, engineers and scientists use software tools like NI's DIAdem to quickly locate, load, visualise and report on measurement data collected during data acquisition or generated during simulation. These tools are designed to meet the demands of today's testing environments, which require quick access to, processing of and reporting on large volumes of scattered data in multiple custom formats. They also help interface the collected data with existing IT solutions, or create new servers that can be accessed globally for faster decisions.
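As a rough sketch of the locate-and-summarise workflow such tools automate (not DIAdem's actual API), the snippet below indexes measurement files scattered across a directory tree and prints quick per-file summaries; the paths and single-column CSV layout are hypothetical.

```python
# Sketch of a locate-and-summarise workflow over scattered measurement
# files: find matching files, then report per-file sample count and mean.
import csv
import glob
import statistics

def summarise(pattern: str) -> None:
    """Print sample count and mean for each CSV file matching the pattern."""
    for path in sorted(glob.glob(pattern, recursive=True)):
        with open(path, newline="") as f:
            values = [float(row[0]) for row in csv.reader(f) if row]
        if values:
            print(f"{path}: n={len(values)}, "
                  f"mean={statistics.fmean(values):.3f}")

# summarise("field_data/**/*.csv")  # hypothetical directory layout
```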
Reliability is always king
Needless to say, geotechnical investigations play a vital role in ensuring adequate performance of a structure. Geotechnical engineers are under pressure to develop reliable and economical designs for heavier loads and difficult soil conditions, which, in turn, necessitates strong performance from the test and measurement tools used.
The need of the hour is to predict the behaviour of structures to a very high degree of reliability. This has resulted in advanced in-situ testing methods that predict behaviour more rationally and accurately, enabling the most stable, economical foundation designs. With increased dataflow set to become an industry default rather than a differentiator in the test and measurement domain, engineers have to work on solutions capable of collecting, processing and analysing huge amounts of data, helping geotechnical investigators gain useful insights.
The author is a tech correspondent at EFY Bengaluru