Before underground electrical cables are designed and installed, it is important to understand the thermal properties of the in-situ soil or made ground, so that the heat produced by current flowing through the cable is properly dissipated and premature failures are avoided.
Estimates are often used, but where conditions are uncertain or variable along a cable route, a proper quantitative assessment of the ground should be made.
Above: Cable route showing 5 locations where soil thermal properties need measuring.
Soil thermal resistivity testing measures the capacity of the ground to conduct and dissipate heat. The thermal resistivity of a soil determines whether a buried power cable remains thermally stable or overheats. A build-up of heat around the cable can reduce transmission efficiency or, in the worst cases, cause it to melt.
Above: In-situ thermal resistivity testing with a needle probe.
The testing procedure involves taking measurements along the cable route at intermediate or transitional locations. A needle probe is inserted into a pit or an open trench at the proposed cable depth. Thermal resistivity and conductivity readings are then recorded and tabulated for use by the installation engineers when determining the capacity of the cable to be laid and the required sub-structure.
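As a rough illustration of how a needle-probe reading is reduced to a resistivity value, the sketch below applies the infinite line-source approximation commonly used for such probes: during constant heating, the probe temperature rises with the logarithm of time, and the slope of that rise gives the soil's thermal resistivity. The function name and the example numbers are illustrative, not taken from the article.

```python
import math

def thermal_resistivity(q, t1, t2, d_temp):
    """Estimate soil thermal resistivity (degC.m/W) from needle-probe data.

    Line-source approximation: during constant heating, the probe
    temperature rise between times t1 and t2 (seconds) is
        d_temp = (q / (4*pi)) * rho * ln(t2 / t1)
    so rho = 4*pi*d_temp / (q * ln(t2/t1)).

    q      : heater power per unit probe length (W/m)
    d_temp : temperature rise of the probe between t1 and t2 (degC)
    """
    return 4 * math.pi * d_temp / (q * math.log(t2 / t1))

# Illustrative reading: 5 W/m heater, probe warms 0.92 degC between
# 30 s and 300 s -> resistivity of roughly 1 degC.m/W.
rho = thermal_resistivity(q=5.0, t1=30.0, t2=300.0, d_temp=0.916)
```

In practice the early part of the heating curve is discarded (probe and contact effects dominate there) and the slope is fitted over the late, linear-in-log-time portion.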
Above: Example results of thermal resistivity testing at 3 locations.
The thermal resistivity of soils can vary greatly, from 0.3 °C·m/W to as high as 40 °C·m/W; the higher the reading, the more likely a cable is to overheat. The current rating of the cable should therefore be based on both the range and the highest value of soil thermal resistivity measured.
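To show why the rating depends on the measured resistivity, the sketch below uses a simplified, IEC 60287-style estimate of the external (soil) thermal resistance of a single buried cable and the permissible current for a given conductor temperature-rise budget. It assumes the soil term dominates and ignores the cable's internal thermal resistances and dielectric losses; all numbers are illustrative.

```python
import math

def soil_thermal_resistance(rho, depth, diameter):
    """External thermal resistance (degC.m/W) of a single buried cable.

    Approximation valid when burial depth >> cable diameter:
        T_soil = (rho / (2*pi)) * ln(4 * depth / diameter)
    rho in degC.m/W, depth and diameter in metres.
    """
    return (rho / (2 * math.pi)) * math.log(4 * depth / diameter)

def max_current(delta_theta, r_ac, t_soil):
    """Permissible current (A) from the temperature-rise budget.

    Steady state (soil term only): delta_theta = I**2 * r_ac * t_soil,
    so I = sqrt(delta_theta / (r_ac * t_soil)).
    r_ac is the AC conductor resistance in ohm/m.
    """
    return math.sqrt(delta_theta / (r_ac * t_soil))

# Illustrative case: rho = 1.2 degC.m/W, 1 m burial depth, 50 mm cable,
# 60 degC allowable rise, r_ac = 1e-4 ohm/m.
t4 = soil_thermal_resistance(rho=1.2, depth=1.0, diameter=0.05)
rating = max_current(delta_theta=60.0, r_ac=1e-4, t_soil=t4)
```

Because the permissible current falls as resistivity rises, sizing against the highest value measured along the route keeps the worst section within its temperature limit.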
Measuring the thermal resistivity of the in-situ soil allows potential problems to be identified early. Remedial measures can then be taken, including changing the capacity or insulation of the cables, or installing a corrective thermal backfill in the cable trench.
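The benefit of a corrective thermal backfill can be gauged with a rough scaling argument: if the external soil term dominates the thermal circuit, the permissible current scales with the inverse square root of the surrounding material's resistivity. The sketch below is that scaling only, with illustrative values; a real assessment would re-run the full rating calculation.

```python
import math

def rating_ratio(rho_native, rho_backfill):
    """Approximate uplift in permissible current from a thermal backfill.

    Assumes the external thermal resistance dominates and is
    proportional to resistivity, so I scales as 1/sqrt(rho):
        I_backfill / I_native = sqrt(rho_native / rho_backfill)
    """
    return math.sqrt(rho_native / rho_backfill)

# Illustrative: replacing a dry native soil of 2.5 degC.m/W with an
# engineered backfill of 0.9 degC.m/W raises the rating by roughly 2/3.
uplift = rating_ratio(rho_native=2.5, rho_backfill=0.9)
```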