Soil evaporative loss is an important but usually only coarsely estimated component of the catchment water budget of dryland streams. Most estimates represent a potential maximal (reference) evaporative loss because they are based on the energy budget (Penman-Monteith) or on weather parameters (Hargreaves-Samani) and rarely account for the actual soil moisture content. While such estimates are credible in wet climates with precipitation distributed evenly throughout the year, they are less accurate in more arid regions, where rainfall can be strongly seasonal and potential pan evaporation can exceed precipitation ten-fold.
In this study, we used an RTG weighing lysimeter (Umwelt-Geräte-Technik, Germany) to quantify soil evaporative loss in situ and to assess how this loss varied between wet and dry cycles. Our experimental site was in the semiarid, subtropical Hamersley Basin of the Pilbara region of northwest Australia. Measured mean daily evaporative losses during dry cycles were 0.33 mm day-1 (2016, over 86 days) and 0.25 mm day-1 (2017, over 73 days). These rates were three to five times lower than the potential reference evaporative losses estimated with common theoretical calculations (0.95 mm day-1 in 2016 and 1.21 mm day-1 in 2017). During the wet cycle (2018, over 81 days), the measured evaporative loss was significantly higher (3.64 mm day-1) and similar to the potential reference evaporative loss (3.61 mm day-1); however, rates varied greatly, from 0 to 13.04 mm day-1, and increased significantly in the days following rainfall events during hot summers. At the scale of our study catchment (4,000 km2), the difference between the calculated theoretical potential reference evaporation and the actual measured evaporation led to a daily overestimation of ~5 GL during dry cycles. A newly proposed correction factor applied to the Hargreaves-Samani method significantly improved the accuracy of soil evaporative estimates based on weather parameters.
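For readers unfamiliar with the weather-based estimate discussed above, the standard Hargreaves-Samani (1985) formula and the mm-to-GL conversion used at catchment scale can be sketched as follows. The input temperatures and radiation value are illustrative placeholders, not measurements from our site, and the function names are hypothetical:

```python
import math

def hargreaves_samani_et0(t_mean, t_max, t_min, ra_mj):
    """Reference evapotranspiration ET0 (mm/day), Hargreaves-Samani (1985).

    t_mean, t_max, t_min: daily air temperatures (deg C)
    ra_mj: extraterrestrial radiation (MJ m-2 day-1); the 0.408 factor
           converts it to equivalent evaporation in mm/day.
    """
    return 0.0023 * 0.408 * ra_mj * (t_mean + 17.8) * math.sqrt(t_max - t_min)

def daily_overestimate_gl(et_ref_mm, et_meas_mm, area_km2):
    """Daily volume overestimate in GL: 1 mm over 1 km2 = 1 ML = 0.001 GL."""
    return (et_ref_mm - et_meas_mm) * area_km2 * 1e-3

# Illustrative hot-summer day (placeholder inputs, not site data):
et0 = hargreaves_samani_et0(t_mean=31.0, t_max=39.0, t_min=23.0, ra_mj=40.0)

# Gap between the 2017 reference rate (1.21 mm/day) and the measured
# dry-cycle rate (0.25 mm/day) over the 4,000 km2 catchment:
gap_gl = daily_overestimate_gl(1.21, 0.25, 4_000)  # ~3.8 GL/day
```

The conversion illustrates why even a sub-millimetre daily bias in the reference rate translates into gigalitre-scale errors once integrated over a catchment of thousands of square kilometres.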