Quantifying seepage losses from unlined irrigation canals is necessary to improve water use and conservation. Heat is widely used as a tracer to quantify seepage rates across the sediment–water interface. In this study, field observations and two-dimensional numerical models were used to simulate seepage losses during the 2018 and 2019 irrigation seasons in the Truckee Canal system. Nineteen transects were instrumented with temperature probes and stage recorders, and inverse modeling was used to derive seepage flux and volumetric losses over the 39 km length of the canal. The numerical model for each transect was calibrated and validated using the two-year dataset. Soil zones and observation data were used in each numerical model to guide calibration of vertical and lateral heat and fluid fluxes. Model simulations were used to derive multivariable regression equations that consider stage, temperature, and hydraulic gradient. The results demonstrate the value of long-term datasets in capturing the seasonal effects of groundwater levels, siltation, stage, and temperature on seepage rates. Seepage rates estimated by the numerical models range from 0.16 to 4.6 m³ d⁻¹ m⁻¹. Total annual volumetric losses estimated for 2018 and 2019 were 1.6 × 10⁻² and 1.2 × 10⁻² km³, respectively. The seepage losses estimated by this study account for 32% to 41% of the inflow volumes. The regression models reproduced the seepage time series simulated by the numerical models reasonably well. In arid environments, water diverted into irrigation canals may be subject to seasonal temperature variations sufficient to influence the water accounting of conveyed surface flows.