Government-run residential energy saving programs have included insulating the first couple of metres of outlet pipe from hot water tanks, stemming from the beliefs that (1) this would reduce standing heat losses, and (2) insulating greater lengths would give diminishing returns. I think both of these are false, and would appreciate this forum's help in confirming or correcting my understanding.

Consider a hot water tank fully heated to 65 degC, with no pipe insulation and no hot water being used (i.e. only standing losses need be considered). Presumably, the temperature of the outlet pipe as it exits the tank would also be about 65 degC, and as you move along the pipe the temperature would decrease (exponentially?) until it reached ambient temperature. Now, if you add 2 metres of pipe insulation, isn't the same heat loss just being shifted further down the pipe?

As a thought experiment, imagine the outlet pipe had 2 metres of perfect insulation (i.e. infinite R-value), with the remainder uninsulated. Then, as you moved along the pipe, its temperature would stay at 65 degC until you reached the end of the insulation, at which point it would decrease until it reached ambient temperature. In other words, you'd have the same temperature profile along the same length of pipe, giving exactly the same rate of heat loss. And if the heat loss is exactly the same for no insulation as it is for perfect insulation, then I think it's safe to assume that the heat loss would be the same for any R-value in between. In conclusion, adding insulation to the first 2 m of the outlet pipe does not reduce standing losses.

What about usage losses? To simplify things, assume a constant rate of hot water usage, small enough for the tank element to maintain a supply of 65 degC hot water. Adding 2 m of pipe insulation would then help to reduce heat losses, and the water at the tap would be hotter than it would be without pipe insulation.
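To make the thought experiment concrete, here is a small numerical sketch of it. It assumes the standing temperature profile decays exponentially along the pipe; the decay length, heat-transfer coefficient, pipe perimeter and ambient temperature are all made-up illustrative values, not measurements.

```python
import math

T_TANK = 65.0    # degC, water temperature at the tank outlet
T_AMB = 20.0     # degC, ambient temperature (assumed)
LAMBDA = 0.5     # m, decay length of the exponential profile (assumed)
H = 10.0         # W/m^2.K, surface heat-transfer coefficient (assumed)
P = 0.06         # m, pipe outer perimeter (assumed, ~19 mm pipe)

def profile(x, insulated=0.0):
    """Pipe temperature at distance x (m) from the tank, with the first
    `insulated` metres perfectly insulated (profile shifted to the right)."""
    if x < insulated:
        return T_TANK  # no loss, so no cooling, under perfect insulation
    return T_AMB + (T_TANK - T_AMB) * math.exp(-(x - insulated) / LAMBDA)

def standing_loss(insulated=0.0, length=20.0, dx=0.001):
    """Total standing heat loss (W) from the pipe surface, by numerically
    integrating h * P * (T(x) - T_amb) along the uninsulated part."""
    n = int(length / dx)
    return sum(
        H * P * (profile((i + 0.5) * dx, insulated) - T_AMB) * dx
        for i in range(n)
        if (i + 0.5) * dx >= insulated  # insulated section loses nothing
    )

print(standing_loss(0.0))  # no insulation
print(standing_loss(2.0))  # first 2 m perfectly insulated
```

Both calls return the same value (h * P * (T_tank - T_amb) * lambda, about 13.5 W with these numbers), because shifting the profile 2 m down the pipe leaves the area under the curve unchanged.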
In practical situations, energy savings would depend on whether the volume of hot water used is adjusted for temperature. Two examples: (1) when showering/bathing, the person would mix in less hot water to achieve the desired temperature, and so energy savings would be realised; (2) when filling a washing machine with hot water, no energy would be saved, as the volume of hot water used is not affected by its temperature. Given that the vast majority of household hot water is used for showering/bathing, there may be a case for insulating hot water pipes in this way (if the benefits outweigh the costs).

How would increasing the length of pipe insulated affect temperature at the tap? Would the returns be diminishing, linear, or increasing? I believe the temperature along the length of the pipe would follow an exponential decay from 65 degC towards ambient. Adding X metres of perfect insulation would shift the curve X metres to the right, effectively cutting off the last X metres of the curve. So, as X is increased, the temperature at the tap would increase exponentially (albeit probably with a growth rate low enough to make it 'look' more or less linear). In conclusion, if it is worthwhile insulating the first 2 metres of outlet pipe, then it is even more worthwhile insulating the rest of it.

Am I right? If not, where am I going wrong, and what is an accurate way of thinking about it?
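For concreteness, here is a sketch of the tap-temperature model I have in mind: perfect insulation over the first X metres shifts the exponential decay curve X metres towards the tap, so the water only cools over the remaining uninsulated run. The pipe length, decay length and ambient temperature are made-up illustrative values.

```python
import math

T_TANK = 65.0   # degC, temperature at the tank outlet
T_AMB = 20.0    # degC, ambient temperature (assumed)
LAMBDA = 0.5    # m, decay length of the exponential profile (assumed)
L_PIPE = 10.0   # m, tank-to-tap pipe length (assumed)

def tap_temperature(insulated_m):
    """Standing temperature at the tap when the first insulated_m metres
    are perfectly insulated (insulated_m <= L_PIPE): the water cools
    exponentially only over the remaining uninsulated run."""
    run = L_PIPE - insulated_m
    return T_AMB + (T_TANK - T_AMB) * math.exp(-run / LAMBDA)

for x in (0, 2, 4, 6, 8, 10):
    print(f"{x:2d} m insulated -> {tap_temperature(x):6.2f} degC")
```

Under this model, each extra metre of insulation multiplies the tap temperature's excess over ambient by the same factor (e^(1/lambda)), i.e. the returns grow exponentially rather than diminishing, which is the conclusion I reached above.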