Wireless Sensor Network (WSN) nodes require components with ultra-low power consumption, as they must operate without an external power supply. One technique for reducing a system's power consumption is to scale it to a smaller technology node; however, in recent technologies it is not clear whether the decrease in dynamic power consumption outweighs the increase in static power consumption (due to leakage currents). Here this question is addressed by examining the power consumption of three implementations of an analog-to-digital converter (ADC): one in 350 nm, one in 180 nm, and one in 90 nm, which were simulated and compared. The results show that the dynamic power consumption was reduced by a factor of four over the three technologies, but the standby power consumption increased by an order of magnitude. The power consumption of the 180 nm implementation was always lower than that of the 350 nm implementation. However, assuming a WSN application with a duty cycle of 1%, the effective power consumption of the 90 nm ADC was higher than that of both the 180 nm and the 350 nm implementations. This highlights the dominance of leakage current in determining the effective power consumption in low-throughput nodes.
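The duty-cycle trade-off described above can be sketched numerically. The snippet below uses purely hypothetical power figures (not the paper's measured values), chosen only to reflect the reported trends: active power falling roughly 4x from 350 nm to 90 nm while standby power rises by an order of magnitude. It shows how, at a 1% duty cycle, leakage can make the smallest node the least efficient overall.

```python
def effective_power(p_active_uw, p_standby_uw, duty):
    """Duty-cycle-weighted average of active and standby power (in uW)."""
    return duty * p_active_uw + (1.0 - duty) * p_standby_uw

DUTY = 0.01  # 1% duty cycle, as assumed in the abstract

# Hypothetical values in uW, illustrative only: active power scales
# down ~4x with technology, standby (leakage) power rises ~10x.
nodes = {
    "350 nm": {"active": 1000.0, "standby": 1.0},
    "180 nm": {"active": 500.0,  "standby": 2.0},
    "90 nm":  {"active": 250.0,  "standby": 10.0},
}

for name, p in nodes.items():
    p_eff = effective_power(p["active"], p["standby"], DUTY)
    print(f"{name}: {p_eff:.2f} uW effective")
# 350 nm: 10.99 uW effective
# 180 nm: 6.98 uW effective
# 90 nm: 12.40 uW effective
```

With these illustrative numbers the 180 nm node is the most efficient and the 90 nm node the least, matching the qualitative conclusion: once the node spends 99% of its time in standby, the tenfold leakage increase swamps the fourfold dynamic saving.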