While I agree that power is the primary issue, leakage does correlate with degrading the "off" value.
Specifically, the effect of drain-induced barrier lowering (DIBL) becomes very significant in the sub-threshold region as the channel length is reduced. This both increases drain current (leakage) and makes the transistor more difficult to turn off; that is, a larger gate bias is required to turn the transistor off.
While "leakage" isn't the cause, you do get leakage and a harder-to-turn-off transistor at the same time, at least for this type of leakage.
Hmm. In a CMOS structure with a weakly turned off device, you'd need the off current to be more than about 1/20th of the on current (equivalently, the off resistance to be less than ~20x the on resistance) for the logic value to degrade by roughly 5%. Did we have such leaky transistors at 130nm/90nm when the shift happened?
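The degradation figure falls out of a simple resistive-divider sketch (my own back-of-the-envelope, not anyone's actual model): treat the output-high node as a divider between the on pull-up (Ron) and the weakly-off pull-down (Roff).

```python
# Back-of-the-envelope: model the gate's output-high level as a resistive
# divider between the "on" pull-up (Ron) and a weakly-off pull-down (Roff).
# Output-high level = Vdd * Roff / (Ron + Roff).

def output_high_fraction(ron: float, roff: float) -> float:
    """Fraction of Vdd seen at the output of the Ron/Roff divider."""
    return roff / (ron + roff)

# With Roff = 19 * Ron the high level sits at 95% of Vdd,
# i.e. roughly a 5% degradation of the logic value.
frac = output_high_fraction(ron=1.0, roff=19.0)
print(frac)  # 0.95
```

So an off current around 1/20th of the on current is where you'd start to see ~5% degradation.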
A quick calculation I did with a 90nm model for which I have some parameters (vt=0.3 V, vdd=1.2 V, subthreshold slope=90 mV/decade, DIBL coefficient=0.1) seems to suggest that leakage would still be < 1/100th of the on current.
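For concreteness, here is roughly what that calculation looks like (my own sketch, assuming the standard base-10 subthreshold model and, pessimistically, taking the current at vgs = vt as a stand-in for the on-current):

```python
# Rough subthreshold estimate: below threshold, I ~ 10^((vgs - vt + eta*vds) / S)
# where S is the subthreshold slope in V/decade. The current at vgs = vt is used
# as a (pessimistic) proxy for the on-current; the real on-current is higher.

vt, vdd = 0.3, 1.2   # threshold and supply voltage (V)
S = 0.090            # subthreshold slope (V/decade)
eta = 0.1            # DIBL coefficient

# Off state: vgs = 0, vds = vdd (worst case for DIBL).
off_over_on = 10 ** ((0.0 - vt + eta * vdd) / S)
print(off_over_on)   # roughly 0.01, i.e. leakage ~1/100th of the on-current
```

Since the true on-current exceeds the current at vgs = vt, the real off/on ratio is even smaller than this estimate.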
I don't think so. The effect of DIBL is an increase in leakage current with the drain to source voltage.
The first-order model I was taught in school is that, ignoring DIBL, subthreshold leakage current is proportional to exp((vgs - vt)/(n*vT)), but thanks to DIBL it's proportional to exp((vgs - vt + eta*vds)/(n*vT)). The eta here is the DIBL coefficient, which AFAIK is 0.1 or thereabouts.
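To put a number on how much that eta*vds term matters: in base-10 form with subthreshold slope S (V/decade), DIBL multiplies leakage by 10^(eta*vds/S). A quick sketch with illustrative values (the numbers below are my assumptions, matching the 90nm parameters discussed above):

```python
# DIBL term in the subthreshold model: leakage ~ exp((vgs - vt + eta*vds)/(n*vT)),
# or equivalently, in base-10 with slope S (V/decade), leakage is multiplied
# by 10^(eta * vds / S) relative to the vds ~ 0 case.

eta = 0.1    # DIBL coefficient (assumed)
vds = 1.2    # full supply across the off device (V)
S = 0.090    # subthreshold slope (V/decade)

dibl_multiplier = 10 ** (eta * vds / S)
print(dibl_multiplier)  # ~21x more leakage at full vds than at vds ~ 0
```

So with these numbers DIBL alone buys you a ~20x leakage penalty, without coming close to degrading the logic level by itself.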
I would guess the concerns are less about corrupting results, and more about leakage power and switching speed. If you have a very leaky pulldown, switching to "1" is going to be slower.
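The speed cost of a leaky pulldown can be sketched crudely (my own constant-current approximation, not a real delay model): the pull-up has to charge the load while the leaky pull-down steals current.

```python
# Crude constant-current estimate of a 0 -> 1 transition:
# t = C * Vdd / (Ion - Ileak). The leaky pull-down subtracts from the
# net charging current, slowing the rise. All values are illustrative.

def rise_time(c_load: float, vdd: float, i_on: float, i_leak: float) -> float:
    """Rise time when the pull-up fights a leaky pull-down."""
    return c_load * vdd / (i_on - i_leak)

base = rise_time(1e-15, 1.2, 100e-6, 0.0)      # ideal off device
leaky = rise_time(1e-15, 1.2, 100e-6, 10e-6)   # pulldown leaking 10% of Ion
print(leaky / base)  # ~1.11: a 10%-of-Ion leak costs ~11% in rise time
```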
I'm not convinced for the same reason I posted above. The on-current is at least an order of magnitude higher than the off-current for all technology nodes that I know of.
The only way I see this having a significant effect is through the vt vs. speed trade-off I mentioned above. Lower vt's exponentially increase leakage power, so we push the vt up to combat leakage, but this reduces transistor speed because frequency is roughly proportional to the on-current, which is roughly proportional to the gate overdrive vdd - vt.
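That trade-off is easy to see numerically (a first-order sketch under the assumptions above: leakage ~ 10^(-vt/S), speed ~ vdd - vt):

```python
# First-order vt sweep: leakage falls exponentially with vt while
# speed (~ gate overdrive vdd - vt) falls only linearly.

vdd, S = 1.2, 0.090  # supply (V) and subthreshold slope (V/decade)

for vt in (0.2, 0.3, 0.4):
    leak = 10 ** (-vt / S)   # relative leakage
    speed = vdd - vt         # relative frequency
    print(f"vt={vt:.1f}  leakage={leak:.2e}  speed={speed:.2f}")
```

Each 90 mV of extra vt buys a decade less leakage but only shaves the overdrive linearly, which is exactly why the knob gets pushed as far as timing allows.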