I'm a bit surprised that Schneier is advocating for "no award". Even if the SHA-3 candidates are not fundamentally better than SHA-512, we really do need a standardized algorithm that has built-in protection against length extension attacks.
A hash function immune to length extension attacks is more foolproof than HMAC because no additional construction is required. It's also faster: HMAC needs key padding and a second, outer hash invocation, while a length-extension-resistant hash can be used directly.
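To make the comparison concrete, here's a minimal sketch (the key and message are made up for illustration) of why the naive `sha256(key || msg)` MAC is length-extendable while HMAC's nested construction is not, using Python's standard `hashlib` and `hmac` modules:

```python
import hashlib
import hmac

# Hypothetical key and message, purely for illustration.
key = b"secret-key"
msg = b"amount=100&to=alice"

# Naive MAC: sha256(key || msg). The full 256-bit digest *is* the hash's
# internal chaining state after processing, so an attacker who knows
# len(key) can resume the computation and append data -- the classic
# length extension attack.
naive_mac = hashlib.sha256(key + msg).hexdigest()

# HMAC avoids this with a nested construction:
#   HMAC(k, m) = H((k' ^ opad) || H((k' ^ ipad) || m))
# where k' is the key padded to the hash's block size (64 bytes for
# SHA-256; keys longer than the block size are hashed first).
block_size = 64
k = key.ljust(block_size, b"\x00")
ipad = bytes(b ^ 0x36 for b in k)
opad = bytes(b ^ 0x5C for b in k)
manual = hashlib.sha256(opad + hashlib.sha256(ipad + msg).digest()).hexdigest()

# The manual construction matches the standard library's hmac module.
assert manual == hmac.new(key, msg, hashlib.sha256).hexdigest()
```

Note the cost: the outer hash is an extra invocation on top of the message pass, which is the overhead a length-extension-resistant hash would let you skip.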
If I remember correctly (I may not), two SHA-2 functions (SHA-224 and SHA-384?) aren't vulnerable to length extension attacks.
While I agree with you that this is an immediately important feature, I don't think Bruce's premise (that SHA-2 is still pretty good) is invalid.
Though perhaps like you, I don't understand why a new standard can't be incremental. It seems silly to wait for a major break before changing.
The problem with a new standard is that it may induce many to start using it on the grounds that "newer is better", which may not actually be the case: the SHA-2 algorithms have withstood more scrutiny so far.
> If I remember correctly (I may not), two SHA-2 functions (SHA-224 and SHA-384?) aren't vulnerable to length extension attacks.
Interesting. Is that because they only return part of the final state (by truncating SHA-256 and SHA-512), whereas untruncated SHA-256 and SHA-512 return all of the algorithm's running state as the result? That's the only reason I can think of why they would be immune to length extension attacks. With SHA-224, though, one could just brute-force the missing 32 bits of state.
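As a quick check of the truncation point above, Python's `hashlib` shows how much of the internal chaining state each variant actually exposes (the byte counts are the standard digest sizes; the "hidden state" arithmetic is mine):

```python
import hashlib

# SHA-256 and SHA-512 output their entire chaining state
# (32 and 64 bytes), which is what enables length extension.
assert hashlib.sha256().digest_size == 32
assert hashlib.sha512().digest_size == 64

# SHA-224 and SHA-384 run the same compression functions (seeded with
# different initial values) but withhold part of the final state:
assert hashlib.sha224().digest_size == 28  # 32 bits of state hidden
assert hashlib.sha384().digest_size == 48  # 128 bits of state hidden
```

Which lines up with the brute-force observation: guessing SHA-224's missing 32 bits is about 2^32 work per extension attempt, plausibly feasible, while SHA-384's missing 128 bits are far out of reach.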