Not too familiar with D, but isn't 0xFF ÿ (Latin Small Letter Y with Diaeresis) in Unicode? The byte 0xFF can never appear in valid UTF-8 or ASCII, but U+00FF is still a valid Unicode codepoint.
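For what it's worth, the byte-vs-codepoint distinction is easy to check; a quick sketch in Python (used here just as a neutral way to poke at the encoding, nothing D-specific):

```python
# U+00FF is a perfectly valid codepoint: "ÿ"
assert chr(0xFF) == "\u00ff"

# ...but the raw byte 0xFF can never appear in well-formed UTF-8,
# which is what makes it a useful "obviously uninitialized" sentinel.
try:
    bytes([0xFF]).decode("utf-8")
    print("decoded (should not happen)")
except UnicodeDecodeError:
    print("byte 0xFF is not valid UTF-8")
```

When ÿ is actually encoded as UTF-8 it becomes the two-byte sequence 0xC3 0xBF, so a lone 0xFF byte is always an error.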
I'm a fan of the idea in general, and don't think there's a better byte to use as an obviously-wrong default.
It's the same idea as pointers, which default-initialize to null.