The reason the "smearing" argument doesn't hold water when looking at jitter from something like a USB cable is that this interface is being used to send a bitstream, not timing information (at least not directly).

The way it usually works is that the computer first sends some sort of mode-setting message, telling the audio DAC controller what output sampling rate to use.
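As a rough sketch of what that mode-setting step looks like for a USB Audio Class 1.0 device (the vendor/product IDs and the OUT endpoint address here are hypothetical, and UAC2 devices do this through a clock-source entity instead), using pyusb:

    # The host only *requests* a sampling rate; it never supplies the clock.
    import usb.core

    SET_CUR = 0x01                 # UAC1 class-specific request
    SAMPLING_FREQ_CONTROL = 0x01   # control selector, high byte of wValue

    dev = usb.core.find(idVendor=0x1234, idProduct=0x5678)  # hypothetical IDs
    rate = 48000
    dev.ctrl_transfer(
        bmRequestType=0x22,                  # host->device, class request, endpoint recipient
        bRequest=SET_CUR,
        wValue=SAMPLING_FREQ_CONTROL << 8,
        wIndex=0x01,                         # hypothetical isochronous OUT endpoint
        data_or_wLength=rate.to_bytes(3, "little"),  # 3-byte sample rate in Hz
    )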

The DAC then uses its own internal circuitry to generate the timing signals. This circuitry is what determines the jitter.

The computer then sends the audio data over the cable. That data is captured, buffered, and only handed to the DAC when the internally generated clock is ready for it.
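A toy model of that buffering, just to make the timing relationship concrete (all names and numbers are made up): packets arrive from the cable with whatever arrival-time jitter the bus has, but samples leave the buffer on ticks of the DAC's own local oscillator, so the arrival jitter never reaches the analog side.

    from collections import deque

    buffer = deque()

    def on_usb_packet(samples):
        # Called whenever a (possibly jittery) packet shows up on the bus.
        buffer.extend(samples)

    def on_dac_clock_tick():
        # Driven by the DAC's internal sample clock, e.g. 48000 times/second.
        if buffer:
            convert_to_analog(buffer.popleft())
        else:
            pass  # underrun: audible dropout, but still no "smearing"

    def convert_to_analog(sample):
        ...  # stand-in for the actual DAC hardware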

So the only thing that cable smearing can do is introduce errors into the digital messages that the computer sends. If it's particularly bad, the mode-setting message won't make it through intact and you won't hear anything. If the mode does get set correctly but there are occasional bit errors in the bitstream, you'll hear occasional (but obvious) pops. If the computer can't send the bitstream at the expected rate, the buffers will over- or underrun and everything will stop.
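To see why a bit error shows up as a pop rather than as timing error, consider a single flipped bit in a 16-bit PCM sample (values here are just for illustration): the amplitude jumps by a huge step, but the sample still plays at exactly the right instant.

    sample = 1000                    # a quiet 16-bit sample value
    corrupted = sample ^ (1 << 14)   # one bit error on the wire
    print(sample, corrupted)         # 1000 vs 17384: a loud, obvious click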

But what you won't get is more jitter.

The original argument assumes that the cable is sending a signal whose edges are used for clock recovery, and that this recovered clock is used as the timebase for the sampling system. But nobody actually does this [1]. Reasonably high jitter / phase noise on the bitstream signals is fine, as long as the data can still be decoded.

[1] Okay fine, HDMI sort of does this, but they're almost always using a more sophisticated retiming system.




OK yeah, I'm not sure what I was thinking when I wrote about cabling.

I agree that clock recovery and PLL filtering can take care of jitter.



