What is the origin and relevance of the 12 kHz to 20 MHz jitter bandwidth often specified for clock and oscillator products?
Clock and oscillator products are often specified for RMS phase jitter, calculated by integrating phase noise over the 12 kHz to 20 MHz bandwidth. This particular specification originated as a jitter generation requirement in SONET (Synchronous Optical Network) and related telecom standards for Optical Carrier level OC-48. (OC-48 networks transmit data at up to 2488.32 Mbps using 155.52 MHz reference clocks.)
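The integration described above can be sketched numerically. The phase noise points below are hypothetical, not from any real datasheet; the calculation assumes the usual conventions of log-linear interpolation of the single-sideband phase noise plot and the small-angle relation between integrated phase noise and RMS jitter.

```python
import numpy as np

# Hypothetical single-sideband phase noise profile: L(f) in dBc/Hz
# at a set of offset frequencies (illustrative values only).
offsets_hz = np.array([12e3, 100e3, 1e6, 10e6, 20e6])
l_dbc_hz = np.array([-130.0, -140.0, -155.0, -170.0, -172.0])

f_carrier_hz = 155.52e6  # OC-48 reference clock frequency

# Integrate over 12 kHz - 20 MHz on a log-spaced grid, interpolating
# the phase noise log-linearly (dB vs. log frequency) between points.
f = np.logspace(np.log10(12e3), np.log10(20e6), 4000)
l_db = np.interp(np.log10(f), np.log10(offsets_hz), l_dbc_hz)
l_lin = 10.0 ** (l_db / 10.0)  # convert dBc/Hz to linear power ratio

# Trapezoidal integration of the single-sideband noise power.
ssb_power = np.sum(0.5 * (l_lin[1:] + l_lin[:-1]) * np.diff(f))

# Factor of 2 accounts for both sidebands; divide the RMS phase (rad)
# by the carrier's angular frequency to get jitter in seconds.
rms_phase_rad = np.sqrt(2.0 * ssb_power)
rms_jitter_s = rms_phase_rad / (2.0 * np.pi * f_carrier_hz)

print(f"RMS phase jitter (12 kHz - 20 MHz): {rms_jitter_s * 1e15:.0f} fs")
```

The same code applies to any integration band by changing the limits of the log-spaced grid, which is how a non-SONET bandwidth would be evaluated.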
The industry adopted this measurement as a convenient way to compare clocks and oscillators in general, even those not intended for SONET-type applications, and it remains useful for that purpose. However, non-SONET applications such as frequency synthesis, other serial I/O standards, or ADC reference clocks may warrant different or wider jitter bandwidths.
Finally, the 12 kHz to 20 MHz jitter bandwidth cannot always be measured, and may not even make sense, if the carrier or clock frequency is too low. Practical systems have digital sampling, or equivalent mixing and filtering, limitations that require the minimum carrier frequency to be roughly 2X or more the maximum offset frequency. Even if this were not the case, you could not directly measure a clock with a carrier frequency < 20 MHz, with offsets out to 20 MHz, without violating the double sideband phase noise assumption.
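The carrier-frequency constraint above can be expressed as a simple bandwidth check. This is a sketch only: the function name is hypothetical, and the 2X factor is the rule of thumb from the text, not a standard-mandated value.

```python
# Clamp the upper integration limit so that the maximum offset frequency
# does not exceed roughly half the carrier frequency (the 2X rule of
# thumb described above).
def usable_jitter_band(f_carrier_hz, f_low=12e3, f_high=20e6):
    """Return the usable (low, high) integration band, or None if the
    carrier is too low for a meaningful integration starting at f_low."""
    limit = f_carrier_hz / 2.0
    if limit <= f_low:
        return None  # carrier too low: no usable band above 12 kHz
    return (f_low, min(f_high, limit))

print(usable_jitter_band(155.52e6))  # full 12 kHz - 20 MHz band fits
print(usable_jitter_band(10e6))      # upper limit clamped to 5 MHz
print(usable_jitter_band(20e3))      # None: carrier frequency too low
```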