We present a method for predicting the data-dependent jitter (DDJ) introduced by a general linear time-invariant (LTI) system, based on the system's unit step response. We derive an exact expression for the DDJ of a first-order system and verify the solution experimentally. We then propose a perturbation technique to generalize the analytical expression for DDJ. We highlight the significance of the unit step response in characterizing DDJ and emphasize that bandwidth alone is not a sufficient measure for predicting DDJ. We separate the individual jitter contributions of prior bits and use the result to predict the DDJ of a general LTI system. In particular, we identify a dominant prior bit that accounts for the well-known two-impulse distribution of deterministic jitter. We also demonstrate a jitter-minimization property of higher-order LTI systems. We verify the generalized analytical expression for DDJ against measurements on several real systems, including an integrated 10-Gb/s CMOS transimpedance amplifier. The theory predicts the measured jitter with error as low as 7.5%.
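To make the step-response viewpoint concrete, the sketch below (not from the paper; the time constant, 50% threshold, and bit-indexing conventions are assumptions for illustration) computes the threshold-crossing time of a rising NRZ edge through a first-order system with step response s(t) = 1 - e^{-t/tau}, in closed form as a function of the prior bits, and cross-checks it against a brute-force simulation of the filtered waveform. Under these assumptions, the spread of crossing times over prior-bit patterns, relative to the isolated-edge value tau*ln(2), is the data-dependent jitter.

```python
import numpy as np

# Illustrative sketch only (not the paper's code): threshold-crossing time of a
# rising NRZ edge through an assumed first-order system with unit step response
# s(t) = 1 - exp(-t / tau).

def crossing_time_closed_form(prior_bits, T, tau, threshold=0.5):
    """Crossing time measured from the ideal transition instant.
    prior_bits[m-1] is the bit m unit intervals before the edge
    (prior_bits[0] must be 0 for a rising edge); earlier bits are taken as 0."""
    alpha = np.exp(-T / tau)
    # Residual memory of the prior bits at the transition instant.
    tail = (1.0 - alpha) * sum(b * alpha ** (m - 1)
                               for m, b in enumerate(prior_bits, start=1))
    return tau * np.log((1.0 - tail) / (1.0 - threshold))

def crossing_time_simulated(prior_bits, T, tau, threshold=0.5, oversample=2000):
    """Brute-force check: synthesize the NRZ waveform, filter it with the
    first-order system (exact zero-order-hold recursion), and interpolate
    the threshold crossing after the rising edge."""
    dt = T / oversample
    bits = list(prior_bits[::-1]) + [1] * 8        # oldest bit first, then the edge
    x = np.repeat(bits, oversample).astype(float)
    a = np.exp(-dt / tau)
    y = np.zeros_like(x)
    for n in range(1, len(x)):
        y[n] = a * y[n - 1] + (1.0 - a) * x[n - 1]
    edge = len(prior_bits) * oversample            # sample index of the rising edge
    k = np.nonzero(y[edge:] >= threshold)[0][0] + edge
    frac = (threshold - y[k - 1]) / (y[k] - y[k - 1])
    return ((k - 1 + frac) - edge) * dt

# Example: a 10-Gb/s unit interval and an assumed 40-ps time constant.
T, tau = 100e-12, 40e-12
for prior in ([0, 0, 0, 0], [0, 1, 0, 0], [0, 1, 1, 1]):
    print(prior,
          crossing_time_closed_form(prior, T, tau),
          crossing_time_simulated(prior, T, tau))
```

In this sketch the closed-form and simulated crossing times agree to the oversampling resolution, illustrating how, for a first-order system, the prior-bit pattern enters the crossing time only through a geometrically weighted sum set by the step response.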