A new calibration technique for time-interleaved analog-to-digital converters is proposed, based on Hermitianity-preserving complex Taylor approximations of the frequency response of the correction filters. Calibration is interpreted as approximating these filters with linear combinations of base filters obtained from the proposed Taylor expansion. Known calibration techniques are reinterpreted in this framework and compared in terms of accuracy, computational complexity, numerical stability, and convergence time. The new technique is shown to be accurate while requiring few hardware resources. The small number of parameters to estimate enables good performance in fixed-point arithmetic and fast convergence, which is important in background calibration schemes where parameters must be estimated in real time.
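To make the central idea concrete, the following sketch illustrates (under simplified assumptions, not the paper's exact formulation) how a correction filter can be approximated by a linear combination of base filters via a Hermitian-symmetry-preserving Taylor expansion. A fractional-delay correction with ideal response H(w) = exp(j*w*tau) is expanded around tau = 0 into fixed base responses (j*w)^k weighted by real coefficients tau^k / k!; real weights on bases satisfying B(-w) = conj(B(w)) keep the overall response Hermitian, so the corresponding impulse response remains real-valued. The function name, grid, and mismatch value are illustrative choices, not from the source.

```python
import math
import numpy as np

def taylor_delay_response(w, tau, order):
    """Taylor approximation of exp(j*w*tau) up to the given order.

    Each term is a fixed base response (j*w)^k scaled by the real
    coefficient tau**k / k!, so the estimated parameter tau enters
    only through real scalar weights (hypothetical illustration).
    """
    H = np.zeros_like(w, dtype=complex)
    for k in range(order + 1):
        coeff = tau**k / math.factorial(k)  # real weight -> Hermitianity preserved
        H += coeff * (1j * w)**k            # base filter response (j*w)^k
    return H

w = np.linspace(-np.pi, np.pi, 1001)  # symmetric normalized-frequency grid
tau = 0.05                            # small timing mismatch, in samples
H_exact = np.exp(1j * w * tau)
H_approx = taylor_delay_response(w, tau, order=3)

# Accuracy of the low-order expansion for small w*tau:
err = np.max(np.abs(H_exact - H_approx))
# Hermitian symmetry H(-w) = conj(H(w)) holds to machine precision:
sym = np.max(np.abs(H_approx - np.conj(H_approx[::-1])))
```

Because the base responses are fixed, only the few real weights (here powers of tau) need to be estimated, which is what enables fixed-point robustness and fast convergence in a background calibration loop.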