Parameterization based on truncated singular value decomposition (TSVD) of the dimensionless sensitivity matrix has been shown to be an efficient approach for history matching. With TSVD parameterization, the search direction is computed as a linear combination of a few principal right singular vectors. Because the sensitivity matrix is never explicitly computed, this parameterization is well suited to large-scale history-matching problems. Moreover, previous work presented theoretical evidence that TSVD of the dimensionless sensitivity matrix provides the optimal parameterization in terms of uncertainty reduction. TSVD has been used in the randomized maximum likelihood (RML) framework to generate multiple conditional realizations of reservoir models. In this work, we investigate the effect of TSVD on the search directions obtained with the Gauss–Newton and Levenberg–Marquardt (LM) methods. In particular, we show that the TSVD-based LM algorithm converges to appropriate estimates because it gradually resolves the important features of the true model. We also introduce an improved implementation of a TSVD-based LM algorithm for generating multiple realizations of reservoir models conditioned to production data. Our experiments indicate that the computational cost of the new implementation is roughly two-thirds of the cost of the previous implementation.
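The core idea, a search direction built from a few principal right singular vectors with LM damping, can be sketched as follows. This is an illustrative sketch only: the abstract emphasizes that in practice the sensitivity matrix is not formed explicitly (the TSVD is obtained matrix-free), whereas here a small dense matrix `G_d` is factored directly, and the function name `tsvd_lm_step` and the damping parameter `lam` are assumptions, not the paper's notation.

```python
import numpy as np

def tsvd_lm_step(G_d, residual, k, lam=0.0):
    """Search direction from a rank-k TSVD of the dimensionless
    sensitivity matrix G_d (illustrative dense version).

    lam is a Levenberg-Marquardt damping parameter; lam=0 recovers
    the truncated Gauss-Newton step.
    """
    U, s, Vt = np.linalg.svd(G_d, full_matrices=False)
    Uk, sk, Vk = U[:, :k], s[:k], Vt[:k].T
    # Filter factors s_i / (s_i^2 + lam): damping shrinks the
    # contribution of the smaller retained singular values.
    coeffs = (Uk.T @ residual) * sk / (sk ** 2 + lam)
    # The step is a linear combination of the k principal right
    # singular vectors, as described in the text.
    return -Vk @ coeffs
```

With `lam = 0` this reduces to the rank-k pseudoinverse solution; increasing `lam` shortens the step, which is one way to see why the LM variant resolves dominant model features first.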