The effect of the phase error of local oscillators (LOs) on the performance of wavelength-division multiplexing (WDM) optical transmission systems employing digital back-propagation (DBP) is investigated. A simple model describing the performance degradation caused by the phase error is developed, and its validity is demonstrated by numerical simulations of 20-GHz-spaced, nine-channel orthogonal frequency-division multiplexing (OFDM) transmissions. It is also shown that, although the variance of the phase error should be kept below 5° to obtain the full benefit of DBP, DBP still improves performance significantly even when the LO phases are not controlled at all.
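As a rough, illustrative sketch (not the model developed in this work), the following Python snippet assumes that each WDM channel is received with an independent Gaussian LO phase error and estimates how the reconstructed multi-channel field used for joint DBP deviates from the ideal field as the phase-error spread increases. All waveform parameters in the snippet are assumptions chosen for illustration only.

# Illustrative toy calculation: per-channel LO phase errors rotate each channel
# before the channels are combined into the field fed to joint DBP. The
# normalized power of the resulting field error grows roughly as the square of
# the phase-error standard deviation, suggesting why a spread of only a few
# degrees already matters. Parameters (channel count, spacing, waveforms) are
# assumed, not taken from the simulations reported above.
import numpy as np

rng = np.random.default_rng(0)

n_ch = 9            # number of WDM channels (assumption)
spacing = 20e9      # channel spacing in Hz (assumption)
fs = 400e9          # simulation sampling rate
n_samp = 2 ** 14
t = np.arange(n_samp) / fs

# Random QPSK-like complex baseband waveforms, one per channel (toy stand-in).
symbols = (rng.choice([1, -1], (n_ch, n_samp))
           + 1j * rng.choice([1, -1], (n_ch, n_samp))) / np.sqrt(2)
freqs = (np.arange(n_ch) - (n_ch - 1) / 2) * spacing

def combined_field(phase_err):
    """Sum the channels onto their carriers, each rotated by its LO phase error."""
    field = np.zeros(n_samp, dtype=complex)
    for k in range(n_ch):
        field += symbols[k] * np.exp(1j * (2 * np.pi * freqs[k] * t + phase_err[k]))
    return field

ref = combined_field(np.zeros(n_ch))  # ideal reconstruction, no phase error

for sigma_deg in [1, 5, 10, 20]:
    sigma = np.deg2rad(sigma_deg)
    err_pows = []
    for _ in range(50):  # average over random phase-error realizations
        phi = rng.normal(0.0, sigma, n_ch)
        recon = combined_field(phi)
        err_pows.append(np.mean(np.abs(recon - ref) ** 2) / np.mean(np.abs(ref) ** 2))
    print(f"sigma = {sigma_deg:2d} deg -> normalized field error = {np.mean(err_pows):.4f}")

The printed error power scales approximately with the square of the phase-error standard deviation for small values, which is consistent with the qualitative picture that small LO phase errors degrade the field reconstruction, and hence the nonlinearity compensation, only gradually.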