For linear time-invariant system models, this paper analyzes the convergence of parameter estimates obtained by the prediction error method as the length of the input-output data tends to infinity. It is known that the sequence of prediction error criteria, the criterion functions, converges uniformly in the parameter with probability one as the data length tends to infinity. For input-output data of fixed length, the parameter estimate is in general a set rather than a single point, namely the set on which the criterion function attains its minimum. A mathematical feature of the convergence problem is therefore that, from the convergence of a sequence of functions, one must infer the convergence of the sequence of their sets of minimizing arguments. The Hausdorff metric is a natural tool for measuring the distance between sets and is adopted here to study this convergence problem. We show that convergence of the parameter estimates in the Hausdorff metric cannot be guaranteed in general, and we give conditions under which it is guaranteed.
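The phenomenon the abstract describes can be illustrated with a toy numerical sketch (not taken from the paper; the functions and grid below are illustrative assumptions). A flat criterion V(x) = 0 on [0, 1] has the whole interval as its argmin set, while the perturbations V_n(x) = x/n converge to V uniformly yet each has argmin set {0}; the Hausdorff distance between the argmin sets therefore does not shrink:

```python
# Illustrative sketch (hypothetical example, not the paper's model):
# uniform convergence of criterion functions need not imply Hausdorff
# convergence of their sets of minimizing arguments.

def hausdorff(A, B):
    """Hausdorff distance between two finite point sets on the real line."""
    d_ab = max(min(abs(a - b) for b in B) for a in A)
    d_ba = max(min(abs(b - a) for a in A) for b in B)
    return max(d_ab, d_ba)

def argmin_set(values, grid, tol=1e-12):
    """Grid points where `values` lies within `tol` of its minimum."""
    m = min(values)
    return [x for v, x in zip(values, grid) if v - m <= tol]

grid = [i / 1000 for i in range(1001)]            # discretization of [0, 1]
S = argmin_set([0.0 for x in grid], grid)         # argmin of V: all of [0, 1]

for n in (1, 10, 100):
    Sn = argmin_set([x / n for x in grid], grid)  # argmin of V_n: only {0}
    sup_err = max(abs(x / n) for x in grid)       # uniform distance ||V_n - V||
    print(n, sup_err, hausdorff(Sn, S))
# The uniform error shrinks like 1/n, but the Hausdorff distance stays 1.
```

This is exactly the gap the abstract points to: extra conditions on the criterion functions are needed before convergence of the minimizer sets can be concluded.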