In the standard AdaBoost ensemble learning algorithm, each base classifier is assigned a constant weight that is applied to all test instances. However, the iterative nature of AdaBoost means that each base classifier performs well only in a certain small region of the input space, so applying the same weight to every test sample is unreasonable. An improved AdaBoost algorithm based on adaptive weight adjustment is presented. Classifier selection and classifier weights are determined by full-information behavior correlation, which describes the correlation between a test sample and each base classifier. The method makes use of all scalar components of a base classifier's full information behavior, thereby avoiding the information loss incurred by methods that use only part of it. Simulation results show that the ensemble classification performance is greatly improved.
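The abstract does not define how the full-information behavior correlation is computed, so the sketch below substitutes a common proxy for per-sample classifier competence: each base classifier's accuracy on the training points nearest the test sample, used to rescale its global AdaBoost weight at prediction time. The dataset, neighborhood size, and competence measure are all illustrative assumptions, not the paper's method.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors

# Toy binary-classification data (illustrative only).
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

# Standard AdaBoost: every estimator gets one global weight alpha.
ada = AdaBoostClassifier(n_estimators=25, random_state=0).fit(Xtr, ytr)

# Index the training set so we can find each test sample's neighborhood.
nn = NearestNeighbors(n_neighbors=15).fit(Xtr)

def adaptive_predict(x):
    """Vote with per-sample weights: each base classifier's global
    AdaBoost weight is scaled by its accuracy near x, a stand-in for
    the correlation between the test sample and the base classifier."""
    idx = nn.kneighbors(x.reshape(1, -1), return_distance=False)[0]
    votes = 0.0
    for est, alpha in zip(ada.estimators_, ada.estimator_weights_):
        # Local competence: accuracy on the k nearest training points.
        local_acc = np.mean(est.predict(Xtr[idx]) == ytr[idx])
        # Map the 0/1 prediction to a signed vote in {-1, +1}.
        vote = 2 * est.predict(x.reshape(1, -1))[0] - 1
        votes += alpha * local_acc * vote
    return int(votes > 0)

pred = np.array([adaptive_predict(x) for x in Xte])
print("adaptive-weight ensemble accuracy:", np.mean(pred == yte))
```

Because the weights now depend on the test sample's neighborhood, a classifier that is accurate only in a small region of input space contributes strongly there and is discounted elsewhere, which is the intuition the abstract argues for.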