This paper applies support vector machines (SVMs) with radial basis function (RBF) kernels to large-scale classification problems. We decompose a large-scale learning problem into multiple two-class problems with the one-versus-all decomposition technique, and then propose an adaptive clustering method. Each initial support vector (SV) coincides with a cluster center, and its kernel width equals the maximum Euclidean distance within the corresponding cluster. The initial number of SVs therefore equals the number of cluster centers, and different RBF kernels have different widths. Optimizing the SVM then reduces to determining the Lagrange multipliers. The resulting kernel space for optimization has relatively low dimensionality, and the final SVs form a subset of the cluster centers. Experimental results on letter and handwritten digit recognition show that the proposed methods are effective.
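The initialization described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the clustering procedure is assumed to be plain k-means (the abstract does not name one), the data are synthetic 2-D points standing in for letter/digit features, and the names `kmeans`, `widths`, and `K` are hypothetical. Each cluster center becomes a candidate SV, its width is the maximum Euclidean distance from the center to the points of its cluster, and the resulting kernel design matrix has only as many columns as there are centers, so only the Lagrange multipliers remain to be optimized.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data (stand-in for the paper's letter/digit features).
X = np.vstack([rng.normal(0.0, 0.5, (30, 2)),
               rng.normal(3.0, 0.5, (30, 2))])

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: returns cluster centers and per-sample labels."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):  # skip empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

k = 4
centers, labels = kmeans(X, k)

# Per-center RBF width: maximum Euclidean distance from a cluster center
# to the points of its own cluster (with a small floor for singletons).
widths = np.array([
    max(np.linalg.norm(X[labels == j] - centers[j], axis=1).max(), 1e-9)
    if np.any(labels == j) else 1.0
    for j in range(k)
])

# Kernel design matrix: one RBF kernel per cluster center, each with its
# own width, so the optimization space has only k columns instead of n.
K = np.exp(-np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)**2
           / (2.0 * widths[None, :]**2))

print(K.shape)  # (60, 4): samples x cluster centers
```

A standard SVM solver could then be run on `K` as a precomputed feature map; the final SVs are the cluster centers whose Lagrange multipliers remain nonzero.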