In this paper, a robust radial basis function network (RRBFN) based on least trimmed squares support vector regression (LTS-SVR) is proposed for modeling problems in which the training data may contain outliers and noise. The proposed RRBFN approach has two stages. In stage I, outliers and large noise are trimmed via the LTS-SVR procedure, so that their influence on the model is reduced. In other words, the LTS-SVR procedure provides an appropriate initial structure for the RRBFN, helping to avoid overfitting; it can also lead to fast convergence. After stage I, the remaining training data are used directly to adjust the parameters of the RRBFN through a gradient-descent learning algorithm. Hence, no extra time is needed to compute the weights of a robust cost function, as is required in M-estimator-based approaches. Simulation results show that, when the training data contain outliers and noise, the proposed system outperforms both the annealing robust RBFN and the Wilcoxon generalized RBFN.
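The two-stage idea can be illustrated with a minimal sketch: an initial least-squares RBF fit ranks the residuals and the worst-fitting points are trimmed (the least-trimmed-squares idea behind stage I), after which gradient descent refines the RBF weights on the retained subset (stage II). The centers, width, trimming fraction, and learning rate below are illustrative assumptions, not the paper's actual LTS-SVR settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data: sinc target with small Gaussian noise and gross outliers.
X = np.linspace(-5, 5, 200)
y = np.sinc(X) + 0.05 * rng.standard_normal(X.size)
out_idx = rng.choice(X.size, 10, replace=False)
y[out_idx] += rng.choice([-3.0, 3.0], 10)          # inject outliers

def rbf_design(x, centers, width):
    # Gaussian RBF design matrix: one column per center.
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2.0 * width ** 2))

centers = np.linspace(-5, 5, 15)                   # assumed fixed centers
width = 0.8                                        # assumed common width

# --- Stage I: trim the points with the largest residuals (LTS idea). ---
Phi = rbf_design(X, centers, width)
w0 = np.linalg.lstsq(Phi, y, rcond=None)[0]        # rough initial fit
resid = np.abs(Phi @ w0 - y)
h = int(0.9 * X.size)                              # assumed trimming fraction
keep = np.argsort(resid)[:h]                       # keep h smallest residuals
Xk, yk = X[keep], y[keep]

# --- Stage II: gradient-descent refinement on the trimmed data only. ---
Phik = rbf_design(Xk, centers, width)
w = w0.copy()
lr = 0.5                                           # assumed learning rate
for _ in range(2000):
    grad = Phik.T @ (Phik @ w - yk) / yk.size      # gradient of MSE loss
    w -= lr * grad

rmse = np.sqrt(np.mean((Phik @ w - yk) ** 2))      # fit error on retained data
```

Because the gross outliers produce the largest residuals under the initial fit, they are the points most likely to be trimmed, and the subsequent gradient-descent stage then operates on data dominated by ordinary noise.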