Self-Splitting Competitive Learning (SSCL), based on the one-prototype-take-one-cluster (OPTOC) learning paradigm, is a powerful clustering algorithm that addresses two difficult problems: determining the number of clusters and sensitivity to prototype initialization. The SSCL algorithm iteratively partitions the data space into natural clusters without a priori information on the number of clusters. However, SSCL fails to estimate the correct number of clusters in some cases, such as when the data set contains a cluster whose centroid coincides with the global centroid, and its learning process converges slowly. In this paper, we propose the Robust Self-Splitting Competitive Learning (RSSCL) algorithm, which introduces a new update scheme and a new split-validity criterion to solve these problems. We compare the performance of RSSCL with that of SSCL on a synthesized Gaussian data set and on the widely used Reuters-21578 text collection. Experiments show that RSSCL estimates the correct number of clusters more accurately and is more efficient and adaptable than SSCL. Results on text clustering show that RSSCL also outperforms SSCL on high-dimensional data.
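For readers unfamiliar with the competitive-learning foundation that OPTOC builds on, the following is a minimal sketch of the classical winner-take-all prototype update (not the paper's OPTOC scheme, which additionally equips each prototype with an asymptotic property vector so that one prototype converges to exactly one cluster; the function name and learning rate `eta` here are illustrative):

```python
import math
import random

def winner_take_all_step(prototypes, x, eta=0.1):
    """One step of classical winner-take-all competitive learning:
    the prototype nearest to x moves a fraction eta toward x."""
    # Index of the closest prototype under Euclidean distance.
    w = min(range(len(prototypes)),
            key=lambda i: math.dist(prototypes[i], x))
    # Move only the winning prototype toward the input sample.
    prototypes[w] = [p + eta * (xj - p) for p, xj in zip(prototypes[w], x)]
    return w

# Two prototypes, two well-separated Gaussian point clouds.
random.seed(0)
protos = [[0.0, 0.0], [1.0, 1.0]]
data = ([[random.gauss(0.0, 0.1), random.gauss(0.0, 0.1)] for _ in range(50)]
        + [[random.gauss(5.0, 0.1), random.gauss(5.0, 0.1)] for _ in range(50)])
random.shuffle(data)
for x in data:
    winner_take_all_step(protos, x)
# Each prototype ends up near one of the two cluster centers.
```

This plain update is the scheme whose initialization sensitivity SSCL and RSSCL are designed to overcome: here a poorly placed prototype can capture samples from several clusters, whereas OPTOC-style learning restrains a prototype from being attracted by samples outside its own cluster.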