Optimal reverse prediction (ORP) has recently been proposed as a semi-supervised framework that unifies supervised and unsupervised training methods such as supervised least squares, principal component analysis (PCA), k-means clustering, and normalized graph cut. ORP can handle classification tasks in which the labeled data are insufficient. However, the performance of ORP and its kernelized version remains unsatisfactory for classification applications. To further improve ORP, motivated by the recently proposed orthogonal k-means clustering, in this paper we propose orthogonal optimal reverse prediction (OORP), together with its kernelized and Laplacian-regularized extensions. With only a small amount of additional computation, our algorithms greatly enhance classification performance compared to the original ORP. Extensive experiments on synthetic and benchmark data collections consistently demonstrate the effectiveness and efficiency of OORP in comparison with several competing approaches.