The paper deals with the task of robust nonlinear regression in the presence of outliers. The problem is treated in the context of reproducing kernel Hilbert spaces (RKHS). In contrast to more classical approaches, a recent trend is to model the outliers as a sparse noise vector and to mobilize tools from sparsity-aware/compressed sensing theory in order to impose sparsity on it. In this paper, three of the most popular approaches are considered and compared. These represent three major directions in the sparsity-aware learning context, namely: a) a greedy approach, b) a convex relaxation of the sparsity-promoting task via ℓ1-norm-based regularization of the least-squares cost, and c) a Bayesian approach making use of appropriate priors associated with the involved parameters.
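
As a point of reference, the convex-relaxation route in b) typically amounts to a task of the following form (a schematic formulation only; the symbols $f$, $\boldsymbol{u}$, $\lambda$, $\mu$ are introduced here for illustration and need not match the paper's notation):
$$
\min_{f \in \mathcal{H},\; \boldsymbol{u} \in \mathbb{R}^{N}} \;\; \sum_{n=1}^{N} \big(y_n - f(\boldsymbol{x}_n) - u_n\big)^2 \;+\; \lambda \,\|f\|_{\mathcal{H}}^{2} \;+\; \mu \,\|\boldsymbol{u}\|_1 ,
$$
where $\mathcal{H}$ denotes the RKHS, the entries of the vector $\boldsymbol{u}$ absorb the (few) outliers, and the $\ell_1$ penalty promotes sparsity on $\boldsymbol{u}$, while the standard squared RKHS norm controls the smoothness of the regression function.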