Gaussian processes are a powerful non-parametric tool for Bayesian inference, but their applicability is limited by cubic scaling with the number of training points. This paper develops single-task and multitask sparse Gaussian processes for both regression and classification. First, we apply a manifold-preserving graph reduction algorithm to construct single-task sparse Gaussian processes from a sparse-graph perspective. Then, we propose a multitask sparsity regularizer that simultaneously sparsifies multiple Gaussian processes from related tasks. The regularizer encourages the global structures of the retained points from closely related tasks to be similar, while discouraging similarity between those from loosely related tasks. Experimental results show that our single-task sparse Gaussian processes are comparable to a state-of-the-art method, and that our multitask sparsity regularizer yields multitask sparse Gaussian processes that are more effective than those obtained by other methods.
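As a rough illustration of the graph-reduction idea, the following is a minimal sketch of greedy manifold-preserving subset selection: build a Gaussian-weighted k-NN similarity graph over the training points and repeatedly retain the vertex with the largest degree, removing its edges so that later picks cover other regions of the manifold. The function name `mpgr_select` and all parameter choices (`k`, `sigma`) are our own illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def mpgr_select(X, m, k=5, sigma=1.0):
    """Greedy manifold-preserving graph reduction (illustrative sketch).

    Builds a Gaussian-weighted k-NN similarity graph, then repeatedly
    retains the vertex with the largest degree (sum of edge weights),
    subtracting its edges so subsequent picks favour other regions.
    Returns the indices of the m retained points.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances and Gaussian edge weights.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Keep only the k largest weights per row, then symmetrise.
    drop = np.argsort(-W, axis=1)[:, k:]
    for i in range(n):
        W[i, drop[i]] = 0.0
    W = np.maximum(W, W.T)
    retained = []
    degrees = W.sum(axis=1)
    for _ in range(m):
        j = int(np.argmax(degrees))
        retained.append(j)
        degrees -= W[:, j]      # remove j's contribution to remaining degrees
        degrees[j] = -np.inf    # never select j again
    return sorted(retained)
```

The retained subset would then serve as the inducing-point set of a sparse Gaussian process; any standard sparse GP approximation could be fitted on top of it.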