Current multi-output regression methods usually ignore the relationships among response variables, which makes it difficult to learn an effective coefficient matrix for predicting the response variables from the features. We address this problem by proposing a novel multi-output regression method that combines sparse feature selection and low-rank linear regression in a unified framework. Specifically, we first employ a hypergraph Laplacian regularization term to preserve the high-order structure among all the samples, and then use a low-rank constraint to discover the hidden structure among the response variables and to explore the relationships among different features within a least squares regression framework. In this way, we integrate subspace learning with sparse feature selection to select useful features for multi-output regression. We evaluated the proposed method on several public data sets, and the experimental results show that it outperforms the comparison methods.
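A combined objective of this kind can be sketched as follows; the notation here is an illustrative assumption based on common formulations (a rank-constrained factorization, an $\ell_{2,1}$ sparsity penalty, and a hypergraph Laplacian regularizer), not necessarily the paper's exact model:

\[
\min_{\mathbf{A},\,\mathbf{B}} \; \|\mathbf{Y} - \mathbf{X}\mathbf{A}\mathbf{B}\|_F^2
\;+\; \lambda_1 \|\mathbf{A}\mathbf{B}\|_{2,1}
\;+\; \lambda_2 \,\mathrm{tr}\!\big((\mathbf{X}\mathbf{A}\mathbf{B})^{\top} \mathbf{L}_{\mathrm{hyp}}\, \mathbf{X}\mathbf{A}\mathbf{B}\big),
\]

where $\mathbf{X} \in \mathbb{R}^{n \times d}$ is the feature matrix, $\mathbf{Y} \in \mathbb{R}^{n \times m}$ the response matrix, and the factorization $\mathbf{A}\mathbf{B}$ (with $\mathbf{A} \in \mathbb{R}^{d \times r}$, $\mathbf{B} \in \mathbb{R}^{r \times m}$, $r < \min(d, m)$) imposes the low-rank structure that couples the response variables. The $\ell_{2,1}$-norm encourages row sparsity in the coefficient matrix, which performs the feature selection, and the trace term with the hypergraph Laplacian $\mathbf{L}_{\mathrm{hyp}}$ preserves the high-order sample structure; $\lambda_1$ and $\lambda_2$ are trade-off parameters.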