A forward stagewise neural network algorithm is presented for multi-class classification. Unlike most neural network models, which use the sigmoid or other nonlinear activation functions, the algorithm employs two types of simple linear functions. A novel weak-learner framework called the composite stump is proposed, which accelerates convergence and allows features to be shared across classes. In addition, sparsity constraints are imposed on the iterative process, further improving classification performance. With these techniques, the classification problem is solved by a simple yet effective classifier. Experimental results show that the new method outperforms previous approaches on a number of datasets.
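The abstract does not specify the composite-stump construction, so the following is only a minimal sketch of the general forward stagewise idea it builds on: an additive model grown one stump-like weak learner at a time, each fit by least squares to the residual of one-hot class targets. All function names (`fit_stump`, `forward_stagewise`) and parameter choices (number of rounds `M`, shrinkage `lr`) are illustrative assumptions, not the paper's method.

```python
import numpy as np

def fit_stump(X, R):
    """Greedy search over (feature, threshold) for the piecewise-constant
    stump that best fits the residual matrix R (n x K) in squared error.
    Returns (feature index, threshold, left value, right value)."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue  # degenerate split, skip
            cl = R[left].mean(axis=0)   # per-class fit on the left branch
            cr = R[~left].mean(axis=0)  # per-class fit on the right branch
            pred = np.where(left[:, None], cl, cr)
            err = ((R - pred) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, cl, cr)
    return best[1:]

def forward_stagewise(X, y, K, M=20, lr=0.5):
    """Forward stagewise additive fit: repeatedly add a shrunken stump
    fit to the current residual of the one-hot targets."""
    Y = np.eye(K)[y]                # one-hot class targets, n x K
    F = np.zeros((len(y), K))       # accumulated additive model output
    stumps = []
    for _ in range(M):
        j, t, cl, cr = fit_stump(X, Y - F)  # weak learner on residual
        stumps.append((j, t, cl, cr))
        F += lr * np.where((X[:, j] <= t)[:, None], cl, cr)
    return stumps, F
```

On axis-aligned toy data (e.g. three classes separated by thresholds on individual features), `F.argmax(axis=1)` recovers the labels after a handful of rounds; the shrinkage factor `lr` trades convergence speed for smoother residual reduction, which is where refinements such as the composite stump and sparsity constraints would intervene.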