In this paper, a new learning algorithm that encodes a priori information into feedforward neural networks is proposed for function approximation problems. The algorithm incorporates two kinds of constraints, derived from a priori information about the function approximation problem, into single-hidden-layer feedforward neural networks: architectural constraints and connection weight constraints. On the one hand, the activation functions of the hidden neurons are a class of specific polynomial functions based on the Taylor series expansions of the approximated functions. On the other hand, the connection weight constraints are obtained from the first-order derivatives of the approximated functions. Theoretical justification shows that the new learning algorithm achieves better generalization performance and a faster convergence rate than existing algorithms. Finally, several experimental results are given to verify the efficiency and effectiveness of the proposed learning algorithm.
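To make the architectural constraint concrete, the following is a minimal sketch (not the paper's exact algorithm) of the underlying idea: a single-hidden-layer network whose k-th hidden neuron uses the polynomial activation x^k, mirroring the terms of a Taylor series expansion, with the output weights fitted by least squares so that they play the role of the target function's Taylor-like coefficients. All function names here are hypothetical illustrations.

```python
import numpy as np

def poly_hidden_layer(x, n_hidden):
    # Hidden-layer outputs for each sample: [x^0, x^1, ..., x^(n_hidden-1)].
    # These polynomial activations encode the Taylor-series structure
    # of the approximated function as an architectural constraint.
    return np.vander(x, N=n_hidden, increasing=True)

def fit_output_weights(x, y, n_hidden):
    # Solve for output weights by linear least squares; the learned
    # weights act as (scaled) Taylor-like coefficients of the target.
    H = poly_hidden_layer(x, n_hidden)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w

def predict(x, w):
    return poly_hidden_layer(x, len(w)) @ w

# Example: approximate sin(x) on [-1, 1] with 6 polynomial hidden neurons.
x_train = np.linspace(-1.0, 1.0, 200)
y_train = np.sin(x_train)
w = fit_output_weights(x_train, y_train, n_hidden=6)
max_err = np.max(np.abs(predict(x_train, w) - y_train))
```

Because sin is odd, the even-power weights come out near zero and the odd-power weights approximate the familiar Taylor coefficients, illustrating how the polynomial architecture embeds a priori knowledge of the target's series structure.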