A three-layer feedforward artificial neural network with trigonometric hidden-layer units is constructed. The essential order of approximation for this network, which can simultaneously approximate a function and its derivatives, is estimated, and a saturation theorem (characterizing the largest capacity of simultaneous approximation) is proved. These results precisely characterize the approximation ability of the network and the relationship among the rate of simultaneous approximation, the topological structure of the hidden layer, and the properties of the approximated functions.
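As a concrete illustration of the construction described above, the following sketch builds a one-hidden-layer network whose hidden units are trigonometric, N(x) = a_0 + Σ_k (a_k cos kx + b_k sin kx), fits its output weights by least squares, and then checks how well both the target function and its derivative are reproduced. The target f(x) = exp(sin x), the number of frequencies n, and the least-squares fitting procedure are illustrative assumptions, not the specific construction or proof technique of the paper.

```python
import numpy as np

def fit_trig_network(f, n, m=400):
    """Least-squares fit of output weights for a network with
    trigonometric hidden units: N(x) = a0 + sum_k (a_k cos kx + b_k sin kx)."""
    x = np.linspace(-np.pi, np.pi, m, endpoint=False)
    cols = [np.ones_like(x)]
    for k in range(1, n + 1):
        cols += [np.cos(k * x), np.sin(k * x)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, f(x), rcond=None)
    return coef

def eval_net(coef, x, deriv=False):
    """Evaluate the network N(x) or its exact derivative N'(x)."""
    n = (len(coef) - 1) // 2
    y = np.zeros_like(x) if deriv else np.full_like(x, coef[0])
    for k in range(1, n + 1):
        a, b = coef[2 * k - 1], coef[2 * k]
        if deriv:
            # term-by-term differentiation of the trigonometric units
            y += -a * k * np.sin(k * x) + b * k * np.cos(k * x)
        else:
            y += a * np.cos(k * x) + b * np.sin(k * x)
    return y

# Smooth 2*pi-periodic target and its derivative (illustrative choice).
f = lambda x: np.exp(np.sin(x))
df = lambda x: np.cos(x) * np.exp(np.sin(x))

coef = fit_trig_network(f, n=12)
xs = np.linspace(-np.pi, np.pi, 1000)
err_f = np.max(np.abs(eval_net(coef, xs) - f(xs)))
err_df = np.max(np.abs(eval_net(coef, xs, deriv=True) - df(xs)))
```

For a smooth periodic target, both `err_f` and `err_df` decay rapidly as the number of hidden units grows, with the derivative error typically one order (a factor of k) larger, which is the simultaneous-approximation phenomenon the abstract refers to.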