To obtain the inverses of time-varying matrices in real time, a special kind of recurrent neural network has recently been proposed by Zhang et al. It has been proved that such a Zhang neural network (ZNN) converges globally and exponentially to the exact inverse of a given time-varying matrix. To examine the effect of the time-derivative term on global convergence, and with easier hardware implementation in mind, this paper investigates the ZNN model without exploiting time-derivative information for online matrix inversion. Theoretical results for both the constant matrix inversion case and the time-varying matrix inversion case are presented for comparative and illustrative purposes. To substantiate these theoretical results, computer-simulation results are presented, which demonstrate the importance of the time-derivative term of the given matrix for the exact convergence of the ZNN model to time-varying matrix inverses.
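The comparison described above can be illustrated with a minimal numerical sketch. Assuming the standard ZNN design for matrix inversion, the error function E(t) = A(t)X(t) - I is forced to obey dE/dt = -gamma*E, which yields the implicit dynamics A*dX/dt = -dA/dt*X - gamma*(A*X - I); the derivative-free variant studied here simply drops the dA/dt term. The test matrix, the gain gamma, and the Euler step size below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def A(t):
    # Hypothetical time-varying test matrix, nonsingular for all t.
    return np.array([[2 + np.sin(t), np.cos(t)],
                     [-np.cos(t),    2 + np.sin(t)]])

def A_dot(t):
    # Analytical time derivative of A(t).
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

def simulate(gamma=20.0, dt=1e-4, T=2.0):
    """Euler-integrate both ZNN variants and return final residuals."""
    I = np.eye(2)
    X_full = np.linalg.inv(A(0.0))  # start both models from the exact inverse
    X_nod = X_full.copy()
    t = 0.0
    for _ in range(int(T / dt)):
        At, Adt = A(t), A_dot(t)
        # Full ZNN: A*Xdot = -Adot*X - gamma*(A*X - I); solve for Xdot.
        X_full += dt * np.linalg.solve(At, -Adt @ X_full
                                       - gamma * (At @ X_full - I))
        # Derivative-free variant: A*Xdot = -gamma*(A*X - I).
        X_nod += dt * np.linalg.solve(At, -gamma * (At @ X_nod - I))
        t += dt
    r_full = np.linalg.norm(A(t) @ X_full - I)
    r_nod = np.linalg.norm(A(t) @ X_nod - I)
    return r_full, r_nod
```

In such a simulation, the full model tracks the time-varying inverse with negligible residual, while the derivative-free variant settles to a nonzero lagging error roughly proportional to 1/gamma, consistent with the role claimed for the time-derivative term.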