Starting from the definition of mutual information, one readily sees that the probabilities inferred by Bayesian tracking can be used to compute the Shannon information between the state and the measurement of a dynamic system. In the linear Gaussian case, the information rate can be evaluated from the probabilities computed by the Kalman filter. When the probability distributions inferred by Bayesian tracking are intractable, one must resort to approximate inference, which yields only approximations of the desired probabilities. We propose upper and lower bounds on the information rate between the hidden state and the measurement based on approximate inference. Application of these bounds to multiplicative communication channels is discussed, and experimental results for the discrete-time phase noise channel and the Gauss-Markov fading channel are presented.
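As a minimal illustration of the linear Gaussian case mentioned above, the sketch below computes the per-sample information rate between a scalar Gauss-Markov state and its noisy measurement from the steady-state Kalman filter: the rate equals half the log-ratio of the innovation variance to the measurement-noise variance. The model, parameter names, and values are illustrative assumptions, not taken from the paper.

```python
import math

# Hedged sketch: information rate I(X;Y) per sample for a scalar
# linear Gaussian state-space model, from the steady-state Kalman filter.
# Assumed model (illustrative parameters a, q, r):
#   x_{k+1} = a*x_k + w_k,  w_k ~ N(0, q)   (Gauss-Markov state)
#   y_k     = x_k + v_k,    v_k ~ N(0, r)   (measurement)
def information_rate(a: float, q: float, r: float, iters: int = 10_000) -> float:
    # Iterate the Riccati recursion for the one-step prediction
    # error variance p until it settles at its steady-state value.
    p = q
    for _ in range(iters):
        s = p + r                      # innovation variance
        k = p / s                      # Kalman gain
        p = a * a * p * (1.0 - k) + q  # predicted variance for next step
    s = p + r
    # Rate = h(Y) - h(Y|X) per sample = 0.5 * log2(innovation var / noise var),
    # since the Kalman innovations decompose h(Y^n) sample by sample.
    return 0.5 * math.log2(s / r)

rate = information_rate(a=0.95, q=0.1, r=0.5)
```

The same decomposition underlies the bounds discussed in the paper: when exact Bayesian tracking is intractable, the innovation statistics of an approximate filter replace the exact ones, turning the equality into an upper or lower bound.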