Independent vector analysis, an extension of independent component analysis from a single dataset to multiple datasets, has recently become a subject of significant research interest. In many applications, the latent sources are non-Gaussian, exhibit sample-to-sample dependence, and are dependent across datasets, so it is desirable to exploit all of these properties jointly. The mutual information rate, whose minimization leads to the minimization of the entropy rates of the estimated sources, provides a natural cost for this task. In this paper, we present a new algorithm based on an effective entropy rate estimator that takes all of these properties into account. Experimental results show that the new method exploits these properties effectively.
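To make the entropy rate notion concrete, the following is a minimal sketch, not the paper's estimator: under a Gaussian autoregressive model, the entropy rate of a source reduces to 0.5·log(2πe·σ²), where σ² is the variance of the one-step prediction error. The function name and the AR order are illustrative assumptions; a source with strong sample dependence has a lower entropy rate than white noise of the same marginal variance, which is exactly the structure an entropy-rate cost can exploit.

```python
import numpy as np

def gaussian_ar_entropy_rate(x, order=2):
    """Illustrative entropy-rate estimate under a Gaussian AR(order) model:
    h = 0.5 * log(2*pi*e * var(one-step prediction error)).
    This is a simplified stand-in, not the estimator proposed in the paper."""
    x = np.asarray(x, dtype=float)
    # Lagged design matrix: predict x[t] from x[t-1], ..., x[t-order].
    X = np.column_stack(
        [x[order - k - 1 : len(x) - k - 1] for k in range(order)]
    )
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coeffs
    return 0.5 * np.log(2 * np.pi * np.e * resid.var())

# White Gaussian noise vs. a strongly correlated AR(1) process,
# both scaled to unit marginal variance: the correlated source is
# predictable, so its entropy rate is lower.
rng = np.random.default_rng(0)
white = rng.standard_normal(5000)
ar1 = np.zeros(5000)
for t in range(1, 5000):
    ar1[t] = 0.95 * ar1[t - 1] + rng.standard_normal()
ar1 /= ar1.std()  # match marginal variance
print(gaussian_ar_entropy_rate(white) > gaussian_ar_entropy_rate(ar1))  # expect True
```

The gap between the two estimates is what a cost built on entropy rates, rather than marginal entropies, can see: two sources with identical marginal distributions are distinguished by their temporal structure.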