Dynamic textures are image sequences of moving scenes that are common in natural video and play an important role in video content analysis. This paper presents a new dynamic Bayesian framework for the segmentation of dynamic textures. First, we formulate the problem in a Bayesian framework using mixture model theory. The major advantage of this approach is that it provides a natural way to cluster data according to the mixture components that generated them. Second, existing mixture models take only grayscale information into consideration when modeling the distribution of the observed data. To overcome this limitation, a new distribution is proposed in this paper. The advantage of the proposed distribution is that it has the flexibility to fit different kinds of observed data and is more robust to changes in noise and contrast levels. Finally, an expectation-maximization (EM) algorithm is adopted to maximize a lower bound on the data log-likelihood and to optimize the parameters. The proposed model compares favorably with state-of-the-art dynamic texture segmentation approaches in numerous experiments on a variety of simulated and real-world dynamic textures.
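To make the clustering idea concrete, the following minimal sketch runs EM on a standard one-dimensional Gaussian mixture; the paper's specific observation distribution is not given in the abstract, so a Gaussian mixture stands in for it purely as an illustration. The E-step computes the posterior responsibilities (the soft cluster assignments that let the mixture "cluster data based on the components that generated it"), and the M-step re-estimates the parameters; alternating the two steps maximizes a lower bound on the data log-likelihood. The function name `em_gmm_1d` and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def em_gmm_1d(x, n_components=2, n_iters=50):
    """Illustrative EM for a 1-D Gaussian mixture (not the paper's model)."""
    n = len(x)
    # Deterministic initialization: spread the means over the data quantiles,
    # share the overall variance, and start with uniform mixing weights.
    mu = np.quantile(x, (np.arange(n_components) + 0.5) / n_components)
    var = np.full(n_components, np.var(x))
    pi = np.full(n_components, 1.0 / n_components)
    for _ in range(n_iters):
        # E-step: responsibility of each component for each data point.
        dens = (pi / np.sqrt(2 * np.pi * var)
                * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        # from the responsibility-weighted data.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Usage: fit a sample drawn from two well-separated Gaussians.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])
pi, mu, var = em_gmm_1d(x)
```

After fitting, `mu` recovers means near -3 and 3 and `pi` is close to an even split; thresholding each point's responsibilities yields the hard segmentation.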