A new, likelihood-based non-uniform allocation of Gaussian kernels across scalar (feature) dimensions is proposed to compress complex, Gaussian-mixture-based continuous density HMMs into computationally efficient, small-footprint models. Unlike the previously proposed Kullback-Leibler divergence-based (KLD-based) allocation (Li et al., 2005), whose objective is to better represent the original model, the likelihood-based approach aims to make the compressed model a better representation of the training data. It builds on the observation that, under uniform representation resolution, different feature dimensions contribute unequally to the likelihood. Experiments on the Resource Management database show that likelihood-based allocation outperforms both uniform allocation and KLD-based non-uniform allocation, owing to its better representation of the training data.
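To make the allocation idea concrete, the following is a minimal, hypothetical sketch of budget-constrained non-uniform allocation: given a precomputed per-dimension likelihood importance score (not specified in this abstract; assumed here to be, e.g., the training-data log-likelihood impact of quantizing each dimension) and a fixed total number of scalar kernels, more kernels are assigned to dimensions that matter more for the likelihood. The function name, the proportional-allocation rule, and the score definition are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def allocate_kernels(ll_contrib, total_budget, min_per_dim=1):
    """Likelihood-proportional allocation of a fixed scalar-kernel budget
    across feature dimensions (illustrative sketch only; the paper's actual
    criterion is the training-data likelihood of the compressed model).

    ll_contrib   : per-dimension likelihood importance scores, assumed to be
                   precomputed from the training data (hypothetical input).
    total_budget : total number of scalar Gaussian kernels to distribute.
    min_per_dim  : floor so every dimension keeps at least some resolution.
    """
    d = len(ll_contrib)
    alloc = np.full(d, min_per_dim)
    remaining = total_budget - min_per_dim * d
    # Distribute the remaining budget proportionally to the likelihood
    # scores, so dimensions contributing more to the data likelihood get
    # finer representation resolution (more kernels).
    weights = np.asarray(ll_contrib, dtype=float)
    weights = weights / weights.sum()
    extra = np.floor(weights * remaining).astype(int)
    alloc += extra
    # Hand out rounding residue one kernel at a time to the dimensions
    # with the largest fractional remainders.
    leftover = remaining - extra.sum()
    order = np.argsort(weights * remaining - extra)[::-1]
    alloc[order[:leftover]] += 1
    return alloc
```

For example, `allocate_kernels([5.0, 1.0, 2.0], 16)` assigns the most kernels to the first dimension and exactly exhausts the budget of 16.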