This paper addresses the joint estimation of the parameters and the number of superimposed signals within a Bayesian framework. We combine sparse Bayesian machine learning methods with the state-of-the-art SAGE-based parameter estimation algorithm. Existing sparse Bayesian methods allow the model order to be assessed through priors over the model parameters, but do not consider models that are nonlinear in their parameters; SAGE-based parameter estimation does accommodate nonlinear model structures, but lacks a mechanism for model order estimation. We show how Gaussian and Laplace priors can be applied to enforce sparsity and thereby determine the model order in the case of superimposed signals, and we develop an EM-based learning algorithm that efficiently estimates both the parameters of the superimposed signals and the prior parameters that control the sparsity of the learned models. Our work extends existing approaches to complex-valued data and to models nonlinear in their parameters. We also present new analytical and empirical studies of Laplace sparsity priors applied to complex data. The performance of the proposed algorithm is evaluated on synthetic data.