In the context of Maximum Likelihood (ML) source separation in a semi-blind scenario, where the spectra of the sources are known and distinct, the likelihood equations amount to a set of matrix equations known as the "Sequentially Drilled" Joint Congruence (SeDJoCo) transformation. However, SeDJoCo often admits multiple solutions, only one of which is optimal, corresponding to the global maximum of the likelihood. In this paper we characterize the different solutions and propose a procedure for detecting whether a given solution is sub-optimal. Moreover, for such sub-optimal solutions we propose a procedure for re-initializing an iterative solver so that it converges to the optimal solution. Using simulations, we present the empirical probability of encountering a sub-optimal solution (with a given iterative algorithm), as well as the resulting separation improvement when applying our proposed re-initialization approach in such cases.
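To make the setting concrete, the SeDJoCo conditions require, for a set of symmetric positive-definite "target" matrices Q_1, ..., Q_K, a matrix B satisfying B Q_k B^T e_k = e_k for every k (e_k denoting the k-th standard unit vector). The sketch below is an illustrative Gauss-Seidel-style fixed-point solver for these equations, not necessarily any of the authors' algorithms; the toy construction of the target matrices (from an assumed true solution `B_true` and positive diagonal matrices `D_k` with unit k-th entry) is likewise an assumption made here for demonstration. Note that, as the abstract points out, such an iterative scheme may converge to any solution of the equations, optimal or not.

```python
import numpy as np

def solve_sedjoco(Qs, n_sweeps=500):
    """Illustrative fixed-point sketch for the SeDJoCo conditions
    B @ Q_k @ B.T @ e_k = e_k, k = 0..K-1 (not the paper's algorithm)."""
    K = Qs[0].shape[0]
    B = np.eye(K)                          # initial guess
    for _ in range(n_sweeps):
        for k in range(K):
            e_k = np.zeros(K)
            e_k[k] = 1.0
            # With the other rows held fixed, solve (B Q_k) x = e_k
            # so the off-diagonal conditions b_j^T Q_k x = 0 (j != k) hold
            x = np.linalg.solve(B @ Qs[k], e_k)
            # Rescale so the diagonal condition x^T Q_k x = 1 also holds
            x /= np.sqrt(x @ Qs[k] @ x)
            B[k] = x                       # Gauss-Seidel: update in place
    return B

def sedjoco_residual(B, Qs):
    """Largest violation of B Q_k B^T e_k = e_k over all k."""
    K = B.shape[0]
    I = np.eye(K)
    # B Q_k B^T e_k equals B Q_k b_k, with b_k the k-th row of B
    return max(np.linalg.norm(B @ Qs[k] @ B[k] - I[:, k]) for k in range(K))

# Toy problem with a known exact solution: Q_k = B_true^{-1} D_k B_true^{-T},
# with D_k positive diagonal and (D_k)_{kk} = 1, so B_true solves SeDJoCo.
rng = np.random.default_rng(0)
K = 3
B_true = np.eye(K) + 0.3 * rng.standard_normal((K, K))
B_inv = np.linalg.inv(B_true)
Qs = []
for k in range(K):
    d = 0.5 + rng.random(K)
    d[k] = 1.0
    Qs.append(B_inv @ np.diag(d) @ B_inv.T)

B_hat = solve_sedjoco(Qs)
print("residual:", sedjoco_residual(B_hat, Qs))
```

The residual check only verifies that `B_hat` satisfies the SeDJoCo equations; it does not distinguish the ML-optimal solution from a sub-optimal one, which is precisely the gap the paper's detection and re-initialization procedures address.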