We consider fault detection through apparent changes in the bus susceptance parameters of modern power grids. We formulate the problem using a linear errors-in-variables model and derive the corresponding generalized likelihood ratio test (GLRT) based on the total least squares (TLS) methodology. Next, we propose a competing detection technique based on the recently proposed total maximum likelihood (TML) framework. We derive the so-called TML-GLRT and show that it can be interpreted as a regularized TLS-GLRT. Numerical simulations in a noisy smart grid setting illustrate the advantages of the TML-GLRT over the TLS-GLRT at no additional computational cost.