This paper proposes a new technique for optimizing multi-frequency tests for linear analog circuits. Fault simulation is used to obtain the frequency intervals within which each fault is detectable. New, efficient algorithms are then presented for selecting an optimal set of test frequencies within these intervals such that all faults are detected. Numerical simulations on randomly generated problem instances demonstrate the favorable time complexity of the proposed algorithms, a large improvement over previous approaches (Mir et al., 1996).
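The abstract does not specify the selection algorithms, but the frequency-selection step it describes can plausibly be viewed as a minimum piercing-set problem: choose the fewest test frequencies such that every fault's detection interval contains at least one chosen frequency. The sketch below (a hypothetical illustration, not the paper's actual method; the function name and interval representation are assumptions) uses the classic greedy rule for intervals on a line, which sorts intervals by right endpoint and places a frequency at each uncovered interval's right end:

```python
def select_test_frequencies(intervals):
    """intervals: list of (f_low, f_high) detection intervals, one per fault.

    Returns a minimum-size list of test frequencies such that every
    interval contains at least one selected frequency. The greedy rule
    (sort by right endpoint, pierce at the right endpoint) is optimal
    for one-dimensional intervals.
    """
    frequencies = []
    last = None  # most recently selected frequency
    for lo, hi in sorted(intervals, key=lambda iv: iv[1]):
        if last is None or lo > last:  # this fault's interval not yet covered
            last = hi                  # pierce at its right endpoint
            frequencies.append(hi)
    return frequencies
```

For example, the intervals (1, 3), (2, 5), and (6, 8) would be covered by the two frequencies 3 and 8, since the frequency 3 lies in both of the first two intervals.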