This paper addresses the scalable optimization of sensor networks for distributed detection applications. In the general case, the jointly optimal solution for the local sensor decision rules and the fusion rule is extremely difficult to obtain and does not scale with the number of sensors. In this paper, we consider the optimization of distributed detection systems based on a local metric for sensor detection performance. Derived from the asymptotic error exponents in binary hypothesis testing, the Chernoff information emerges as an appropriate metric for sensor detection quality. By locally maximizing the Chernoff information at each sensor and thus decoupling the optimization problem, scalable solutions are obtained that are also robust with respect to the underlying prior probabilities. A detailed numerical study of detecting a deterministic signal in Gaussian noise illustrates the feasibility of the proposed approach.
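As a minimal sketch of the metric the abstract refers to: for the Gaussian mean-shift case, the Chernoff information between H0: N(0, σ²) and H1: N(μ, σ²) has the standard closed form μ²/(8σ²), attained at the Chernoff parameter s = 1/2. The snippet below (not from the paper; an illustrative assumption) computes it by numerically maximizing the Chernoff exponent s(1−s)μ²/(2σ²) over s ∈ [0, 1] and can be checked against the closed form.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def chernoff_information(mu, sigma):
    """Chernoff information between N(0, sigma^2) and N(mu, sigma^2).

    C = max_{s in [0,1]} -log \int p0(x)^{1-s} p1(x)^s dx.
    For equal-variance Gaussians the integral has the closed form
    exp(-s(1-s) mu^2 / (2 sigma^2)), so the exponent to maximize is
    s(1-s) mu^2 / (2 sigma^2), with maximizer s = 1/2 and value
    mu^2 / (8 sigma^2).
    """
    def neg_exponent(s):
        # Negative Chernoff exponent (minimized by scipy).
        return -(s * (1.0 - s) * mu**2 / (2.0 * sigma**2))

    res = minimize_scalar(neg_exponent, bounds=(0.0, 1.0), method="bounded")
    return -res.fun

# Sanity check against the closed form mu^2 / (8 sigma^2):
mu, sigma = 2.0, 1.0
C = chernoff_information(mu, sigma)
print(C, mu**2 / (8.0 * sigma**2))
```

Maximizing this quantity per sensor (e.g. over the local quantizer or decision threshold) is what decouples the design problem sensor by sensor.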