In this paper, we consider the problem of constrained global optimization of a continuous multivariable function. We propose a global descent function technique that requires only a single, easily adjustable parameter. The characteristic property of the proposed function is that each of its local minimizers satisfying the constraints is a better local minimizer of the objective function or, at least, an approximate local minimizer within a given tolerance. Several other properties of the new function are investigated in order to establish a corresponding optimization algorithm. We have performed numerical experiments with this algorithm on a set of standard test problems; the results illustrate the efficiency of our approach.