The paper considers a large class of additive neural networks whose neuron activations are modeled by discontinuous or non-Lipschitz functions. A result is established guaranteeing that the state and output solutions of the neural network converge globally, in finite time, toward a unique equilibrium point. The obtained result, which generalizes previous finite-time convergence results in the literature, is of interest for designing neural networks aimed at solving global optimization problems in real time.
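The abstract does not state the network model explicitly, so as a minimal sketch one can assume the classical additive form dx/dt = -Dx + Tg(x) + I and illustrate, on a scalar toy instance, why a discontinuous activation can drive the state to equilibrium in finite time while a Lipschitz one cannot. Here dx/dt = -sign(x) is a hypothetical one-neuron example (not the paper's network): its trajectory from x(0) = 1 hits the equilibrium 0 after exactly one time unit, whereas the Lipschitz field dx/dt = -x only decays asymptotically.

```python
import numpy as np

def simulate(x0, field, dt=0.001, t_end=2.0):
    """Forward-Euler integration of dx/dt = field(x) up to t_end."""
    x = x0
    for _ in range(int(t_end / dt)):
        x = x + dt * field(x)
    return x

# Discontinuous activation: dx/dt = -sign(x) reaches 0 in |x0| time units;
# the discrete trajectory ends within one Euler step of the equilibrium.
x_disc = simulate(1.0, lambda x: -np.sign(x))

# Lipschitz counterpart: dx/dt = -x converges only asymptotically,
# x(t) = exp(-t), so at t = 2 the state is still about 0.135.
x_lip = simulate(1.0, lambda x: -x)

print(abs(x_disc))  # within ~dt of 0
print(x_lip)        # still clearly nonzero
```

The contrast is the essence of the finite-time property: a non-Lipschitz right-hand side keeps a uniformly negative "drive" toward the equilibrium near it, instead of a drive that vanishes proportionally to the distance.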