In this paper, we present a general class of neural networks with discontinuous neuron activations and varying coefficients, where the neuron activation function is a discontinuous, monotone increasing, and bounded function. By using a fixed point theorem from differential inclusion theory and constructing suitable Lyapunov functions, a condition is derived that ensures the existence and global exponential stability of a unique periodic solution for the neural network. Furthermore, under certain conditions, global convergence of the state in finite time is investigated. The obtained results show that Forti's conjecture holds for neural networks without delays. Finally, two numerical examples are given to demonstrate the effectiveness of the results obtained in this paper.