This paper proposes a saliency-based visual attention model built on the pulsed cosine transform (PCT), which simulates the lateral surround inhibition among neurons with similar visual features. The model can be extended to Hebbian-based neural networks. Visual saliency is represented as binary codes, consistent with the firing pulses of neurons in the human brain, and motion saliency can be generated directly from these pulse codes. Owing to its good performance in eye-fixation prediction and its low computational complexity, the model is suitable for real-time systems such as robot navigation, virtual human systems, and the intelligent auto-focus systems embedded in digital cameras.
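To make the pipeline concrete, the following is a minimal sketch of a PCT-style saliency computation as commonly described for this family of models: the image's 2-D DCT coefficients are binarized by their sign (the "pulse codes"), the inverse transform is taken, and the squared response is smoothed. The function name, parameter values (e.g. the Gaussian width `sigma`), and the toy input are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from scipy.fft import dctn, idctn
from scipy.ndimage import gaussian_filter

def pct_saliency(image, sigma=3.0):
    """Saliency map from the sign (pulse codes) of the image's 2-D DCT.

    Sketch of the PCT idea: binarizing the DCT spectrum whitens the
    image, suppressing large homogeneous regions (akin to lateral
    surround inhibition) while preserving locally distinctive structure.
    """
    pulses = np.sign(dctn(image, norm="ortho"))   # binary pulse codes
    recon = idctn(pulses, norm="ortho")           # inverse transform
    sal = gaussian_filter(recon ** 2, sigma)      # smoothed energy map
    return sal / sal.max()                        # normalize to [0, 1]

# Usage: a bright blob on a flat background (illustrative input).
img = np.zeros((64, 64))
img[28:36, 28:36] = 1.0
smap = pct_saliency(img)
```

Because the per-pixel DCT and its inverse dominate the cost, the whole map is obtained in a few fast transforms, which is the source of the low computational complexity claimed above; a motion-saliency variant would compare the pulse codes of successive frames rather than recomputing full saliency maps.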