Mood classification of music is an emerging field of music information retrieval. In the approach presented here, features extracted from an audio file are used to map a song onto a psychologically grounded emotion space. The motivation behind this system is the lack of intuitive and contextually aware playlist-generation tools available to music listeners. The need for such tools is made evident by the fact that digital music libraries are constantly expanding, making it increasingly difficult to recall a particular song in a library or to create a playlist for a specific event. In this paper, we compare the performance of the proposed W-D-KNN classification method with that of other popular classifiers by applying them to a music database of 60 well-known English-language popular songs. Each song was annotated by 40 participants, and the emotions of the songs are distributed roughly uniformly across the four quadrants of the emotion plane. The experimental results show that the proposed W-D-KNN classifier achieves a recognition rate of more than 96% and outperforms the KNN and SVM classifiers.
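To illustrate the classification step at a high level, the sketch below shows a generic distance-weighted KNN vote over the four quadrants of the emotion plane. The feature dimensionality, Euclidean distance metric, and inverse-distance weighting are assumptions for illustration only; the exact weighting scheme of the proposed W-D-KNN may differ.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, k=5):
    """Predict an emotion-plane quadrant (1-4) for feature vector x using a
    distance-weighted KNN vote.

    NOTE: illustrative sketch only; the paper's W-D-KNN may use a different
    distance metric or neighbour-weighting scheme.
    """
    # Euclidean distance from the query to every training song (assumption).
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]          # indices of the k nearest songs

    # Each neighbour votes with weight inversely proportional to its distance,
    # so closer songs influence the predicted quadrant more strongly.
    votes = {}
    for idx in nearest:
        w = 1.0 / (dists[idx] + 1e-12)
        votes[y_train[idx]] = votes.get(y_train[idx], 0.0) + w

    # Return the quadrant with the largest accumulated weight.
    return max(votes, key=votes.get)

# Hypothetical usage: 60 songs, each described by a placeholder feature vector
# and labelled with one of the four emotion-plane quadrants.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(60, 12))      # placeholder audio features
    y_train = rng.integers(1, 5, size=60)    # quadrant labels 1..4
    query = rng.normal(size=12)
    print(weighted_knn_predict(X_train, y_train, query, k=5))
```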