In this paper we present a systematic exploration of several EEG-based features for classifying three emotional states (happy, fearful, and neutral) pertaining to face perception. EEG data were acquired through a 19-channel wireless system from eight adults under two conditions: in a constrained position and during head-body movements. The movement EEG data were pre-processed using an artifact reduction algorithm, and both datasets were processed to extract neurophysiological features, namely ERP components and functional connectivity measures. The functional connectivity measures were processed using a brain connectivity toolbox and gray-level co-occurrence matrices to generate a total of 463 features. The feature set was split into a training dataset comprising both constrained and movement EEG data, and a test dataset comprising only movement EEG data. A retrospective cross-validation approach was run on the training dataset, in conjunction with two classifiers (LDA and SVM) and the ranked feature set, to select the best features using a sequential forward selection algorithm. The best features were then used to prospectively classify the three emotions in the test dataset. Our results show that the three emotions can be successfully classified using LDA with an accuracy of 89%, using the top 17 ranked features.
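The selection and classification pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses scikit-learn's `SequentialFeatureSelector` with cross-validated LDA, and synthetic data stand in for the real 463-feature EEG set (the sample counts, feature counts, and CV settings are assumptions).

```python
# Hedged sketch: sequential forward selection of 17 features with
# cross-validated LDA, then prospective evaluation on a held-out test set.
# Synthetic data replace the real EEG feature set; all sizes are assumptions.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import train_test_split

# Synthetic 3-class problem (happy / fearful / neutral) with many features.
X, y = make_classification(n_samples=240, n_features=60, n_informative=17,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

lda = LinearDiscriminantAnalysis()
# Forward selection of the top 17 features, scored by 5-fold cross-validation
# on the training set (the "retrospective" stage).
sfs = SequentialFeatureSelector(lda, n_features_to_select=17,
                                direction="forward", cv=5)
sfs.fit(X_train, y_train)

# Prospective stage: refit LDA on the selected features and score the test set.
lda.fit(sfs.transform(X_train), y_train)
accuracy = lda.score(sfs.transform(X_test), y_test)
print(f"selected {sfs.get_support().sum()} features, test accuracy {accuracy:.2f}")
```

Swapping `LinearDiscriminantAnalysis` for `sklearn.svm.SVC` would reproduce the SVM arm of the comparison under the same selection procedure.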