The field of human-machine interaction has potential applications in different domains, such as medical therapies for vulnerable persons. Thus, enabling the machine to identify and understand emotional states is one of the primordial stages of affective interaction with humans. Recent studies have shown that physiological signals contribute to emotion recognition. In this paper, we aim to classify the...
This paper aims to recognize human emotional states in three defined areas of the arousal-valence space: calm, medium arousal, and excited on the arousal axis, and unpleasant, neutral, and pleasant on the valence axis. Given the relevance of peripheral physiological signals to the emotion recognition problem, our contribution uses the multimodal dataset MAHNOB-HCI. In this database, there are emotional...
This paper brings together advances in emotion recognition theory and advances in speech features in order to improve the understanding of emotion under real-life conditions. It presents the application of a recently proposed feature extraction method based on spectral features, used for speech emotion recognition. Specifically, the performance of the proposed approach is evaluated on real condition...
Emotion recognition has become a widely investigated topic in affective computing, with several applications. The presented paper aims to recognize human emotions using peripheral physiological signals, namely electrocardiogram (ECG), galvanic skin response (GSR), skin temperature (Temp), and respiration volume (RV). To achieve this purpose, we develop our work with the multimodal database MAHNOB-HCI. The...
Under real-life conditions, the speech signal is often corrupted by several types of noise. To attenuate this issue, a noise reduction phase is performed before analyzing emotional speech, using enhancement algorithms. Three speech enhancement algorithms are introduced for improved emotion classification: spectral subtraction, the Wiener filter, and MMSE. Experiments were prepared with MFCC as feature vectors...
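The spectral-subtraction step named in this abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the frame size, hop, and the assumption that the first few frames are noise-only are ours.

```python
import numpy as np

def spectral_subtraction(signal, frame_len=512, hop=256, noise_frames=5):
    """Basic magnitude spectral subtraction with overlap-add resynthesis.

    Assumes the first `noise_frames` frames contain noise only; their
    average magnitude spectrum is used as the noise estimate.
    """
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop:i * hop + frame_len] * window
                       for i in range(n_frames)])
    spec = np.fft.rfft(frames, axis=1)
    mag, phase = np.abs(spec), np.angle(spec)
    noise_mag = mag[:noise_frames].mean(axis=0)
    # Subtract the noise estimate; clip negatives (half-wave rectification)
    clean_mag = np.maximum(mag - noise_mag, 0.0)
    clean_spec = clean_mag * np.exp(1j * phase)
    # Overlap-add the enhanced frames back into a time-domain signal
    out = np.zeros(len(signal))
    for i, frame in enumerate(np.fft.irfft(clean_spec, n=frame_len, axis=1)):
        out[i * hop:i * hop + frame_len] += frame
    return out
```

The Wiener filter and MMSE estimators mentioned alongside it differ mainly in how the gain applied to each spectral bin is computed, not in this analysis/synthesis framing.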
This paper proposes a new framework for emotion recognition and classification using the Continuous Wavelet Transform (CWT) for feature extraction from physiological signals. Data from the emotional corpus recorded at Augsburg University were used in our study. In the first phase, four wavelet families were chosen to analyze the EMG, RESP, SC, and ECG signals in order to extract emotional features at multiple levels...
In this paper, we propose a new system for human emotion recognition based on multiresolution analysis of physiological signals. In our study, we used four kinds of biosignals (EMG, RESP, ECG, and SC) recorded at the University of Augsburg. Daubechies, Symlet, Haar, and Morlet wavelet transforms were applied to analyze the non-stationary signals. Physiological features were extracted from the most relevant...
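The multiresolution feature extraction described here can be illustrated with the simplest of the listed wavelets, the Haar transform. A minimal sketch, assuming our own choice of decomposition depth and of energy/spread statistics per sub-band (the abstract does not specify which features the authors kept):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients; an odd trailing
    sample is dropped.
    """
    x = np.asarray(x, dtype=float)
    if len(x) % 2:
        x = x[:-1]
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # low-pass (approximation)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-pass (detail)
    return a, d

def wavelet_features(x, levels=3):
    """Energy and standard deviation of each detail band, plus the
    final approximation band: 2 * levels + 2 features per signal."""
    feats = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats += [np.sum(d ** 2), np.std(d)]
    feats += [np.sum(a ** 2), np.std(a)]
    return np.array(feats)
```

Because the Haar transform is orthonormal, the energies of the approximation and detail bands sum to the energy of the input, which makes band energies a natural feature choice for non-stationary biosignals.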
This paper presents a dialogue emotion recognition system using a Hidden Markov Model (HMM). We compared the accuracy of Mel-frequency cepstral coefficients (MFCC), energy, and wavelet sub-band energies, along with their first derivatives and all possible combinations. Based on our experiments, MFCC shows better performance in comparison with the other studied features. We have also evaluated the impact of gender...
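The MFCC features compared in several of these abstracts follow a standard pipeline: power spectrum, triangular mel filterbank, log compression, then a DCT. A single-frame sketch in NumPy, with our own (common but assumed) filter and coefficient counts:

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc_frame(frame, sr, n_filters=26, n_ceps=13):
    """MFCC of one windowed frame:
    power spectrum -> mel filterbank -> log -> DCT-II (first n_ceps)."""
    n_fft = len(frame)
    power = np.abs(np.fft.rfft(frame)) ** 2
    # Triangular filters spaced evenly on the mel scale
    mel_pts = np.linspace(hz_to_mel(0), hz_to_mel(sr / 2), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_filters, len(power)))
    for i in range(n_filters):
        l, c, r = bins[i], bins[i + 1], bins[i + 2]
        if c > l:
            fbank[i, l:c] = (np.arange(l, c) - l) / (c - l)   # rising edge
        if r > c:
            fbank[i, c:r] = (r - np.arange(c, r)) / (r - c)   # falling edge
    log_e = np.log(fbank @ power + 1e-10)
    # DCT-II decorrelates the log filterbank energies
    n = np.arange(n_filters)
    dct = np.cos(np.pi * np.outer(np.arange(n_ceps), 2 * n + 1)
                 / (2 * n_filters))
    return dct @ log_e
```

Per-frame MFCC vectors like these (and their first derivatives, as the abstract notes) would then form the observation sequence fed to the HMM.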