The micro-Doppler (m-D) feature is regarded as a unique characteristic for target recognition. Sparse recovery based approaches for m-D parameter estimation using the single measurement vector (SMV) model have recently shown their effectiveness. However, the SMV model fits only narrowband m-D signals, and accurate parameters can hardly be estimated with SMV under strong noise. The signals with the same sparse...
Existing clustering algorithms need the number of clusters to be specified and initial points to be selected by human input, which leads to inferior clustering and optimisation outputs. Here, an improved grey decision-making model based on the ideas of the affinity propagation algorithm and grey correlation analysis is proposed to solve these problems. According to the panel data class and the inter-class candidate...
This paper proposes a hybrid negative correlation learning in which each individual neural network in a neural network ensemble would either learn a data point by negative correlation learning or learn to be different from the neural network ensemble. The implementation is through randomly splitting the training set into two subsets for each individual neural network in learning. On one subset of the...
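The per-learner split and the two error regimes described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 50/50 split ratio and the pure-decorrelation term used on the second subset are assumptions, and `hybrid_errors` and `random_split_mask` are hypothetical names.

```python
import numpy as np

def hybrid_errors(f_i, F, d, mask, lam=0.5):
    """Per-example error for one learner in a hybrid NCL scheme (sketch).

    f_i: this learner's outputs; F: ensemble outputs; d: targets.
    mask: True -> example falls in this learner's NCL subset,
          False -> the learner only learns to differ from the ensemble.
    """
    ncl = 0.5 * (f_i - d) ** 2 - lam * (f_i - F) ** 2  # standard NCL error
    diff = -(f_i - F) ** 2                             # pure decorrelation
    return np.where(mask, ncl, diff)

def random_split_mask(n, rng):
    """Random 50/50 split of the n training examples for one learner
    (the exact split ratio is an assumption)."""
    mask = np.zeros(n, dtype=bool)
    mask[rng.choice(n, n // 2, replace=False)] = True
    return mask
```

Each learner draws its own mask, so different learners decorrelate on different parts of the training set.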
In ensemble learning methods for training individual learners in a committee machine, two learning terms should be optimized: minimization of the squared difference between the target and the learner's output, and of the estimated correlation between the learner and the rest of the learners in the ensemble. The first term forces each learner to learn the given data. The second term...
It is certain that the individual learners should differ from each other in order for a committee machine to achieve better performance. However, differences alone among the individual learners are not enough for the committee machine to predict well on unknown data. It would be essential for each individual learner to be able to decide whether or not to learn to be different from the other...
Negative correlation learning is an ensemble learning approach that is able to create negatively correlated learners simultaneously and cooperatively in a committee machine. One problem in negative correlation learning is that the learning error functions are defined in the same way for all individual learners. Learners have little choice in making their own decisions on how to learn a given data...
Negative correlation learning has been proposed to create a set of negatively correlated artificial neural networks (ANNs) in a committee machine. In negative correlation learning, the error signal for each ANN on given data is not decided solely by the difference between the ANN's output and the target. Two terms are optimized at the same time. The first one is to minimize the error between...
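The two simultaneously optimized terms can be written down concretely. A minimal sketch, assuming the usual NCL formulation in which the penalty for learner i reduces to p_i = -(f_i - F)^2 with F the ensemble mean and λ the penalty weight (`ncl_errors` is a hypothetical name):

```python
import numpy as np

def ncl_errors(outputs, target, lam=0.5):
    """Per-learner NCL error on one example: squared error plus
    a correlation penalty against the ensemble mean.

    outputs: array of the individual learners' outputs f_i.
    target:  the desired output d.
    lam:     penalty strength; lam = 0 reduces to independent training.
    """
    F = outputs.mean()                      # ensemble output
    mse = 0.5 * (outputs - target) ** 2     # first term: fit the data
    penalty = -(outputs - F) ** 2           # second term: p_i = -(f_i - F)^2
    return mse + lam * penalty
```

With λ > 0, a learner whose output already sits far from the ensemble mean is rewarded for staying there, which is what makes the learners negatively correlated.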
Two different implementations of negative correlation learning with λ > 1 are discussed in this paper. In the first implementation, every learner is forced to learn to be different from the ensemble on every data point, no matter what has been learned by the ensemble and by itself. In the second implementation, every learner selectively learns to be different from the ensemble on every data point...
Unlike independent and sequential learning, negative correlation learning trains all learners in an ensemble simultaneously and cooperatively, with direct interactions. In negative correlation learning, each learner can be trained with error signals based only on the differences between the output of the ensemble and the target output on a given example, without considering whether it itself has...
Self-awareness is the ability to recognize oneself as an individual distinct from the environment and other individuals. This paper proposes negative correlation learning with self-awareness, in order for each artificial neural network (ANN) in a committee machine to be self-aware in learning, so that it can decide by itself to learn more or less. On one hand, when the learning would...
Previous feature learning based blind image quality assessment (BIQA) methods invariably require a large codebook or a codebook-updating procedure to obtain satisfactory performance. In this paper, we propose a novel general purpose BIQA method, the local feature aggregation (LFA) model, which requires only a much smaller codebook without the need for codebook updating. The proposed model consists of three...
Unlike other re-sampling ensemble learning methods, negative correlation learning trains all individual models in an ensemble simultaneously and cooperatively. In negative correlation learning, each individual can see all training data, and adapts its target function based on what the rest of the individuals in the ensemble have learned. In this paper, two error bounds are introduced in negative correlation...
Balanced ensemble learning is developed from negative correlation learning by shifting the learning targets. Compared to negative correlation learning, balanced ensemble learning is able to learn faster and achieve higher accuracy on the training sets for a number of the tested classification problems. However, it has been found that the higher accuracy balanced ensemble learning obtained...
An ensemble learning system can lessen the degree of overfitting that often appears in the supervised learning process for a single learning model. However, overfitting has still been observed in negative correlation learning, an ensemble learning method with a correlation-based penalty. Two constraints are introduced into negative correlation learning in order to overcome such overfitting. One...
Hyperspectral image denoising and unmixing are two separate stages in traditional works, with the unmixing algorithm applied after denoising. Unmixing performance improves if the noise in the hyperspectral image is removed well, but the unmixing result cannot be used to improve the denoising result. In this paper we propose a joint denoising and unmixing algorithm for hyperspectral image...
Online videos, e.g., YouTube videos, are important topics for social interactions among users of online social networking sites (OSN), e.g., Facebook. This opens up the possibility of exploiting video-related user social interaction information for better video recommendation. Towards this goal, we conduct a case study of recommending YouTube videos to Facebook users based on their social interactions...
Researchers tend to agree that the increasing quantity of data has made information discovery, management and reuse more complex and difficult. An essential factor is the growing number of channels for information sharing. Finding information, especially meaningful or useful information that serves the user's ultimate task, has become harder than it used to be. In this research, issues concerning...
It has been proved that there is a bias-variance-covariance trade-off among trained neural network ensembles. In this paper, extra learning on random data points is proposed to control the variations of the correlations in negative correlation learning (NCL). Without the control of the correlations, NCL might produce arbitrary values on the unknown data points after learning too much on the...
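The bias-variance-covariance trade-off mentioned above can be verified numerically: for an ensemble that averages its members, the mean squared error decomposes exactly into average bias squared, average variance scaled by 1/M, and average pairwise covariance scaled by (1 - 1/M). The synthetic learner outputs below are illustrative only, not data from any of these papers.

```python
import numpy as np

# Numeric check of the bias-variance-covariance decomposition for an
# averaging ensemble of M learners over many simulated trials.
rng = np.random.default_rng(42)
M, trials, d = 5, 10000, 1.0

shared = rng.normal(size=(trials, 1))            # shared component -> covariance
f = rng.normal(loc=1.1, size=(trials, M)) + 0.5 * shared

ens = f.mean(axis=1)                             # ensemble output per trial
mse = np.mean((ens - d) ** 2)

mu = f.mean(axis=0)
bias = np.mean(mu - d)                           # average bias
var = np.mean(f.var(axis=0))                     # average variance
C = np.cov(f, rowvar=False, bias=True)
cov = (C.sum() - np.trace(C)) / (M * (M - 1))    # average pairwise covariance

decomp = bias**2 + var / M + (1 - 1 / M) * cov
assert np.isclose(mse, decomp)                   # the trade-off identity holds
```

The covariance term is what NCL manipulates: driving pairwise covariances negative lowers the ensemble MSE even when each member's own variance stays fixed.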
In this paper, we present a clustering algorithm that improves on the multi-objective clustering ensemble algorithm (MOCLE), denoted IMOCLE for short. First, we introduce a new clustering objective function to measure individual differences in the optimization process so as to maintain the diversity of the population. Then, a clustering ensemble technique is applied to MOCLE to...
In neural network learning, it has often been observed that some data have been learned extremely well while others have barely been learned. Such unbalanced learning often leads to learned neural networks or neural network ensembles that can be too strongly biased toward the well-learned data. The stronger bias can contribute to larger variance and poorer generalization on the unseen...