This paper discusses how to apply ensemble learning to individual learners on randomly split data. Rather than letting the individual learners learn independently on different subsets, it would be better for them to learn cooperatively by exchanging the learned values. In this way, the individual learners could learn the whole of the given data together while they...
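The truncated abstract does not spell out the exchange mechanism. Below is a minimal sketch under the assumption that the exchanged "learned values" are the learners' weight vectors, pulled toward the group average between rounds of local training; the data, learning rate, and mixing weight are illustrative, not taken from the paper.

```python
import numpy as np

# Minimal sketch of cooperative learning on randomly split data.
# Assumption (not from the abstract): the "learned values" exchanged
# between learners are their weight vectors, averaged after each round.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=200)

parts = np.array_split(rng.permutation(200), 4)   # random split of the data
W = [np.zeros(5) for _ in parts]                  # one linear learner per subset

for _ in range(50):                               # training rounds
    for i, idx in enumerate(parts):               # local gradient step
        grad = X[idx].T @ (X[idx] @ W[i] - y[idx]) / len(idx)
        W[i] -= 0.1 * grad
    mean_w = np.mean(W, axis=0)                   # exchange the learned values
    W = [0.5 * (w + mean_w) for w in W]           # pull each learner toward consensus

print(np.round(np.mean(W, axis=0), 2))            # close to the true weights
```

With the exchange step, each learner ends up fitting the whole data set even though it only ever computes gradients on its own subset; without it, each learner would overfit its split.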
This paper proposes a hybrid negative correlation learning in which each individual neural network in a neural network ensemble would either learn a data point by negative correlation learning or learn to be different from the neural network ensemble. The method is implemented by randomly splitting the training set into two subsets for each individual neural network. On one subset of the...
Data mining can extract information of interest from large amounts of data. Association rule mining can find associations among data items. Data classification distinguishes each data point within a data set or group, and it can also be combined with association mining. Formal concept analysis is a data analysis theory that discovers the concept structure in data sets. It can transform a formal context into...
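To make the last point concrete, here is a small formal concept analysis sketch; it is a generic brute-force enumeration, not the paper's algorithm, and the object and attribute names are illustrative. A formal context relates objects to attributes, and each formal concept is a pair (extent, intent) closed under the two derivation operators.

```python
from itertools import combinations

# Toy formal context: object -> set of attributes (names are illustrative).
context = {
    "o1": {"a", "b"},
    "o2": {"a", "c"},
    "o3": {"a", "b", "c"},
}
attributes = {"a", "b", "c"}

def common_attrs(objs):    # derivation: extent -> intent
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def common_objs(attrs):    # derivation: intent -> extent
    return {o for o, a in context.items() if attrs <= a}

# Close every attribute subset; fine for tiny contexts like this one.
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        extent = common_objs(set(attrs))
        intent = common_attrs(extent)      # closure of the attribute subset
        concepts.add((frozenset(extent), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))
```

Each printed pair is one node of the concept lattice that formal concept analysis builds from the context.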
It is certain that the individual learners should differ from each other in order for a committee machine to reach better performance. However, differences among the individual learners alone are not enough for the committee machine to predict well on unknown data. It would be essential for each individual learner to be able to decide whether or not to learn to be different from the other...
Unlike independent and sequential learning, negative correlation learning trains all learners in an ensemble simultaneously and cooperatively, with direct interactions. In negative correlation learning, each learner can be trained by error signals based only on the differences between the output of the ensemble and the target output on a given example, without considering whether the learner itself has...
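For reference, the standard negative correlation learning error (Liu and Yao's formulation; the truncated abstract does not confirm that this exact notation is used here) adds a correlation penalty to each learner's squared error:

```latex
% NCL error for learner i on example n, with penalty coefficient \lambda:
e_i(n) = \tfrac{1}{2}\bigl(F_i(n) - d(n)\bigr)^2 + \lambda\, p_i(n),
\qquad
p_i(n) = \bigl(F_i(n) - F(n)\bigr)\sum_{j \neq i}\bigl(F_j(n) - F(n)\bigr)
```

where $F(n) = \frac{1}{M}\sum_j F_j(n)$ is the ensemble output and $d(n)$ the target. Since $\sum_j (F_j(n) - F(n)) = 0$, the penalty equals $-(F_i(n) - F(n))^2$, and treating $F(n)$ as constant gives the update signal $(F_i(n) - d(n)) - \lambda\,(F_i(n) - F(n))$, which matches the abstract's description of error signals driven by the difference between the ensemble output and the target.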
Distance metric learning (DML) is an important technique for improving similarity search in content-based image retrieval. Although DML has been studied extensively, most existing approaches adopt a single-modal learning framework that learns the distance metric on either a single feature type or a combined feature space where multiple types of features are simply concatenated. Such single-modal...
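For contrast with the multi-modal setting the abstract argues for, a minimal single-modal DML baseline can be sketched as follows: a diagonal Mahalanobis metric learned from pairs labeled similar or dissimilar with a contrastive hinge loss. The data, pair labels, margin, and learning rate are all illustrative assumptions, not the paper's method.

```python
import numpy as np

# Single-modal DML baseline sketch: learn diagonal metric weights w so that
# d(x, y)^2 = sum_k w_k (x_k - y_k)^2 is small for similar pairs and larger
# than a margin for dissimilar pairs.

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))
pairs = [(rng.integers(100), rng.integers(100)) for _ in range(300)]
sim = [int(np.linalg.norm(X[i] - X[j]) < 3.5) for i, j in pairs]  # toy labels

w = np.ones(8)
margin, lr = 4.0, 0.01
for _ in range(100):
    for (i, j), s in zip(pairs, sim):
        diff2 = (X[i] - X[j]) ** 2
        d2 = w @ diff2
        if s:                        # pull similar pairs together
            w -= lr * diff2
        elif d2 < margin:            # push dissimilar pairs past the margin
            w += lr * diff2
        w = np.clip(w, 0.0, None)    # keep the metric positive semi-definite
```

A multi-modal approach would instead learn a separate metric per feature type and combine them, rather than concatenating features into one space as above.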
Unlike other re-sampling ensemble learning methods, negative correlation learning trains all individual models in an ensemble simultaneously and cooperatively. In negative correlation learning, each individual can see all of the training data and adapt its target function based on what the rest of the individuals in the ensemble have learned. In this paper, two error bounds are introduced into negative correlation...
Two error bounds are introduced into the learning process of balanced ensemble learning: the lower bound of error rate (LBER) and the upper bound of error output (UBEO), both measured on the training set. These two error bounds decide whether a training data point should be learned further once balanced ensemble learning has reached a certain stage. Before the error rates are higher...
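The exact gating rules are cut off in the abstract, so the sketch below is an assumption about how such bounds could be applied: a point continues to be learned only while the training error rate is still above LBER and the point's own error output does not exceed UBEO (large-error points being treated as likely noise). The bound values are illustrative.

```python
# Hypothetical gating with the two bounds named in the abstract.
LBER = 0.05   # lower bound of error rate (illustrative value)
UBEO = 0.8    # upper bound of error output (illustrative value)

def should_learn(point_error_output: float, ensemble_error_rate: float) -> bool:
    if ensemble_error_rate <= LBER:      # training error already low: stop learning
        return False
    return point_error_output <= UBEO    # skip points that look like noise/outliers

batch = [0.1, 0.95, 0.4]                 # per-point error outputs (illustrative)
rate = 0.12                              # current error rate on the training set
print([should_learn(e, rate) for e in batch])  # [True, False, True]
```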
Outlier detection is a method for improving the performance of machine learning models. In this paper, we use an outlier detection method to improve the performance of our proposed algorithm, called decision boundary making (DBM). The primary objective of the DBM algorithm is to induce compact, high-performance machine learning models. To obtain such a model, DBM reconstructs the performance of support...
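The DBM details are beyond the truncated abstract; below is a generic sketch of the pre-filtering idea, using a kNN-distance outlier score. The detector choice and the 5% removal fraction are assumptions, not taken from the paper.

```python
import numpy as np

# Score each training point by its mean distance to its k nearest neighbours
# and drop the highest-scoring fraction before fitting the downstream model.

def knn_outlier_scores(X: np.ndarray, k: int = 5) -> np.ndarray:
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    d.sort(axis=1)
    return d[:, 1:k + 1].mean(axis=1)       # column 0 is the zero self-distance

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(size=(95, 2)),    # inliers
               rng.normal(6.0, 0.5, size=(5, 2))])  # a small outlying cluster
scores = knn_outlier_scores(X)
keep = scores <= np.quantile(scores, 0.95)  # drop the top 5% as outliers
X_clean = X[keep]                           # train DBM / SVM etc. on X_clean
print(len(X_clean), "of", len(X), "points kept")
```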
Balanced ensemble learning is developed from negative correlation learning by shifting the learning targets. Compared to negative correlation learning, balanced ensemble learning is able to learn faster and achieve higher accuracy on the training sets for a number of tested classification problems. However, it has been found that the higher accuracy balanced ensemble learning obtained...
An ensemble learning system can lessen the degree of overfitting that often appears in the supervised learning process for a single learning model. However, overfitting has still been observed in negative correlation learning, an ensemble learning method with a correlation-based penalty. Two constraints are introduced into negative correlation learning in order to overcome such overfitting. One...
It has been proved that there is a bias-variance-covariance trade-off among trained neural network ensembles. In this paper, extra learning on random data points is proposed to control the variations of the correlations in negative correlation learning (NCL). Without control of the correlations, NCL might produce arbitrary values on unknown data points after learning too much on the...
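The trade-off referred to is the standard bias-variance-covariance decomposition of an ensemble's expected squared error (Ueda and Nakano's result for the average of M estimators); the overlined quantities below are averages over the M ensemble members:

```latex
% Expected squared error of the ensemble average \bar{f} of M estimators:
E\bigl[(\bar{f} - d)^2\bigr]
  = \overline{\mathrm{bias}}^{\,2}
  + \frac{1}{M}\,\overline{\mathrm{var}}
  + \Bigl(1 - \frac{1}{M}\Bigr)\,\overline{\mathrm{covar}}
```

As M grows, the variance term shrinks and the covariance term dominates, so making the members negatively correlated attacks exactly the dominant term; this is why controlling the correlations, as the abstract proposes, matters for generalization.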
It has been shown that as the number of weak learners in a majority voting model increases, so does its generalization, provided those weak learners are uncorrelated or negatively correlated. Although learning algorithms such as bagging and boosting have been developed to create such weak learners, the learners they train are actually not so weak in many applications. This...
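The uncorrelated case of this claim can be checked with a Condorcet-style calculation: for independent voters with individual accuracy p > 0.5, majority-vote accuracy rises toward 1 as the number of voters M grows. The sketch below illustrates this; negatively correlated learners, as in the abstract's setting, do even better.

```python
from math import comb

# P(majority of M independent voters is correct) when each is correct
# with probability p; odd M avoids ties.
def majority_accuracy(p: float, M: int) -> float:
    return sum(comb(M, k) * p**k * (1 - p)**(M - k)
               for k in range(M // 2 + 1, M + 1))

for M in (1, 11, 101, 1001):
    print(M, round(majority_accuracy(0.55, M), 4))
# roughly: 1 -> 0.55, 11 -> 0.63, 101 -> 0.84, 1001 -> 0.999
```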
Ensemble methods that train multiple learners and then combine their predictions have been shown to be very effective in supervised learning. However, bagging does not work very well in some cases, such as with k-nearest neighbor (kNN) classifiers. Likewise, query learning strategies based on bagging also do not work very well. From the feature view, we introduce bagging-features active learning (ALBF) for kNN and apply...
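The following sketch shows the general idea behind bagging features for kNN, not necessarily the paper's exact ALBF procedure: each ensemble member sees a random subset of features and the members majority-vote. Bootstrapping samples barely perturbs kNN, but resampling features does produce diverse members. Member count, feature fraction, and k are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def knn_predict(Xtr, ytr, Xte, k=3):
    # Brute-force kNN with binary majority vote over the k nearest points.
    d = np.linalg.norm(Xte[:, None, :] - Xtr[None, :, :], axis=-1)
    nn = np.argsort(d, axis=1)[:, :k]
    return (ytr[nn].mean(axis=1) > 0.5).astype(int)

def feature_bagged_knn(Xtr, ytr, Xte, members=9, frac=0.5, k=3):
    # Each member uses a random feature subset; ensemble majority-votes.
    n_feat = Xtr.shape[1]
    preds = []
    for _ in range(members):
        cols = rng.choice(n_feat, size=max(1, int(frac * n_feat)), replace=False)
        preds.append(knn_predict(Xtr[:, cols], ytr, Xte[:, cols], k))
    return (np.mean(preds, axis=0) > 0.5).astype(int)
```

In an active learning loop, the disagreement among these feature-bagged members can serve as the query criterion for selecting which unlabeled points to label next.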