Residual network (ResNet) is an effective instance and a significant extension of the deep convolutional neural network. ResNet uses skip connections between input layers and output layers to solve the vanishing gradient problem. Thanks to skip connections, the gradient can flow directly through the identity function from later layers to earlier layers. However, skip connections make...
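The identity-function behaviour described above can be sketched in a few lines of NumPy. The two-layer residual branch, the ReLU activation, and the weight shapes are illustrative assumptions, not details from the abstract:

```python
import numpy as np

def relu(x):
    # Elementwise rectified linear unit
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    """Compute F(x) + x: a two-layer residual branch plus the skip connection."""
    h = relu(x @ W1)        # first transformation of the residual branch
    fx = h @ W2             # second transformation (the residual function F)
    return fx + x           # skip connection adds the input back

rng = np.random.default_rng(0)
d = 4
x = rng.standard_normal(d)
W1 = rng.standard_normal((d, d)) * 0.1
W2 = rng.standard_normal((d, d)) * 0.1
y = residual_block(x, W1, W2)

# With zero weights the residual branch vanishes and the block reduces to
# the identity function -- the path along which gradients flow directly
# from later layers back to earlier layers.
assert np.allclose(residual_block(x, np.zeros((d, d)), np.zeros((d, d))), x)
```

The assertion at the end checks the identity-mapping property that makes the skip connection useful for gradient flow.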
This paper discusses how to apply ensemble learning to individual learners trained on randomly split data. Rather than letting the individual learners learn independently on different subsets, it would be better for them to learn cooperatively by exchanging the learned values. In this way, the individual learners could learn the whole given data together while they...
This paper proposes a hybrid negative correlation learning in which each individual neural network in a neural network ensemble either learns a data point by negative correlation learning or learns to be different from the neural network ensemble. The implementation randomly splits the training set into two subsets for each individual neural network. On one subset of the...
In ensemble learning methods for training individual learners in a committee machine, two learning items should be optimized: minimization of the squared difference between the target and the learner's output, and minimization of the estimated correlation between the learner and the rest of the learners in the ensemble. The first term forces each learner to learn the given data. The second term...
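The two terms above can be sketched as the usual per-learner error of negative correlation learning, E_i = 1/2 (F_i - d)^2 + λ p_i with penalty p_i = (F_i - F̄) Σ_{j≠i} (F_j - F̄), where F̄ is the simple-average ensemble output. The penalty strength λ and the toy values are illustrative assumptions:

```python
import numpy as np

def ncl_errors(outputs, target, lam=0.5):
    """Return the negative-correlation-learning error of each learner
    for a single data point, given all learners' outputs."""
    outputs = np.asarray(outputs, dtype=float)
    fbar = outputs.mean()                           # ensemble (simple average) output
    errs = []
    for i, fi in enumerate(outputs):
        others = np.delete(outputs, i)
        p_i = (fi - fbar) * np.sum(others - fbar)   # correlation-based penalty
        errs.append(0.5 * (fi - target) ** 2 + lam * p_i)
    return np.array(errs)

# With lam = 0 the error reduces to the plain squared difference.
plain = ncl_errors([1.0, 2.0, 3.0], target=2.0, lam=0.0)
```

Note that Σ_{j≠i}(F_j - F̄) = -(F_i - F̄), so p_i = -(F_i - F̄)²: the penalty rewards each learner for deviating from the ensemble mean, which is what makes the learners negatively correlated.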
It is certain that the individual learners should differ from each other for a committee machine to reach better performance. However, differences alone among the individual learners are not enough for the committee machine to predict well on unknown data. It would be essential for each individual learner to be able to decide whether or not to learn to be different from the other...
Negative correlation learning is an ensemble learning approach that creates negatively correlated learners simultaneously and cooperatively in a committee machine. One problem in negative correlation learning is that the learning error functions are defined in the same way for all individual learners. Learners have little choice in making their own decisions on how to learn a given data...
Negative correlation learning has been proposed to create a set of negatively correlated artificial neural networks (ANNs) in a committee machine. In negative correlation learning, the error signals for each ANN on given data are not decided solely by the differences between the ANN's output and the targets. Two terms are optimized at the same time. The first is to minimize the error between...
Two different implementations of negative correlation learning with λ > 1 are discussed in this paper. In the first implementation, every learner is forced to learn to be different from the ensemble on every data point, no matter what has been learned by the ensemble and by itself. In the second implementation, every learner selectively learns to be different from the ensemble on every data point...
Unlike independent and sequential learning, negative correlation learning trains all learners in an ensemble simultaneously and cooperatively, with direct interactions. In negative correlation learning, each learner can be trained by error signals based only on the differences between the output of the ensemble and the target output on a given example, without considering whether it itself has...
Self-awareness is the ability to recognize oneself as an individual distinct from the environment and from other individuals. This paper proposes negative correlation learning with self-awareness, so that each artificial neural network (ANN) in a committee machine is self-aware in learning and can decide by itself to learn more or less. On one hand, when the learning would...
Unlike other re-sampling ensemble learning methods, negative correlation learning trains all individual models in an ensemble simultaneously and cooperatively. In negative correlation learning, each individual can see all training data and adapt its target function based on what the rest of the individuals in the ensemble have learned. In this paper, two error bounds are introduced in negative correlation...
Two error bounds were introduced into the learning process of balanced ensemble learning: the lower bound of error rate (LBER) and the upper bound of error output (UBEO) on the training set. These two error bounds decide whether a training data point should be learned further once balanced ensemble learning has reached a certain stage. Before the error rates are higher...
Balanced ensemble learning is developed from negative correlation learning by shifting the learning targets. Compared to negative correlation learning, balanced ensemble learning is able to learn faster and achieve higher accuracy on the training sets for a number of tested classification problems. However, it has been found that the higher accuracy balanced ensemble learning obtained...
An ensemble learning system can lessen the degree of overfitting that often appears in the supervised learning process for a single learning model. However, overfitting has still been observed in negative correlation learning, an ensemble learning method with a correlation-based penalty. Two constraints were introduced into negative correlation learning to overcome such overfitting. One...
It has been proved that there is a bias-variance-covariance trade-off among trained neural network ensembles. In this paper, extra learning on random data points is proposed to control the variations of the correlations in negative correlation learning (NCL). Without control of the correlations, NCL might produce arbitrary values on unknown data points after learning too much on the...
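The trade-off referred to above is commonly stated via the bias-variance-covariance decomposition for a simple-average ensemble of M learners (a sketch; bars denote averages over the learners, and the specific form follows the standard decomposition rather than the truncated text):

```latex
E\big[(\bar f - d)^2\big]
  = \overline{\mathrm{bias}}^{\,2}
  + \frac{1}{M}\,\overline{\mathrm{var}}
  + \Big(1 - \frac{1}{M}\Big)\,\overline{\mathrm{covar}}
```

As M grows, the variance term shrinks while the covariance term dominates, which is why controlling the correlations among learners matters.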
In neural network learning, it has often been observed that some data are learned extremely well while others are barely learned. Such unbalanced learning often leads to learned neural networks or neural network ensembles that are too strongly biased toward the well-learned data. The stronger bias could contribute to larger variance and poorer generalization on unseen...
It has been shown that as the number of weak learners in a majority voting model increases, so does its generalization performance, provided those weak learners are uncorrelated or negatively correlated. Although some learning algorithms, including bagging and boosting, have been developed to create such weak learners, the learners trained by these algorithms are actually not so weak in many applications. This...
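The claim above, for the uncorrelated case, can be checked with a small simulation. Each weak learner is modelled as a classifier that is correct with probability p = 0.6 independently per example; p and the sample sizes are illustrative assumptions, not values from the paper:

```python
import numpy as np

def majority_vote_accuracy(n_learners, p=0.6, n_examples=20000, seed=0):
    """Accuracy of a majority vote over independent weak learners,
    each correct with probability p on every example."""
    rng = np.random.default_rng(seed)
    # correct[k, j] is True if learner k classifies example j correctly
    correct = rng.random((n_learners, n_examples)) < p
    votes = correct.sum(axis=0)
    return np.mean(votes > n_learners / 2)  # strict majority is right

acc_1 = majority_vote_accuracy(1)    # a single weak learner, ~0.6
acc_25 = majority_vote_accuracy(25)  # 25 uncorrelated weak learners

# With independent (uncorrelated) weak learners, majority-vote accuracy
# grows with ensemble size, as in Condorcet's jury theorem.
assert acc_25 > acc_1
```

Correlated learners break this guarantee, which is why the abstract's caveat that bagged and boosted learners are "not so weak" (and not uncorrelated) matters in practice.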
In view of the defects in modelling thermocouple characteristics using a BP neural network (BPNN), such as lower precision and unstable, varying output (after repeated training, the output may differ markedly), a model of thermocouple characteristics based on a Generalized Regression Neural Network (GRNN) is established. The paper gives the process of model building for the Ni-Cr Constantan thermocouple characteristic...
It is often the case that learned neural networks end up with different decision boundaries under variations of training data, learning algorithms, architectures, and initial random weights. Such variations are helpful in designing neural network ensembles, but harmful in causing unstable performance, i.e., large variances among different learnings. This paper discusses how to reduce such variances...