Full-batch update and mini-batch update are the two most widely used weight-update schemes in back-propagation (BP) neural networks for coping with the large training time and computation cost of the learning process. Parallel computing can improve computational efficiency, and both algorithms have previously been implemented on the MapReduce framework. In this paper, we implement these two algorithms on the Spark framework and evaluate...
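The difference between the two update schemes can be sketched with a toy linear model (the NumPy functions and data below are our illustration, not the paper's Spark implementation): full-batch update averages the gradient over all samples before touching the weights, while mini-batch update changes the weights after every chunk.

```python
import numpy as np

def full_batch_step(w, X, y, lr=0.1):
    """One full-batch update: gradient averaged over ALL samples."""
    grad = X.T @ (X @ w - y) / len(y)   # MSE gradient of a linear model
    return w - lr * grad

def mini_batch_epoch(w, X, y, lr=0.1, batch=2):
    """One epoch of mini-batch updates: weights change after each chunk."""
    for i in range(0, len(y), batch):
        Xb, yb = X[i:i+batch], y[i:i+batch]
        w = w - lr * Xb.T @ (Xb @ w - yb) / len(yb)
    return w

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = X @ np.array([2.0, -1.0])           # targets from true weights [2, -1]
w = np.zeros(2)
for _ in range(200):
    w = mini_batch_epoch(w, X, y)
print(np.round(w, 3))                   # converges near the true weights
```

Both schemes reach the same solution here; the trade-off the paper studies is how often the weights (and, in a distributed setting, the communication) are updated per pass over the data.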
This paper proposes a current-balance control for parallel operation using neural-network control in addition to the conventional digital soft-start control for dc-dc converters. The neural-network predictor is used in the average current-balance control to reduce the voltage drop when current-balance operation starts. In the proposed method, the neural network is trained to predict the output...
As an advanced artificial intelligence technique, the error back-propagation (BP) neural network algorithm has been widely applied in electronics, communications, automation and other fields. However, the traditional BP neural network algorithm has disadvantages, such as a tendency to become stuck in local optima and slow convergence, which greatly affect processing performance and also limit...
The neural network is an important tool for many pattern recognition, prediction and function approximation tasks. Three open-source Java neural network frameworks, Encog v2.4, Neuroph v2.4 and JOONE 2.0 (Java Object Oriented Neural Engine), are considered here for an experimental evaluation. The performance evaluation is carried out by training a feed-forward neural network to recognize the XOR...
Currently, optical character recognition (OCR) is applied in many fields, such as reading office letters and reading serial numbers on industrial parts. Most manufacturers focus on the processing time and accuracy of the inspection process. The learning method of the OCR system uses a neural network to recognize fonts and correlate matching values. The neural network...
Many methods have been developed to improve the convergence and generalization ability of neural networks. The Multilayer Perceptron (MLP) uses a multi-hidden-layer structure for better performance. But in these structures, the error from any output class still propagates in the backward direction, which has a negative impact on weight updating as well as overall performance, because every...
Presented in this paper is an extended version of the Multi-ADAptive LINear Element (MADALINE) neural network, termed EMADALINE, for online system identification of Multi-Input Multi-Output (MIMO) linear time-varying (LTV) systems, trained by the Levenberg-Marquardt method. A sliding window over the data set is used in the learning algorithm to improve convergence speed during training...
The artificial neural network (ANN) is among the most widely used methods in data processing applications. The memristor-based neural network further offers a power-efficient hardware realization of the ANN. The training phase is the critical operation of a memristor-based neural network. However, the traditional training method for memristor-based neural networks is time-consuming and energy-inefficient...
In the human brain, neurons are excited in a dynamic way. The responses of different neurons vary widely because of the variation of the electrical signal in each neuron. Backpropagation (BP) is a training algorithm in which the Neural Network (NN) learns with a constant learning rate (LR). But to replicate human brain function, the learning rate should change as the excitation of...
This paper presents a sliding-window version of an online identification method for linear time-varying systems, based on the ADaptive LINear Element (ADALINE) neural network (Widrow and Lehr, 1990) trained with the Levenberg-Marquardt method, which offers faster tracking of system parameter changes. It is well known that ADALINE is slow to converge, which is inappropriate for online application and identification...
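For a purely linear model such as ADALINE, a Levenberg-Marquardt step over a sliding data window reduces to damped least squares on the most recent samples. The sketch below (window width, damping factor, and the toy data are illustrative assumptions, not the paper's settings) shows how windowing lets the estimate track a parameter change mid-stream:

```python
import numpy as np

def sliding_window_estimate(X, y, width=20, mu=1e-3):
    """Damped least squares (LM-style) over the most recent `width` samples."""
    Xw, yw = X[-width:], y[-width:]
    A = Xw.T @ Xw + mu * np.eye(X.shape[1])   # damping keeps A invertible
    return np.linalg.solve(A, Xw.T @ yw)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
# True parameters jump at t = 100: [1, -2] -> [3, 0.5]
true_w = np.where(np.arange(200)[:, None] < 100, [1.0, -2.0], [3.0, 0.5])
y = np.sum(X * true_w, axis=1)

w_hat = sliding_window_estimate(X, y)     # uses only the last 20 samples
print(np.round(w_hat, 3))                 # tracks the post-change parameters
```

Because only the most recent samples enter the normal equations, stale pre-change data cannot drag the estimate back toward the old parameters, which is the tracking advantage the sliding window provides.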
Increased demand for higher-capacity storage solutions has pushed Hard Disk Drive (HDD) technological boundaries. As the Perpendicular Magnetic Recording (PMR) head shows a promising increase in areal density beyond the limit of Longitudinal Magnetic Recording, HDD companies have switched to 100% PMR drives. PMR heads require tight physical specifications when fabricating the Writer Element in order...
The purpose of this paper is to show the performance characteristics of a reference-modification-control dc-dc converter that uses neural-network and model controls. In the presented method, the neural network controller is used to modify the reference in the proportional term of the conventional PID control. The neural network controller is repeatedly trained using previously predicted data to...
The artificial bee colony algorithm is a novel simulated evolutionary algorithm with positive feedback, distributed computation and a constructive greedy heuristic for convergence. Back-propagation is a training algorithm for feed-forward neural networks widely used in many areas, but it has shortcomings, such as low-precision solutions, slow search speed and easy convergence...
The traditional BP neural network training method processes the training dataset serially on one machine, so its efficiency is quite low. The massive data that need to be explored bring great challenges for BP neural networks. The traditional serial training method of the BP neural network encounters many problems, such as costing too much time and having insufficient memory to finish the training process...
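The standard remedy for serial training, used by the MapReduce-style approaches described above, is data parallelism: each worker computes a partial gradient on its own data shard, and summing the shards' gradients reproduces the full-batch gradient exactly. A minimal single-process sketch (the linear model and shard count are our illustrative choices):

```python
import numpy as np

def gradient(w, X, y):
    """MSE gradient of a linear model on one data shard (the 'map' step)."""
    return X.T @ (X @ w - y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = rng.normal(size=100)
w = rng.normal(size=3)

# 'Map': per-shard partial gradients; 'Reduce': sum them back together.
shards = np.array_split(np.arange(100), 4)
partial = sum(gradient(w, X[s], y[s]) for s in shards)
full = gradient(w, X, y)
print(np.allclose(partial, full))   # the shard sum equals the full gradient
```

Because the reduce step is a plain sum, the shards can live on different machines and only the small gradient vectors need to travel over the network.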
The backpropagation algorithm, based on the Multilayer Perceptron, is widely used to solve many real-world problems. However, its main disadvantages are that its convergence rate is relatively slow and it is often trapped in local minima. To solve this problem, the literature reports that an evolutionary algorithm such as the Particle Swarm Optimization algorithm can be applied...
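PSO sidesteps the gradient entirely: each particle is a complete weight vector, moved by inertia plus attraction toward its personal best and the swarm's global best. The sketch below applies standard PSO to a single linear neuron as a minimal stand-in for a network's error surface (all hyperparameters and the toy data are illustrative assumptions; training an MLP would only change the `loss` function):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data from a single linear neuron y = w.x + b, true params [1.5, -0.5, 0.3]
X = rng.normal(size=(50, 2))
y = X @ np.array([1.5, -0.5]) + 0.3

def loss(p):
    """MSE of the neuron with parameters p = [w1, w2, b] -- the 'fitness'."""
    return np.mean((X @ p[:2] + p[2] - y) ** 2)

# Standard PSO: inertia + cognitive + social terms (no gradients anywhere)
n, dim, iters = 40, 3, 500
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_val = np.array([loss(p) for p in pos])
g = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.72 * vel + 1.49 * r1 * (pbest - pos) + 1.49 * r2 * (g - pos)
    pos += vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    g = pbest[np.argmin(pbest_val)].copy()

print(np.round(g, 3))   # best parameters found by the swarm
```

Since no derivative is ever taken, the swarm cannot follow a gradient into a shallow local minimum, which is the motivation for hybridizing PSO with backpropagation.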
This paper presents a case study on the impact of using reduced-precision arithmetic on learning in Restricted Boltzmann Machine (RBM) deep belief networks. FPGAs provide a hardware-accelerator framework to speed up many algorithms, including the learning and recognition tasks of ever-growing neural network topologies and problem complexities. Current FPGAs include DSP blocks, hard blocks that allow...
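Reduced-precision effects of the kind studied here can be emulated in software by snapping each value to a fixed-point grid; a minimal sketch (the bit widths are illustrative, not the paper's DSP-block formats):

```python
def to_fixed_point(x, frac_bits=8):
    """Round x to a fixed-point grid with `frac_bits` fractional bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

# More fractional bits -> a finer grid -> smaller quantization error
print(to_fixed_point(0.123456, 8))    # 8 bits: snaps to 32/256 = 0.125
print(to_fixed_point(0.123456, 16))   # 16 bits: much closer to the input
```

Emulating the grid like this lets one measure how much quantization error the learning rule tolerates before committing to a hardware precision.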
The Back Propagation Neural Network (BPNN) has been widely used in object recognition, but in practice the BPNN is easily trapped in a local minimum and converges slowly. Moreover, the number of neurons in the hidden layer of the BPNN is hard to determine. For this reason, this paper proposes a novel method to improve performance in terms of both the structure and the algorithm. The improved BP algorithm...
This paper presents an online system identification method for linear time-varying systems whose parameters change with time. The method is based on an improved generalized ADAptive LINear Element (ADALINE) neural network. It is well known that ADALINE is slow to converge, which is inappropriate for online application and identification of time-varying systems. To speed up convergence of learning and...
This paper uses a generalized congruence function in place of the transfer function of the classical BP neural network to improve the network's convergence rate. We introduce piecewise generalized differentiation and the error back-propagation mechanism of the classical BP algorithm to adjust the weight vector in the generalized congruence neural network, modify the generalized congruence neural network, and then...
The local minimum is an inherent problem in neural network (NN) training. To alleviate this problem, a modification of the standard backpropagation (BP) algorithm, called BPCL, is proposed for training NNs. When a local minimum arises during training, the weights of the NN become idle. If a chaotic variation of the learning rate (LR) is included during training, the weight update may be accelerated in the local minimum...
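The chaotic learning-rate idea can be illustrated with a logistic-map schedule: instead of a constant LR, the rate keeps jumping chaotically, which can help kick idle weights out of a flat region. The base rate and map constants below are our illustrative choices, not the paper's actual BPCL settings:

```python
# Chaotic learning-rate schedule: the logistic map x <- 4x(1-x) is chaotic
# for most starting points in (0, 1), so the LR never settles to one value.
def chaotic_lr(base_lr=0.5, x0=0.7):
    x = x0
    while True:
        x = 4.0 * x * (1.0 - x)      # logistic map at r = 4 (chaotic regime)
        yield base_lr * x            # LR varies within (0, base_lr)

gen = chaotic_lr()
rates = [next(gen) for _ in range(5)]
print(rates)                         # five distinct rates, none repeating
```

In a BP loop, each weight update would simply draw its LR from this generator instead of using a fixed constant, so the step size occasionally spikes even when the gradient is small.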