In this paper, we study the performance of different classifiers on the CIFAR-10 dataset and build an ensemble of classifiers to reach better performance. We show that, on CIFAR-10, K-Nearest Neighbors (KNN) and a Convolutional Neural Network (CNN) make mutually exclusive errors on some classes, and thus yield higher accuracy when combined. We reduce KNN overfitting using Principal Component Analysis (PCA),...
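The per-class combination idea in this abstract can be sketched as follows. The function and its inputs (per-class scores from each model, and a set of classes where KNN is the stronger model) are illustrative assumptions, not the paper's actual interface:

```python
def combine(knn_probs, cnn_probs, knn_classes):
    """Toy per-class ensemble: trust the KNN score on the classes where
    KNN is assumed stronger, otherwise trust the CNN score.
    All names and inputs here are hypothetical, for illustration only."""
    return {
        c: knn_probs[c] if c in knn_classes else cnn_probs[c]
        for c in cnn_probs
    }

knn = {"cat": 0.7, "ship": 0.1}
cnn = {"cat": 0.4, "ship": 0.9}
print(combine(knn, cnn, knn_classes={"cat"}))  # → {'cat': 0.7, 'ship': 0.9}
```

A real ensemble would typically learn which model to trust per class from validation accuracy rather than hard-coding it.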
We develop a new non-parametric hierarchical information theoretic clustering algorithm based on implicit estimation of cluster densities using k-nearest neighbors (k-nn). Compared to a kernel-based procedure, our k-nn approach is very robust with respect to the parameter choices, with a key ability to detect clusters of vastly different scales. Of particular importance is the use of two different...
This paper describes a novel approach to pattern classification that combines Parzen window and support vector machines. Pattern classification is usually performed in universes where all possible categories are defined. Most of the current supervised learning classification techniques do not account for undefined categories. In a universe that is only partially defined, there may be objects that...
This paper presents an alternative to traditional impedance-based fault location methods, using a simple learning technique called k-Nearest Neighbors (k-NN), which addresses the multiple-estimation problem in addition to estimating the fault location distance. The approach uses only the single-end voltage and current measurements available at the power substation. As its principal advantage,...
The k-nearest neighbor (kNN) search problem is widely used in domains and applications such as classification, statistics, and biology. In this paper, we propose two fast GPU-based implementations of the brute-force kNN search algorithm using the CUDA and CUBLAS APIs. We show that our CUDA and CUBLAS implementations are, respectively, up to 64X and 189X faster on synthetic data than the highly optimized...
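The brute-force kernel these GPU implementations parallelize can be sketched in a few lines of plain CPU code (a minimal sketch for reference, not the paper's CUDA code):

```python
import math

def knn_brute_force(queries, refs, k):
    """For each query point, return the indices of its k nearest
    reference points under Euclidean distance. A CUDA version
    computes the full query-reference distance matrix in parallel."""
    result = []
    for q in queries:
        dists = sorted((math.dist(q, r), i) for i, r in enumerate(refs))
        result.append([i for _, i in dists[:k]])
    return result

refs = [(0, 0), (1, 0), (0, 1), (5, 5)]
print(knn_brute_force([(0.2, 0.1)], refs, k=2))  # → [[0, 1]]
```

The brute-force approach is exact and trivially parallel, which is why it maps well onto GPUs despite its quadratic cost.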
A large vertical load means the aircraft is loaded above normal, which may endanger flight safety. It is therefore crucial to research how to choose suitable models and methods to identify whether an aircraft experienced a large vertical load before landing, and to adjust immediately. The MWW nonparametric test is used to extract the distinctive features. Then the most commonly used models are compared...
Supervised Learning (SL) is a machine learning research area that aims to develop techniques able to take advantage of labeled training samples to make decisions on unseen examples. Recently, many tools have been presented in order to perform machine learning in a more straightforward and transparent manner. However, one problem that is increasingly present in most of the SL problems being...
While advances in sensor and signal processing techniques have provided effective tools for quantitative research on traditional Chinese pulse diagnosis (TCPD), the automatic classification of pulse waveforms remains a difficult problem. To address this issue, this paper proposes a novel edit distance with real penalty (ERP)-based k-nearest neighbors (KNN) classifier, referring to recent progress...
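The ERP distance named in this abstract is a standard elastic measure for numeric sequences: it aligns the two sequences like edit distance, but penalizes a gap by the distance to a constant reference value g. A minimal dynamic-programming sketch (the abstract's classifier would plug this distance into KNN; the waveform preprocessing is not shown):

```python
def erp(s, t, g=0.0):
    """Edit distance with real penalty (ERP) between two numeric
    sequences. Gaps cost the distance to the constant g, which keeps
    ERP a metric, unlike DTW."""
    n, m = len(s), len(t)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = d[i - 1][0] + abs(s[i - 1] - g)
    for j in range(1, m + 1):
        d[0][j] = d[0][j - 1] + abs(t[j - 1] - g)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(
                d[i - 1][j - 1] + abs(s[i - 1] - t[j - 1]),  # match
                d[i - 1][j] + abs(s[i - 1] - g),             # gap in t
                d[i][j - 1] + abs(t[j - 1] - g),             # gap in s
            )
    return d[n][m]

print(erp([1.0, 2.0, 3.0], [1.0, 3.0]))  # → 2.0 (the dropped 2.0 costs |2 - 0|)
```

Because ERP satisfies the triangle inequality, a KNN classifier built on it can also use metric-based pruning to speed up neighbor search.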
This paper proposes an efficient training strategy for one-class support vector machines. The strategy exploits the fact that a trained one-class SVM uses as support vectors only points residing on the exterior region of the data distribution. The proposed training set reduction method therefore selects the so-called extreme points, which sit on the boundary of the data distribution, through local geometry...
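One simple local-geometry heuristic in the spirit of this abstract (an illustrative sketch, not necessarily the paper's exact criterion): if a point's nearest neighbors lie mostly on one side of it, the mean direction to those neighbors has a large norm, so the point is likely on the boundary rather than in the interior.

```python
import math

def extreme_points(points, k=3, threshold=0.5):
    """Flag candidate boundary points of a 2-D point set: interior
    points are surrounded by neighbors in all directions, so the mean
    unit vector toward their k nearest neighbors is short; boundary
    points have a long one. Heuristic for illustration only."""
    flagged = []
    for i, p in enumerate(points):
        neigh = sorted(
            (q for j, q in enumerate(points) if j != i),
            key=lambda q: math.dist(p, q),
        )[:k]
        dirs = []
        for q in neigh:
            d = math.dist(p, q)
            dirs.append(((q[0] - p[0]) / d, (q[1] - p[1]) / d))
        mx = sum(v[0] for v in dirs) / k
        my = sum(v[1] for v in dirs) / k
        if math.hypot(mx, my) > threshold:
            flagged.append(i)
    return flagged

grid = [(x, y) for y in range(3) for x in range(3)]
print(extreme_points(grid))  # → [0, 2, 6, 8], the corners of the 3x3 grid
```

Training the one-class SVM only on points flagged this way shrinks the training set while keeping the candidates that could become support vectors.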