Prototype classifiers trained with a multi-class classification objective are inferior in pattern retrieval and outlier rejection. To improve the binary classification (detection, verification, retrieval, outlier rejection) performance of prototype classifiers, we propose a one-vs-all training method, which enriches each prototype as a binary discriminant function with a local threshold, and optimizes...
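The core idea — each prototype acting as a binary discriminant with its own threshold — can be sketched generically as follows. This is an illustrative reading of the abstract, not the paper's exact training procedure; the prototype/threshold values would be learned by the proposed one-vs-all optimization.

```python
import numpy as np

def ova_prototype_scores(x, prototypes, thresholds):
    """Each prototype m_k with local threshold t_k defines a binary
    discriminant: score_k(x) = t_k - ||x - m_k||^2.
    A positive score accepts the prototype's class for x."""
    d2 = ((prototypes - x) ** 2).sum(axis=1)
    return thresholds - d2

def classify_or_reject(x, prototypes, thresholds):
    """Assign x to the best-scoring prototype, or reject it as an
    outlier when no discriminant fires (all scores <= 0)."""
    s = ova_prototype_scores(x, prototypes, thresholds)
    k = int(np.argmax(s))
    return k if s[k] > 0 else None  # None = outlier rejection
```

The local thresholds are what give the classifier its rejection capability: a plain nearest-prototype rule always returns some class, while here a sample far from every prototype is rejected.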
Cost-sensitive learning is of critical importance in many domains, including bankruptcy prediction, where the costs of different errors are unequal. Most existing classification methods aim to minimize overall error based on the assumption that the costs are equal. This paper presents three cost-sensitive learning vector quantization (LVQ) approaches to incorporate a cost matrix into classification. Experimental...
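One simple way to make LVQ cost-sensitive is to scale the prototype update by the relevant cost-matrix entry, so costly misclassifications push prototypes harder. The sketch below shows a single LVQ1-style update with this hypothetical weighting; it is a generic illustration, not one of the paper's three specific approaches.

```python
import numpy as np

def cost_sensitive_lvq1_step(x, y, protos, proto_labels, cost, lr=0.1):
    """One LVQ1 update where the repulsion step is scaled by
    cost[true_class, predicted_class] (illustrative weighting scheme)."""
    d = np.linalg.norm(protos - x, axis=1)
    w = int(np.argmin(d))                      # winning prototype
    if proto_labels[w] == y:
        protos[w] += lr * (x - protos[w])      # attract correct winner
    else:
        c = cost[y, proto_labels[w]]           # cost of this error type
        protos[w] -= lr * c * (x - protos[w])  # repel harder for costly errors
    return protos
```

With an asymmetric cost matrix (e.g. missing a bankrupt firm costs far more than a false alarm), the decision boundary is pushed away from the expensive-error region.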
Our open source real-time recognition engine for online isolated handwritten characters is a 3-nearest neighbor classifier that uses approximate dynamic time warping comparisons with a set of prototypes filtered by two fast distance-based methods. This engine achieved excellent classification rates on two writer-independent tasks: UJIpenchars and Pendigits. We present the integration of multilayer...
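The engine's core comparison — nearest-neighbor voting over dynamic time warping distances between point sequences — can be sketched as below. This is a minimal textbook version; the engine itself uses an approximate DTW and two fast distance-based prefilters, which are omitted here.

```python
import math

def dtw(a, b):
    """Dynamic time warping distance between two sequences of 2-D points,
    via the standard O(n*m) dynamic program."""
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def knn_dtw(query, prototypes, labels, k=3):
    """k-NN majority vote over DTW distances to the prototype set."""
    dists = sorted((dtw(query, p), l) for p, l in zip(prototypes, labels))
    top = [l for _, l in dists[:k]]
    return max(set(top), key=top.count)
```

In practice the prefilters matter: computing full DTW against every prototype is too slow for real-time recognition, so cheap lower-bound distances are used to discard most prototypes first.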
The k-nearest neighbor classification rule (k-NNR) is among the most popular and successful pattern classification techniques. However, it is sensitive to outliers and performs poorly when the training sample is small. In this paper, a variant of the k-NNR, an extended nearest neighbor classification based on the local mean vector and the class mean vector, is proposed...
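A local-mean rule of this kind can be sketched as follows: for each class, average the k nearest training samples of that class and assign the query to the class whose local mean is closest. This is the standard local-mean variant for reference; the paper's method additionally incorporates the class mean vector, which is not reproduced here.

```python
import numpy as np

def local_mean_knn(x, X, y, k=3):
    """Classify x by the nearest per-class local mean: averaging each
    class's k nearest samples damps the influence of individual outliers."""
    best_label, best_dist = None, np.inf
    for c in np.unique(y):
        Xc = X[y == c]
        d = np.linalg.norm(Xc - x, axis=1)
        idx = np.argsort(d)[:k]               # k nearest samples of class c
        dist_c = np.linalg.norm(Xc[idx].mean(axis=0) - x)
        if dist_c < best_dist:
            best_label, best_dist = c, dist_c
    return best_label
```

Because a single mislabeled or outlying neighbor is averaged with k-1 others, the local mean is more stable than a raw nearest-neighbor vote when training data are scarce.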