A deep neural network (DNN) that uses Rectified Linear Units (ReLUs) as its activation function is called a deep rectified network (DRN). In this paper, we show that its parameters can be seen to play two important roles simultaneously: one determining the subnetworks corresponding to the inputs, and the other serving as the parameters of those subnetworks. This observation leads us to propose...
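The "two roles" view can be illustrated with a minimal sketch: for any fixed input, the pattern of active ReLU units selects a subnetwork, and that subnetwork computes a purely linear function of the input. The weights below are hypothetical random values, used only to demonstrate the identity; this is not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-layer ReLU network with hypothetical random weights.
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

def forward(x):
    h = np.maximum(W1 @ x + b1, 0.0)   # ReLU hidden layer
    return W2 @ h + b2

def active_pattern(x):
    return (W1 @ x + b1) > 0           # which hidden units the input "selects"

x = rng.normal(size=3)
mask = active_pattern(x)

# The selected subnetwork is linear: masking inactive units yields an
# effective linear map (W_eff, b_eff) that reproduces forward(x) exactly.
W_eff = W2 @ (W1 * mask[:, None])
b_eff = W2 @ (b1 * mask) + b2

assert np.allclose(forward(x), W_eff @ x + b_eff)
```

The assertion holds for any input whose activation pattern matches `mask`, which is the sense in which the weights both pick the subnetwork and parameterize it.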
In this paper, a novel fast support vector machine (SVM) method combined with deep quasi-linear kernel (DQLK) learning is proposed for large-scale image classification. This method can train an SVM on large-scale datasets quickly, using less memory and less training time. Since SVM classifiers are constructed from support vectors (SVs) that lie close to the separation boundary, removing the other...
A mixture of multiple linear classifiers is well known for its efficiency and effectiveness in tackling nonlinear classification problems. Each classifier consists of a linear function multiplied by a gating function, which restricts the classifier to a local region. Previous research has mainly focused on the partitioning of local regions, since its quality directly determines the performance of...
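A common form of such a mixture combines each local linear score with a soft gate that peaks near its region. The sketch below uses softmax gates over distances to hypothetical region prototypes; the specific gating function and prototypes are illustrative assumptions, not the formulation studied here.

```python
import numpy as np

rng = np.random.default_rng(1)
K, d = 3, 2                          # hypothetical: 3 local regions, 2-D input

W = rng.normal(size=(K, d))          # one linear classifier per region
b = rng.normal(size=K)
centers = rng.normal(size=(K, d))    # illustrative region prototypes for the gates

def gates(x, tau=1.0):
    # Softmax over negative squared distances: gate k peaks near centers[k],
    # so each linear classifier dominates only in its local region.
    logits = -np.sum((centers - x) ** 2, axis=1) / tau
    e = np.exp(logits - logits.max())
    return e / e.sum()

def decision(x):
    # Mixture output: each local linear score weighted by its gating value.
    return float(gates(x) @ (W @ x + b))

score = decision(np.array([0.5, -0.2]))
```

Because the gates sum to one, the decision function is a smooth interpolation of the local linear boundaries.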
For many problems in machine learning, the data are nonlinearly distributed. One popular way to tackle such data is to train a local kernel machine or a mixture of several locally linear models. However, both of these approaches rely heavily on local information, such as the neighbor relations of each data sample, to capture the underlying data distribution. In this paper, we show the non-local...
In this paper, we introduce a data-dependent kernel called the deep quasi-linear kernel, which can directly benefit from a pre-trained feed-forward deep network. First, a multi-layer gated bilinear classifier is formulated to mimic the functionality of a feed-forward neural network. The only difference between them is that the activation values of hidden units in the multi-layer gated bilinear...
Kernel-based machine learning algorithms have been extensively studied over the past decades, with successful applications in a variety of real-world tasks. In this paper, we formulate a kernel-level composition method that embeds multiple local classifiers (kernels) into one kernel function, so as to obtain a more flexible data-dependent kernel. Since such composite kernels are composed of...
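One standard fact that makes kernel composition work is that a non-negative weighted sum of valid kernels is itself a valid (positive semi-definite) kernel. The sketch below checks this numerically for a composite of RBF kernels; the base kernels, bandwidths, and weights are illustrative assumptions rather than the paper's composition method.

```python
import numpy as np

def rbf(X, Y, gamma):
    # Gaussian RBF kernel matrix between row sets X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def composite_kernel(X, Y, gammas, weights):
    # Non-negative weighted sum of valid kernels is itself a valid kernel.
    K = np.zeros((len(X), len(Y)))
    for g, w in zip(gammas, weights):
        K += w * rbf(X, Y, g)
    return K

rng = np.random.default_rng(2)
X = rng.normal(size=(10, 3))
K = composite_kernel(X, X, gammas=[0.1, 1.0, 10.0], weights=[0.5, 0.3, 0.2])

# PSD check: all eigenvalues of the Gram matrix are (numerically) non-negative.
assert np.linalg.eigvalsh(K).min() > -1e-8
```

A composite kernel of this kind can be passed to any kernel machine (e.g. an SVM) in place of a single fixed kernel.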
In multi-label classification, implicit constraints and dependencies often exist among labels. Exploring the correlation information among different labels is important for many applications: it not only enhances classifier performance but also helps interpret the classification results for some specific applications. This paper presents an improved multi-label classification...
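A simple way to see such label dependencies is to estimate empirical co-occurrence from a binary label matrix. The toy data below is hypothetical and only demonstrates the kind of correlation statistic involved, not the method proposed here.

```python
import numpy as np

# Hypothetical binary label matrix: rows = samples, columns = labels.
Y = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [1, 1, 0]])

# Pairwise co-occurrence counts: counts[i, j] = number of samples
# carrying both label i and label j.
counts = Y.T @ Y

# Conditional frequency: cond[i, j] estimates P(label j | label i).
cond = counts / np.maximum(counts.diagonal()[:, None], 1)
```

Here, for example, `cond[0, 1]` says how often label 1 appears among samples that carry label 0, exactly the kind of dependency a correlation-aware classifier can exploit.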
This paper proposes a geometric way to construct a quasi-linear kernel, with which a quasi-linear support vector machine (SVM) is trained. A quasi-linear SVM is an SVM with a quasi-linear kernel, in which the nonlinear separation boundary is approximated by multiple local linear boundaries with interpolation. However, the local linearity extraction for the composition of the quasi-linear kernel is still...
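One simplified form of a quasi-linear kernel is a gated sum of local linear kernels: each local region contributes a linear kernel weighted by interpolating gates. The partition centers and gate shape below are illustrative assumptions, not the geometric construction this paper proposes.

```python
import numpy as np

rng = np.random.default_rng(3)
M, d = 4, 2                          # hypothetical: 4 local regions, 2-D input
centers = rng.normal(size=(M, d))    # illustrative local-partition centers

def g(x, lam=1.0):
    # Interpolating gate: smooth membership of x in each local region,
    # normalized so the gates sum to one.
    w = np.exp(-lam * np.sum((centers - x) ** 2, axis=1))
    return w / w.sum()

def quasi_linear_kernel(x, z):
    # Gated sum of local linear kernels.  It corresponds to the explicit
    # feature map phi(x) = [g_k(x) * (x, 1)]_{k=1..M}, so it is a valid kernel.
    return float(g(x) @ g(z) * (x @ z + 1.0))

# Symmetry and positive semi-definiteness of the resulting Gram matrix.
X = rng.normal(size=(8, d))
K = np.array([[quasi_linear_kernel(xi, xj) for xj in X] for xi in X])
assert np.linalg.eigvalsh(K).min() > -1e-8
```

With hard (0/1) gates the boundary is piecewise linear; smooth gates interpolate between the local linear boundaries, which is the quasi-linear behavior described above.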