A deep neural network (DNN) is called a deep rectified network (DRN) if it uses Rectified Linear Units (ReLUs) as its activation function. In this paper, we show that its parameters can be seen to play two important roles simultaneously: one determining the subnetwork corresponding to each input, and the other serving as the parameters of that subnetwork. This observation leads us to propose...
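The two-roles view above can be illustrated concretely: for a fixed input, the ReLU activation pattern selects a linear subnetwork, and the network's output equals that subnetwork's linear map applied to the input. The following is a minimal sketch with illustrative random weights (the layer sizes and variable names are assumptions, not the paper's notation):

```python
import numpy as np

# A tiny one-hidden-layer ReLU network with illustrative random parameters.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

def forward(x):
    h = np.maximum(W1 @ x + b1, 0.0)        # ReLU hidden layer
    return W2 @ h + b2

x = rng.normal(size=3)
mask = (W1 @ x + b1 > 0).astype(float)      # active-unit pattern: the "subnetwork" chosen by x

# Effective linear subnetwork: gate the first layer by the fixed mask.
W_eff = W2 @ (mask[:, None] * W1)           # effective weight matrix for this region
b_eff = W2 @ (mask * b1) + b2               # effective bias for this region

# For this input, the full network and its selected linear subnetwork agree.
assert np.allclose(forward(x), W_eff @ x + b_eff)
```

The same weights thus both pick the subnetwork (through the sign pattern of the pre-activations) and parameterize it (through the masked linear map).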
In this paper, a novel fast support vector machine (SVM) method combined with deep quasi-linear kernel (DQLK) learning is proposed for large-scale image classification. This method can train an SVM on large-scale datasets quickly, using less memory and less training time. Since SVM classifiers are constructed from support vectors (SVs) that lie close to the separation boundary, removing the other...
A mixture of multiple linear classifiers is well known for its efficiency and effectiveness in tackling nonlinear classification problems. Each classifier contains a linear function multiplied by a gate function, which restricts the corresponding classifier to a local region. Previous research mainly focuses on the partitioning of local regions, since its quality directly determines the performance of...
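The gated construction described above can be sketched in a few lines. This is an illustrative toy model, not the paper's exact formulation: it assumes a softmax gate over K experts, with all parameters drawn at random for demonstration:

```python
import numpy as np

# Toy mixture of K linear classifiers with softmax gating (illustrative sketch).
rng = np.random.default_rng(1)
K, d = 3, 2
W = rng.normal(size=(K, d))      # linear classifier weights, one row per expert
b = rng.normal(size=K)           # per-expert biases
V = rng.normal(size=(K, d))      # gate parameters (softmax gating is an assumption)
c = rng.normal(size=K)

def predict(x):
    scores = V @ x + c
    gates = np.exp(scores - scores.max())
    gates /= gates.sum()                   # softmax gate: soft assignment to local regions
    return gates @ (W @ x + b)             # gated sum of the local linear outputs

y = predict(np.array([0.5, -1.0]))         # scalar decision value; its sign gives the class
```

Because each gate concentrates on a region of the input space, the overall decision function is nonlinear even though every expert is linear.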
For many problems in machine learning, the data are nonlinearly distributed. One popular way to tackle such data is to train a local kernel machine or a mixture of several locally linear models. However, both of these approaches rely heavily on local information, such as the neighbor relations of each data sample, to capture the underlying data distribution. In this paper, we show the non-local...
In this paper, we introduce a data-dependent kernel called the deep quasi-linear kernel, which can directly benefit from a pre-trained feedforward deep network. First, a multi-layer gated bilinear classifier is formulated to mimic the functionality of a feedforward neural network. The only difference between them is that the activation values of hidden units in the multi-layer gated bilinear...
Kernel-function-based machine learning algorithms have been studied extensively over the past decades, with successful applications in a variety of real-world tasks. In this paper, we formulate a kernel-level composition method to embed multiple local classifiers (kernels) into one kernel function, so as to obtain a more flexible data-dependent kernel. Since such composite kernels are composed of...
In multi-label classification, implicit constraints and dependencies always exist among labels. Exploring the correlations among different labels is important for many applications: it can not only enhance classifier performance but also help interpret the classification results in some specific applications. This paper presents an improved multi-label classification...
This paper proposes a geometric way to construct a quasi-linear kernel, with which a quasi-linear support vector machine (SVM) is trained. A quasi-linear SVM is an SVM with a quasi-linear kernel, in which the nonlinear separation boundary is approximated by multiple local linear boundaries with interpolation. However, the local linearity extraction for composing the quasi-linear kernel is still...
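One common form of such a kernel interpolates local linear kernels with normalized gate functions, e.g. K(x, z) = (x·z + 1) Σₖ gₖ(x) gₖ(z). The sketch below is a hedged illustration of that idea with Gaussian gates; the region centers, width, and normalization are assumptions chosen for demonstration, not the paper's construction:

```python
import numpy as np

# Illustrative quasi-linear kernel: a linear kernel interpolated across
# M assumed local regions by normalized Gaussian gate functions.
rng = np.random.default_rng(2)
centers = rng.normal(size=(3, 2))     # assumed local-region centers
width = 1.0                           # assumed gate width

def gates(x):
    g = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2 * width ** 2))
    return g / g.sum()                # normalized interpolation weights

def quasi_linear_kernel(x, z):
    # (x.z + 1) is the local linear kernel; the gate product localizes it.
    return (x @ z + 1.0) * (gates(x) @ gates(z))

x, z = rng.normal(size=2), rng.normal(size=2)
assert quasi_linear_kernel(x, z) == quasi_linear_kernel(z, x)   # symmetric, as a kernel must be
```

When the gates are nearly one-hot, the kernel reduces to a single local linear kernel; softer gates interpolate between neighboring local boundaries.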