This paper introduces a novel paradigm to impute missing data that combines a decision tree with an auto-associative neural network (AANN) based model and a principal component analysis-neural network (PCA-NN) based model. For each model, the decision tree is used to predict search bounds for a genetic algorithm that minimises an error function derived from the respective model. The models' ability...
PCA-guided k-Means performs non-hierarchical hard clustering based on a PCA-guided subspace learning mechanism in a batch process. Sequential Fuzzy Cluster Extraction (SFCE) is a procedure for analytically extracting fuzzy clusters one by one, and is useful for ignoring noise samples. This paper considers a hybrid concept of the two clustering approaches and proposes a new robust k-Means algorithm that...
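The hybrid algorithm itself is not reproduced in this excerpt, but the baseline it builds on (batch hard k-Means run in a PCA subspace) can be sketched as follows; the synthetic data and all parameters are illustrative, not the paper's:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two synthetic Gaussian clusters in 10 dimensions.
X = np.vstack([rng.normal(0, 1, (50, 10)),
               rng.normal(5, 1, (50, 10))])

# Project onto the leading principal subspace, then run batch k-Means there.
Z = PCA(n_components=2).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
print(labels.shape)  # (100,)
```

Clustering in the principal subspace rather than the full space is what "PCA-guided" refers to here; the paper's contribution lies in coupling this with SFCE-style noise rejection.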
ISOMAP is a manifold-learning-based algorithm for dimensionality reduction that has been successfully applied to data visualization. However, classical ISOMAP has a limitation: the algorithm is sensitive to noise, especially outliers. In this paper, an extended ISOMAP algorithm is therefore put forward to address this sensitivity. The proposed algorithm follows the method of classical...
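For reference, classical ISOMAP (the starting point of the extension described above) is available in scikit-learn; a minimal sketch on a standard swiss-roll dataset, with illustrative parameters:

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Classical ISOMAP embeds a noisy swiss roll into 2-D. Outliers in X
# would distort the neighborhood graph and hence the geodesic distance
# estimates -- the sensitivity that the extended algorithm targets.
X, _ = make_swiss_roll(n_samples=500, noise=0.1, random_state=0)
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(embedding.shape)  # (500, 2)
```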
In this paper, an approach for estimating the number of emitters from a set of interleaved pulse trains is proposed. The approach is based on an information-theoretic criterion, which is formulated using a new model of eigenvalues from principal component analysis (PCA) of pulse envelope vectors. In this model, the log-likelihood function is obtained by clustering the eigenvalues...
Computational intelligence and other data mining techniques are used for characterizing regional and time-varying climatic variations in Spain in the period 1901-2005. Daily maximum temperature data from 10 climatic stations are analyzed (with and without missing values) using principal components (PC), similarity-preservation feature generation, clustering, Kolmogorov-Smirnov dissimilarity analysis...
In this paper, an efficient feature extraction method named Constrained Maximum Variance Mapping (CMVM) is developed for dimensionality reduction. The proposed algorithm can be viewed as a linear approximation of a multi-manifold learning approach, which takes the local geometry and manifold labels into account. After the local scatters have been characterized, the proposed method focuses...
This paper proposes a method for the integration of multi-modal biometrics. The password system is the most widely used conventional authentication method, but the password mechanism has many issues. To solve these problems, biometric authentication methods are often used. However, authentication using biological characteristics, such as fingerprints, also has some problems. In this paper, we...
In this paper, we investigate the well-posedness of the kernel adaline. The kernel adaline finds the linear coefficients in a radial basis function network using deterministic gradient descent. We show that gradient descent provides an inherent regularization as long as the training is properly early-stopped. Along with other popular regularization techniques, this result is investigated in...
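The regularization-by-early-stopping effect described above can be illustrated with a minimal numpy sketch: deterministic gradient descent on the linear coefficients of an RBF network, keeping the iterate with the lowest validation error. The data, kernel width, and iteration budget are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Noisy 1-D regression problem split into train / validation.
X = rng.uniform(-3, 3, (80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]

K = rbf_kernel(Xtr, Xtr)
Kva = rbf_kernel(Xva, Xtr)
a = np.zeros(60)                          # linear coefficients of the RBF network
eta = 1.0 / np.linalg.norm(K, ord=2) ** 2  # safe step size for the squared error
best = (np.inf, a)
for t in range(2000):
    a -= eta * K.T @ (K @ a - ytr)        # gradient of 0.5 * ||K a - ytr||^2
    err = np.mean((Kva @ a - yva) ** 2)   # validation error of this iterate
    if err < best[0]:
        best = (err, a.copy())            # early stopping keeps this iterate
val_err, a = best
```

The point of the sketch is that the validation curve selects a partially converged iterate, which acts like an implicitly regularized solution of the (potentially ill-posed) kernel least-squares problem.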
Nonlinear principal component analysis (NLPCA) is one of the most progressive computational tools developed during the last two decades. However, in spite of its good performance in feature extraction and dimension reduction, it is considered a blind processor that cannot extract physical or meaningful factors from a dataset. This paper presents a new distributed model of autoassociative neural...
Vector data are normally used for probabilistic graphical models with Bayesian inference. However, tensor data, i.e., multidimensional arrays, are the natural representation of a large amount of real data in data mining, computer vision, and many other applications. Aiming to bridge the huge gap between vectors and tensors in conventional statistical tasks, e.g., automatic model selection,...
One of the Internet's hallmarks is the rapid spread of information and communication technology. This has boosted methods for hiding stego information inside digital cover images, which is a pressing issue in information security. On the other hand, attacks on steganographic schemes have leveraged methods for steganalysis, which is a challenging problem. In this paper, first we...
In this paper, we analyze the information features of principal component analysis (PCA) in depth based on information entropy. Following the idea of an entropy function, a new weighted information function (WIF) is proposed, and the information content of a data matrix X is measured by it. Based on the WIF, the information compression rate (ICR) and accumulated information compression rate (AICR) are...
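The WIF-based measures themselves are not spelled out in this excerpt. A closely related standard quantity, which an accumulated compression-rate curve resembles in spirit, is the cumulative explained-variance ratio of the PCA eigenvalues; a sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data matrix X with most variance in a few directions.
X = rng.normal(size=(200, 8)) @ np.diag([5, 3, 1, 0.5, 0.2, 0.1, 0.05, 0.01])
Xc = X - X.mean(axis=0)

# PCA eigenvalues = eigenvalues of the sample covariance matrix,
# sorted in descending order.
eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]

# Fraction of total variance retained by the first k components.
cumulative = np.cumsum(eigvals) / eigvals.sum()
print(np.round(cumulative, 3))
```

Here each entry of `cumulative` answers "how much information (variance) survives if the data are compressed to k components", which is the kind of question the ICR/AICR measures quantify under the entropy-based weighting instead.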
A new modified pseudo-Zernike moment feature, namely "spatial weighted pseudo-Zernike moments" (SWPZM), is proposed for face recognition in this paper. Since different facial regions play roles of different importance in face recognition, the new modified pseudo-Zernike feature is weighted with a weight function derived from the spatial information of the human face; hence the most important...
A few adaptive algorithms for generalized eigen-decomposition have been proposed; they are very useful in many applications such as digital mobile communications, blind signal separation, etc. These algorithms all focus on extracting principal generalized eigenvectors. However, in many practical applications such as dimension reduction and signal processing, extracting the minor generalized...
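The adaptive algorithms in question are not reproduced here, but the target quantity — the minor generalized eigenvector of a symmetric pair (A, B) with B positive-definite — can be computed batch-wise in numpy via a Cholesky reduction to a standard symmetric eigenproblem; the matrices below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Symmetric A and symmetric positive-definite B define the
# generalized eigenproblem  A v = lambda * B v.
M = rng.normal(size=(5, 5))
A = M + M.T
N = rng.normal(size=(5, 5))
B = N @ N.T + 5 * np.eye(5)

# Reduce to a standard symmetric problem via the Cholesky factor of B:
# with B = L L^T and u = L^T v, the problem becomes (L^-1 A L^-T) u = lambda u.
L = np.linalg.cholesky(B)
Linv = np.linalg.inv(L)
C = Linv @ A @ Linv.T
w, U = np.linalg.eigh(C)          # eigh returns eigenvalues in ascending order
V = Linv.T @ U

# Minor generalized eigenvector = the one for the smallest eigenvalue.
v_minor = V[:, 0]
residual = A @ v_minor - w[0] * (B @ v_minor)
print(np.max(np.abs(residual)))   # close to 0
```

Adaptive (sample-by-sample) schemes estimate the same vector without ever forming A and B explicitly, which is what makes them attractive in streaming signal-processing settings.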
Feature extraction has been widely used in sensor fault detection. Commonly used feature extraction methods such as PCA and MDS involve signal processing for linear time-invariant systems and are less effective in dealing with nonlinear systems. In this paper, we present how the Local Linear Embedding (LLE) concept is adopted to solve fault detection problems and how certain enhancements have...
Density estimation in high-dimensional data spaces is a challenge due to the sparseness of data, which is known as "the curse of dimensionality". Researchers often resort to low-dimensional subspaces for such tasks, while discarding the distribution in the complementary subspace. In this paper, we propose a new mixture density model based on a pooled subspace. In our method, the Gaussian components...
Sound-scapes are useful for understanding our surrounding environments in applications such as security, source tracking, or human-computer interaction. Sound-scape samples carrying accurate position or localisation information consist of many channels of high-dimensional acoustic data. In this paper we demonstrate how to obtain a visual representation of sound-scapes by applying dimensionality...
Artificial emotion study will be of utmost importance in future artificial intelligence research. In this paper, an emotion understanding system based on brain activity and "GIST" is proposed to categorize emotions evoked by natural scenes. Given the strong relationship between human emotion and brain activity, functional magnetic resonance imaging (fMRI) and electroencephalography...
In this paper, we propose a new statistical learning algorithm. This study quantitatively verifies the effectiveness of its feature extraction performance for face information processing. Simple-FLDA is an algorithm based on a geometrical analysis of Fisher linear discriminant analysis. As a high-speed feature extraction method, the algorithm presented in this paper is an improved version of Simple-FLDA...
In this article, based on the Markov approach proposed by Shi et al., we extend it to the inter-blocks of the DCT domain, calculate the difference of the extended Markov features between the test image and its calibrated version, and combine these difference features with polynomial fitting features on the histogram of the DCT coefficients as detectors. We improve the detection performance...