Estimation of distribution algorithms (EDAs) are stochastic optimization techniques that are based on building and sampling a probability model. Copula theory provides methods that simplify the estimation of a probability model. An island-based version of copula-based EDA with probabilistic model migration (mCEDA) was tested on a set of well-known standard optimization benchmarks in the continuous...
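The build-and-sample loop that defines an EDA can be illustrated with a minimal sketch. This is not the copula-based mCEDA of the abstract — just a plain univariate-Gaussian EDA on a standard continuous benchmark; the objective, population sizes and all other parameters are illustrative.

```python
import random
import statistics

def sphere(x):
    """Benchmark objective: sum of squares, minimized at the origin."""
    return sum(v * v for v in x)

def gaussian_eda(obj, dim=5, pop=100, elite=20, gens=60, seed=1):
    """Minimal continuous EDA: fit a per-dimension Gaussian model to the
    elite individuals, then sample the next population from that model."""
    rng = random.Random(seed)
    mu = [0.0] * dim
    sigma = [2.0] * dim
    best = None
    for _ in range(gens):
        popn = [[rng.gauss(mu[d], sigma[d]) for d in range(dim)]
                for _ in range(pop)]
        popn.sort(key=obj)
        if best is None or obj(popn[0]) < obj(best):
            best = popn[0]
        sel = popn[:elite]                       # truncation selection
        for d in range(dim):
            col = [ind[d] for ind in sel]
            mu[d] = statistics.fmean(col)        # re-estimate the model
            sigma[d] = max(statistics.pstdev(col), 1e-6)
    return best

best = gaussian_eda(sphere)
```

A copula-based EDA replaces the independent per-dimension Gaussians with a copula that captures dependencies between variables; the island-based mCEDA additionally migrates the probabilistic model itself between subpopulations.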
We develop a new efficient method for designing unimodular waveforms with good auto- and cross-correlation properties for multiple-input multiple-output (MIMO) radar. Our waveform design scheme is based on minimizing the integrated sidelobe level of the designed waveforms, which is formulated as a quartic non-convex optimization problem. We start by simplifying the quartic optimization...
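The integrated sidelobe level (ISL) that this design criterion minimizes can be computed directly from the aperiodic autocorrelation. A small sketch for the single-waveform case — the sequences below are illustrative stand-ins, not the optimized waveforms of the paper:

```python
import cmath
import math

def isl(seq):
    """Integrated sidelobe level: total energy of the aperiodic
    autocorrelation at all nonzero lags."""
    n = len(seq)
    total = 0.0
    for k in range(1, n):
        r = sum(seq[i] * seq[i + k].conjugate() for i in range(n - k))
        total += abs(r) ** 2
    return 2.0 * total  # lags +k and -k contribute equally

n = 32
flat = [1.0 + 0.0j] * n                                        # constant phase
chirp = [cmath.exp(1j * math.pi * k * k / n) for k in range(n)]  # quadratic phase
```

Both sequences are unimodular, but the quadratic-phase (chirp-like) one already has a far lower ISL than the constant-phase one; closing that kind of gap, jointly over multiple waveforms and their cross-correlations, is what the quartic optimization pursues.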
In room acoustics, the under-modelled blind system identification (BSI) problem arises when the identified room impulse response (RIR) is shorter than the real one. Conventional BSI methods can perform poorly under these circumstances. In this paper, we propose an algorithm for multichannel BSI in under-modelled situations. Instead of minimizing the cross-relation error, a new optimization criterion...
Deep learning has been shown to outperform other machine learning methods in numerous research fields. However, earlier approaches, such as multi-space probability distribution hidden Markov models, still surpass deep learning methods in the prediction accuracy of the speech fundamental frequency (F0), inter alia because of its discontinuous behavior. The current research focuses on the application of feedforward...
Sparse reconstruction algorithms aim to retrieve high-dimensional sparse signals from a limited number of measurements. A common example is LASSO or Basis Pursuit, where sparsity is enforced using an ℓ1-penalty together with a cost function ‖y − Hx‖₂². For random design matrices H, a sharp phase transition boundary separates the ‘good’ parameter region where error-free recovery of a sufficiently sparse...
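The recovery the abstract describes can be sketched with plain iterative shrinkage-thresholding (ISTA) for the LASSO objective 0.5·‖y − Hx‖₂² + λ‖x‖₁. The problem sizes, step size and penalty below are illustrative choices, not values from the paper:

```python
import random

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def soft(v, t):
    """Soft-thresholding, the proximal operator of the l1 penalty."""
    return [(1.0 if u >= 0 else -1.0) * max(abs(u) - t, 0.0) for u in v]

def ista(H, y, lam, step, iters=3000):
    """Minimize 0.5*||y - Hx||_2^2 + lam*||x||_1 by proximal gradient."""
    n = len(H[0])
    Ht = [list(col) for col in zip(*H)]
    x = [0.0] * n
    for _ in range(iters):
        r = [hv - yv for hv, yv in zip(matvec(H, x), y)]  # residual Hx - y
        g = matvec(Ht, r)                                 # gradient of the quadratic term
        x = soft([xv - step * gv for xv, gv in zip(x, g)], step * lam)
    return x

rng = random.Random(0)
m, n = 20, 10
H = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(m)]  # random design matrix
x_true = [0.0] * n
x_true[2], x_true[7] = 1.0, -1.0                             # 2-sparse signal
y = matvec(H, x_true)
x_hat = ista(H, y, lam=0.05, step=0.01)
```

With m = 20 measurements and only 2 nonzeros this instance sits well inside the ‘good’ region, so the sparse signal is recovered almost exactly; shrinking m or growing the support pushes the problem across the phase-transition boundary.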
Social networks and location-based social networks have many active users who provide various kinds of data, such as where they have been, who their friends are, which items they prefer, and when they visit a venue. The location, social-network and temporal information they provide can be used by recommendation systems to give more accurate suggestions. Also, recommendation systems can provide dynamic...
As technology advances, it has become common for a system to be optimized under more than one objective function, which leads to inconclusive results when contradictory objectives exist. Traditional approaches suggest simply aggregating the multiple objectives into one via a linear combination, but it is hard to justify the weights quantitatively. This paper proposes a systematic approach to...
A brain-computer interface (BCI) is a system for communication between people and computers via brain activity. Steady-state visual evoked potentials (SSVEPs), a brain response observed in the EEG, are evoked by flickering stimuli, and the SSVEP is one of the promising paradigms for BCIs. Canonical correlation analysis (CCA) is widely used for EEG signal processing in SSVEP-based BCIs. However, the classification...
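The standard CCA detector correlates multichannel EEG with sine/cosine references at each candidate stimulus frequency and picks the frequency with the largest canonical correlation. A minimal single-channel sketch — where the canonical correlation reduces to the multiple correlation coefficient of regressing the signal onto the reference pair — on synthetic data (sampling rate, frequencies and noise level are all illustrative):

```python
import math
import random

def cca_1ch(x, f, fs):
    """Canonical correlation between a single-channel signal and a sin/cos
    reference at frequency f; for one channel this equals the multiple
    correlation coefficient of the least-squares fit onto the pair."""
    n = len(x)
    s = [math.sin(2 * math.pi * f * i / fs) for i in range(n)]
    c = [math.cos(2 * math.pi * f * i / fs) for i in range(n)]
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    # 2x2 normal equations, solved by Cramer's rule
    g11, g12, g22 = dot(s, s), dot(s, c), dot(c, c)
    b1, b2 = dot(s, x), dot(c, x)
    det = g11 * g22 - g12 * g12
    w1 = (b1 * g22 - b2 * g12) / det
    w2 = (g11 * b2 - g12 * b1) / det
    fitted_energy = w1 * b1 + w2 * b2   # equals x_hat . x_hat for an LS fit
    return math.sqrt(max(fitted_energy, 0.0) / dot(x, x))

fs = 250                                # Hz, 1-second window
rng = random.Random(0)
t = [i / fs for i in range(fs)]
x = [math.sin(2 * math.pi * 12 * ti) + rng.gauss(0, 0.5) for ti in t]  # 12 Hz SSVEP + noise
detected = max([10.0, 12.0, 15.0], key=lambda f: cca_1ch(x, f, fs))
```

Real SSVEP decoders extend this to multiple EEG channels (true CCA) and add harmonics of each stimulus frequency to the reference set.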
Cooperative coevolution is effective for solving high-dimensional optimization problems. This paper proposes an adaptive hybrid differential evolution with a circular sliding window to tackle high-dimensional optimization problems. A circular sliding window strategy is proposed to handle the decomposition of the original problem, where the “window” size represents the size of the group,...
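For reference, the differential evolution core that such variants build on can be sketched as classic DE/rand/1/bin. The adaptive hybrid algorithm of the abstract additionally adapts its parameters and decomposes the variables with the sliding window, which is omitted here; the objective and parameter values are illustrative.

```python
import random

def sphere(x):
    """Separable test objective, minimized at the origin."""
    return sum(v * v for v in x)

def de_rand_1_bin(obj, dim=10, pop=30, F=0.5, CR=0.9, gens=300, seed=3):
    """Classic DE/rand/1/bin: mutate with a scaled difference of two
    random members, binomially cross over, keep the trial if no worse."""
    rng = random.Random(seed)
    P = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    fit = [obj(p) for p in P]
    for _ in range(gens):
        for i in range(pop):
            a, b, c = rng.sample([j for j in range(pop) if j != i], 3)
            jr = rng.randrange(dim)     # guarantees one mutated component
            trial = [P[a][d] + F * (P[b][d] - P[c][d])
                     if (rng.random() < CR or d == jr) else P[i][d]
                     for d in range(dim)]
            ft = obj(trial)
            if ft <= fit[i]:
                P[i], fit[i] = trial, ft
    return min(fit)

best = de_rand_1_bin(sphere)
```

In a cooperative-coevolution setting this optimizer would run on each variable group produced by the sliding window in turn, with the other groups held fixed at their current best values.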
In this paper we apply techniques for the numerical estimation of system resolution, developed in imaging, to the regression problem of relating biological data to phenotypes. Our approach can be viewed as an extension of Backus-Gilbert theory, which attempts to find the most concentrated estimator that can be reliably computed in an inverse problem. Applied to a regression model, we estimate a minimal combination...
This study addresses the neural decoding of code-modulated visual evoked potentials (c-VEPs). The c-VEP was developed recently and has been applied to brain-computer interfaces (BCIs); c-VEP BCIs exhibit faster communication speeds than existing VEP-based BCIs. In c-VEP BCIs, canonical correlation analysis (CCA), which maximizes the correlation between an averaged signal and single-trial signals, is often used for...
Feasibility determination in a stochastic setting asks, for each alternative design, whether its performance, which can only be estimated through simulation, exceeds a known threshold. Several intelligent simulation budget allocation methods have been developed to enhance the simulation efficiency of feasibility determination. To further improve the simulation efficiency, we develop a new...
Sensor networks are collections of sensor nodes that cooperatively transmit sensed data to a base station. One of the well-known characteristics of Wireless Sensor Networks (WSNs) is their limited resources, and the energy consumption of the network's nodes is considered one of the major challenges researchers face. On the other hand, data aggregation helps reduce the redundant data transferred...
Clustering is a fundamental tool for data analysis. Typically, all attributes of the data are used for clustering. However, if a set of attributes can be divided into meaningful subsets, it may be effective to cluster the data for each subset. In this paper, we propose a method for dividing the set of elements of feature vectors into meaningful subsets. Considering the dependencies between the elements,...
The partially-conditioned Gaussian (PCG) density, a variant of the Gauss-Bingham density, quantifies the uncertainty of a state vector composed of an attitude quaternion and other Euclidean states on their natural manifold, the unit hypercylinder. The conditioned Gaussian density is first developed by conditioning a Gaussian density on the unit hypersphere, and is an alternative representation of the...
This paper mainly deals with the distributed estimation fusion problem when the correlations are unknown. The local estimates are represented as a set of probability density functions, on which a Riemannian structure endowed with the Fisher metric is built. From the perspective of information geometry, the fused density is formulated as the Fisher barycenter in the space of probability densities and...
Feature selection is an effective technique for dimensionality reduction to get the most useful information from huge raw data. Many spectral feature selection algorithms have been proposed to address the unsupervised feature selection problem, but most of them fail to pay attention to the noises induced during the feature selection process. In this paper, we not only consider the feature structural...
The YCbCr color space is widely used in earlier video coding standards such as H.264 and in the latest video coding standard, HEVC. Although the conversion from RGB to YCbCr significantly reduces the inter-channel redundancy, some correlations remain among Y, Cb and Cr. In order to reduce this inter-channel redundancy, the LM mode was proposed for HEVC, in which the reconstructed luma component is...
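For concreteness, the RGB-to-YCbCr conversion the abstract refers to can be written out with the full-range BT.601 coefficients. Note this is one common variant of the transform (standards also define limited-range versions with different offsets), and the codec itself receives YCbCr input; LM mode then operates on reconstructed samples inside the codec.

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range BT.601 RGB -> YCbCr for 8-bit samples: Y carries luma,
    Cb/Cr carry blue- and red-difference chroma offset to mid-range 128."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr
```

For example, pure white (255, 255, 255) maps to (255, 128, 128) and black to (0, 128, 128): the chroma channels of a neutral gray are constant, which is exactly the decorrelation the transform provides — and the residual luma/chroma correlation it leaves behind is what LM mode exploits.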
Interval programming problems are ubiquitous in real-world situations. A variety of theories and methods exist for handling them; the existing methods, however, adopt various dominance criteria to distinguish solutions, and these criteria are always subjective. Different dominance criteria will produce different optimal solution(s), and subjective criteria make users, especially those...
In this paper, we propose a simple but effective objective reduction algorithm (ORA) for many-objective optimization problems (MaOPs). It uses a hyperplane involving sparse non-negative coefficients to roughly approximate the conflicting structure of the Pareto front in the objective space. Then the objectives with non-zero coefficients are considered as essential objectives. In order to verify the...