Computed Tomography (CT) is an imaging method that uses X-rays to obtain cross-sectional images of an object. It is widely used in several areas, such as medicine, archeology, and materials science. Tomographic reconstruction techniques use projections of the object taken from multiple directions. There are several algorithms for this purpose, which can be classified according to their reconstruction...
A Brain-Computer Interface (BCI) speller system based on the Steady-State Visually Evoked Potentials (SSVEP) paradigm is presented. The potentials are elicited through gaze fixation on one of four checkerboards shown on screen, flickering at 5, 12, 15 and 20 Hz. After feature extraction, two dimensionality reduction algorithms, Principal Component Analysis (PCA) and Linear...
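As a minimal illustration of the SSVEP detection idea (not the abstract's actual pipeline), the attended target can be identified by comparing spectral power at the four candidate flicker frequencies; the sampling rate, epoch length, and synthetic EEG below are assumptions:

```python
import numpy as np

def ssvep_frequency(signal, fs, candidates=(5, 12, 15, 20)):
    """Pick the candidate flicker frequency with the largest spectral power
    in a single-channel EEG epoch -- a minimal frequency-detection baseline."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    # Score each candidate by the power of its nearest frequency bin.
    return max(candidates, key=lambda f: spectrum[np.argmin(np.abs(freqs - f))])

fs = 250  # hypothetical sampling rate in Hz
t = np.arange(fs * 2) / fs  # a 2-second epoch
# Synthetic "EEG": a 12 Hz steady-state response plus noise.
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
print(ssvep_frequency(eeg, fs))  # → 12
```

Real SSVEP spellers typically average over channels and use harmonics as well; this sketch only shows the core frequency-tagging principle.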
Given a set of points P ⊂ R^d and a kernel k, the Kernel Density Estimate at a point x ∈ R^d is defined as \mathrm{KDE}_{P}(x)=\frac{1}{|P|}\sum_{y\in P} k(x,y). We study the problem of designing a data structure that, given a data set P and a kernel function, returns approximations to the kernel density of a query point in sublinear time. We introduce a class of unbiased estimators...
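For reference, the exact estimator in the definition above can be evaluated directly in O(|P|) time per query; the sublinear-time data structure the abstract studies is not shown here. The Gaussian kernel and bandwidth are assumptions for illustration:

```python
import numpy as np

def kde(P, x, bandwidth=1.0):
    """Exact KDE_P(x) = (1/|P|) * sum over y in P of k(x, y),
    with an assumed Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 h^2))."""
    sq_dists = np.sum((P - x) ** 2, axis=1)
    return np.mean(np.exp(-sq_dists / (2 * bandwidth ** 2)))

P = np.random.default_rng(0).normal(size=(1000, 2))  # data set in R^2
print(kde(P, np.zeros(2)))  # density estimate at the origin
```

Since each kernel value lies in (0, 1], the estimate is their mean and lies in (0, 1] as well; at a data point of a singleton set it equals exactly 1.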
We propose a new generative adversarial network for generalized image deconvolution, GAN-D. Most previous research concentrates on a specific sub-topic of image deconvolution, or on generative image deconvolution models that rely on strong assumptions. In contrast, our network restores visual data from distorted images affected by multiple dominant degradations such as noise, blur, saturation, and compression...
Low graduation rates are a significant and growing problem in U.S. higher education systems. Although previous studies have demonstrated the usefulness of statistical models for predicting students' graduation outcomes, advanced machine learning models promise to improve their effectiveness and to home in on the “difference that makes a difference,” not only at the group level but...
All semiconductor market domains are converging on concurrent platforms. This trend poses a real challenge: developing application software that uses these concurrent processors effectively to achieve efficiency and performance goals. This paper argues that Computer Systems courses are natural places to introduce parallelism, and that the earlier parallel computing concepts...
Software Transactional Memory (STM) allows encapsulating shared-data accesses within transactions, executed with atomicity and isolation guarantees. The assessment of the consistency of a running transaction is performed by the STM layer at specific points of its execution, such as when a read or write access to a shared object occurs, or upon a commit attempt. However, performance and energy efficiency...
This paper deals with the problem of audio source separation. To handle the complex and ill-posed nature of the problems of audio source separation, the current state-of-the-art approaches employ deep neural networks to obtain instrumental spectra from a mixture. In this study, we propose a novel network architecture that extends the recently developed densely connected convolutional network (DenseNet),...
In almost every mental disorder there are deficiencies in both the structure and the function of the brain, so the need to analyze complementary modalities that capture all aspects of the brain is growing. The most severe of these disorders is schizophrenia, whose main cause is still unknown. Therefore, analyzing resting-state fMRI (rs-fMRI) and structural MRI (sMRI) to investigate...
Modern computer systems are accelerator-rich, equipped with many types of hardware accelerators to speed up computation. For example, graphics processing units (GPUs) are a type of accelerator widely employed to accelerate parallel workloads. To make good use of different accelerators, whether to gain execution-time speedup or to reduce total energy consumption, many scheduling algorithms...
Recently, the combination of classification systems with semi-supervised learning has attracted researchers in several fields. Usually, for high-complexity tasks such as handwriting-based age prediction, individual systems, each using one classifier with specific data features, cannot provide satisfactory performance. In this paper, we investigate the contribution of the Co-training approach,...
Location-based services, such as localization in wireless networks, have been drawing more and more attention in recent years. According to the published literature, fingerprint-based methods outperform many others, but constructing an accurate fingerprint database remains a challenge. In this paper, we introduce a Bayesian regression model, the Gaussian Process Regression (GPR) model, to profile the signal...
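As a rough sketch of how GPR can profile signal strength over space, assuming a squared-exponential kernel, a zero prior mean, and made-up survey measurements (the paper's actual model details are not shown in this snippet):

```python
import numpy as np

def rbf(A, B, length_scale=1.0):
    # Squared-exponential (RBF) kernel between two sets of 2-D locations.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * length_scale**2))

def gpr_predict(X_train, y_train, X_query, noise=0.1):
    # Posterior mean of a zero-mean GP: K_*^T (K + noise^2 I)^{-1} y.
    K = rbf(X_train, X_train) + noise**2 * np.eye(len(X_train))
    K_star = rbf(X_train, X_query)
    return K_star.T @ np.linalg.solve(K, y_train)

# Hypothetical survey: received signal strength (dBm) at four (x, y) positions.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([-40.0, -50.0, -55.0, -60.0])
print(gpr_predict(X, y, np.array([[0.5, 0.5]])))  # interpolated RSS at the center
```

Interpolating a dense fingerprint map from sparse survey points in this way is one common use of GPR in fingerprint-based localization.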
Taking advantage of computing capabilities offered by modern parallel and distributed architectures is fundamental to run large-scale simulation models based on the Parallel Discrete Event Simulation (PDES) paradigm. By relying on this computing organization, it is possible to effectively overcome both the power and the memory wall, which are core limiting aspects to deliver high-performance simulations...
Sparse Modeling Representative Selection (SMRS) has recently been proposed for finding the most relevant instances in a dataset. The method deploys data self-representativeness coding to infer a coding matrix regularized with a row-sparsity constraint. The score of each sample is set to the L2 norm of the corresponding row of the coding matrix. Since the SMRS...
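The scoring step described above is simple to state in code; the coding matrix below is a made-up example standing in for the output of the (not shown) self-representation optimization:

```python
import numpy as np

def smrs_scores(C):
    """SMRS-style relevance scores: each sample's score is the L2 norm of
    its row in the self-representation coding matrix C."""
    return np.linalg.norm(C, axis=1)

# Hypothetical coding matrix for 4 samples; rows 0 and 2 are most active.
C = np.array([[0.8, 0.1, 0.0, 0.1],
              [0.0, 0.0, 0.0, 0.0],
              [0.2, 0.3, 0.7, 0.1],
              [0.0, 0.1, 0.0, 0.0]])
scores = smrs_scores(C)
print(np.argsort(scores)[::-1][:2])  # indices of the two most representative samples
```

Rows with all-zero coefficients receive a score of zero, which is exactly how the row-sparsity constraint prunes non-representative samples.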
GPU-based clusters are widely chosen for accelerating a variety of scientific applications in high-end cloud environments. With their growing popularity, there is a necessity for improving the system throughput and decreasing the turnaround time for co-executing applications on the same GPU device. However, resource contention among multiple applications on a multi-tasked GPU leads to the performance...
Predicting politicians' approval ratings is a popular task. While one line of prediction uses text mining of news articles, we introduce a text-augmented Gaussian process to perform the prediction with contexts. We test our model on the 2017 South Korean presidential election with 1) a quantitative evaluation and 2) a qualitative analysis. The performance of the model with text input is better...
Matrix factorization is a popular low-dimensional representation approach that plays an important role in many pattern recognition and computer vision domains. Among its variants, convex and semi-nonnegative matrix factorizations have attracted considerable interest, owing to their clustering interpretation. On the other hand, the generalized correlation function (correntropy) as the error measure does not...
With the growing number of automated welding systems present throughout manufacturing, achieving high precision is naturally a key objective. The alignment of weld tip to weld seam, particularly in very long welds (such as in pipes), is a technical challenge in which computer vision has much to offer. This paper introduces a real-time methodology for weld-seam tracking. The key challenge associated...
Uncertainty-based active learning has been well studied for selecting informative samples to improve the performance of a classifier. One of the simplest strategies is to always select the samples with the largest uncertainties for a query. However, the selected samples may be very similar to each other, so that little information is added when updating the classifier. In other words, we should...
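The simple top-uncertainty strategy criticized above can be sketched in a few lines; predictive entropy is one common uncertainty measure (an assumption here, since the snippet does not name one):

```python
import numpy as np

def top_uncertain(probs, k):
    """Select indices of the k samples with the highest predictive entropy
    over their class-probability vectors."""
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    return np.argsort(entropy)[::-1][:k]

# Hypothetical class probabilities for 4 unlabeled samples (binary task).
probs = np.array([[0.90, 0.10],
                  [0.50, 0.50],
                  [0.60, 0.40],
                  [0.99, 0.01]])
print(top_uncertain(probs, 2))  # → [1 2], the two most ambiguous samples
```

Note that this selection ignores similarity between the chosen samples, which is precisely the redundancy problem the abstract goes on to address.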
The random Fourier features method has been found to be very effective at approximating kernel functions. Our previous studies show that, through a mixing mechanism combining the feature space formed by random Fourier features with certain linear algorithms, fuzzy clustering results in the approximated feature space are comparable to, or even exceed, those of classical kernel-based algorithms. To increase the robustness...
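The random Fourier features approximation itself can be sketched as follows for an RBF kernel exp(-γ‖x−y‖²); the feature dimension, γ, and data are assumptions, and the paper's mixing mechanism and clustering step are not shown:

```python
import numpy as np

def rff(X, D, gamma=1.0, seed=0):
    """Map rows of X to D random Fourier features so that z(x)·z(y)
    approximates the RBF kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the kernel's Fourier transform: N(0, 2*gamma*I).
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
    b = rng.uniform(0, 2 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = np.random.default_rng(1).normal(size=(50, 3))
Z = rff(X, D=2000)
K_approx = Z @ Z.T  # inner products in the random feature space
K_exact = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1))
print(np.max(np.abs(K_approx - K_exact)))  # small approximation error
```

Because the features are explicit and finite-dimensional, any linear algorithm (including linear fuzzy clustering) run on Z behaves approximately like its kernelized counterpart on X.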