Social surveys have been used by researchers and policymakers as an essential tool for understanding social and political activities in society. Social media has introduced a new way of capturing data from large numbers of people. Unlike surveys, social media delivers data more rapidly and cheaply. In this paper, we aim to rapidly identify socio-political activity in South Africa using proxy data...
The recognized significance of the rumen microbiome has inspired efforts to examine the composition of rumen microbial communities on a large scale. One key research area is inferring associations and dependencies between members of rumen microbial communities through correlation analysis. However, it has been found that, due to the compositional nature of the data, simply applying correlation-based...
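The snippet above breaks off before naming its method. As an illustration only, one standard remedy for spurious correlations in compositional abundance data is the centered log-ratio (CLR) transform; the `clr` helper and the toy count table below are hypothetical, not taken from the paper:

```python
import numpy as np

def clr(counts, pseudocount=1.0):
    """Centered log-ratio transform for compositional count data."""
    x = counts + pseudocount               # pseudocount avoids log(0)
    log_x = np.log(x)
    # subtract each sample's mean log-abundance (rows sum to zero after CLR)
    return log_x - log_x.mean(axis=1, keepdims=True)

# toy abundance table: 4 samples (rows) x 3 taxa (columns)
rng = np.random.default_rng(0)
counts = rng.integers(0, 100, size=(4, 3)).astype(float)

corr_naive = np.corrcoef(counts, rowvar=False)      # biased by the sum constraint
corr_clr = np.corrcoef(clr(counts), rowvar=False)   # correlations after CLR
```

Correlating CLR-transformed abundances rather than raw proportions avoids the negative bias that the unit-sum constraint imposes on compositional data.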
Parkinson's disease is a debilitating and chronic disease of the nervous system. Traditional Chinese Medicine (TCM) offers a new way of diagnosing Parkinson's, and TCM diagnostic data for Parkinson's form a multi-label data set. Considering that the symptoms used as labels in the Parkinson's data set are often correlated with each other, we can facilitate the multi-label learning process by...
We describe a “crowd measurement” project, referred to as PoQeMoN, whose main objective is to identify Quality of Service (QoS) indicators in order to predict the Quality of Experience (QoE) for HTTP YouTube content on mobile networks. Results are based on experiments on an operational network. The second contribution of this paper is to show that the proposed indicator is easy to implement in order...
The widely used subjective estimator, mean opinion score (MOS), is often biased by the testing environment, viewers' mood, domain expertise, and many other factors that may influence the actual assessment. We therefore devise a no-reference subjective quality assessment metric by exploiting how human eyes browse videos. The gaze data recorded by the participants' eye trackers indicate...
Automatic image aesthetics rating has received growing interest with the recent breakthrough in deep learning. Although many studies exist on learning a generic or universal aesthetics model, investigation of aesthetics models incorporating an individual user's preferences is quite limited. We address this personalized aesthetics problem by showing that an individual's aesthetic preferences exhibit strong...
Network Function Virtualization (NFV) is an emerging paradigm that allows the creation, at the software level, of complex network services by composing simpler ones. However, this paradigm shift exposes network services to faults and bottlenecks in the complex software virtualization infrastructure they rely on. Thus, NFV services require effective anomaly detection systems to detect the occurrence of network...
Failure modes and effects analysis (FMEA) is a powerful and proactive quality tool for defining, detecting, and identifying potential failure modes and their effects. However, the conventional FMEA process is sometimes difficult to implement due to the workload required and the subjectivity of the evaluations performed. Hence, automating this tool can be useful in some application domains to objectively evaluate...
To increase success in computer programming courses, it is important to understand the learning process and the common difficulties faced by students. Although several studies have investigated possible relationships between students' performance and self-regulated learning characteristics in computer programming courses, little attention has been given to the source code produced by students in this...
The key to maintaining high standards of quality and power conservation of physical machines in data centers lies in the efficient consolidation of virtual machines (VMs). Several schemes have been proposed for this purpose, including online migration and VM placement, which can offer the best results in terms of resource utilization. The consolidation process can be made effective by finding “opportunities”...
In this paper, the problem of single-channel blind source separation (SCBSS) of a mixture of two co-frequency phase-shift keying (PSK) signals with unknown carrier frequency offsets (CFOs) is investigated. Two SCBSS algorithms which are robust to CFOs are proposed to perform separation of the mixture signals. In the first algorithm, the phase changes of the received signals caused by CFOs are tracked...
Program understanding plays a pivotal role in software maintenance and evolution: a deep understanding of code is the stepping stone for most software-related activities, such as bug fixing or testing. Being able to measure the understandability of a piece of code might help in estimating the effort required for a maintenance activity, in comparing the quality of alternative implementations, or even...
Feature ranking over a video's temporal evolution provides reliable information for complex action recognition. However, a video may contain similar features across its sequence of frames, which deliver redundant information to the ranking function. This paper proposes a method to improve the rank-pooling strategy by capturing an optimized latent structure of the video sequence data. The optimization...
Metric suites to assess Web Service quality attributes have been proposed recently. In particular, service interfaces in WSDL (Web Service Description Language) have distinct intrinsic aspects (e.g., size or complexity) that can be measured. We present an approach to prevent high complexity in service interfaces (WSDLs), making it easier for consumers to reason about the services' offered functionality. Mostly,...
Research into computational jigsaw puzzle solving, an emerging theoretical problem with numerous applications, has focused in recent years on puzzles consisting of square pieces only. In this paper we wish to extend the scientific scope of appearance-based puzzle solving and consider “brick wall” jigsaw puzzles: rectangular pieces that may have different sizes and can be placed next to each...
Learning similarity functions between image pairs with deep neural networks yields highly correlated activations of large embeddings. In this work, we show how to improve the robustness of embeddings by exploiting independence in ensembles. We divide the last embedding layer of a deep network into an embedding ensemble and formulate training this ensemble as an online gradient boosting problem. Each...
A low-complexity algorithm is presented that clusters sensor nodes based on similarity in the sensed signals. This feature makes it an enabler for distributed detection of events that are impossible to identify using information available to a single node. The algorithm does not require system training prior to deployment nor does it assume statistical knowledge of the signal. Experimental results...
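The abstract above does not specify the algorithm's internals, but the general idea can be sketched by grouping nodes whose sensed signals are strongly correlated; the `cluster_by_similarity` helper and the threshold value below are assumptions for illustration, not the paper's algorithm:

```python
import numpy as np

def cluster_by_similarity(signals, threshold=0.8):
    """Group nodes whose pairwise signal correlation exceeds a threshold.

    signals: (n_nodes, n_samples) array; returns one cluster label per node.
    Uses a simple union-find over the thresholded correlation graph.
    """
    n = signals.shape[0]
    corr = np.corrcoef(signals)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if corr[i, j] >= threshold:
                parent[find(i)] = find(j)  # merge the two components

    roots = [find(i) for i in range(n)]
    labels = {r: k for k, r in enumerate(dict.fromkeys(roots))}
    return [labels[r] for r in roots]

# toy example: nodes 0-1 observe one event, nodes 2-3 another
t = np.linspace(0, 1, 200)
sig = np.vstack([np.sin(6 * np.pi * t), np.sin(6 * np.pi * t) + 0.1,
                 np.cos(6 * np.pi * t), np.cos(6 * np.pi * t) - 0.2])
print(cluster_by_similarity(sig))  # → [0, 0, 1, 1]
```

Because correlation is invariant to offsets, nodes seeing the same event with different baselines still land in one cluster, matching the stated goal of detecting events no single node can identify alone.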
Today, the cloud industry is adopting container technology both for internal use and as a commercial offering. The use of containers as a base technology for large-scale systems opens many challenges in the area of run-time resource management. This paper addresses the problem of selecting the most appropriate performance metrics to activate auto-scaling actions. Specifically, we investigate...
Resource usage data, collected using tools such as TACC_Stats, capture the resource utilization by nodes within a high performance computing system. We present methods to analyze the resource usage data to understand the system performance and identify performance anomalies. The core idea is to model the data as a three-way tensor corresponding to the compute nodes, usage metrics, and time. Using...
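The three-way tensor view described above can be sketched as follows. This toy example is an assumption-laden stand-in: synthetic data and a plain rank-1 SVD on a mode-1 unfolding, rather than whatever decomposition the paper actually uses, just to show how modeling (nodes x metrics x time) jointly can surface an anomalous node:

```python
import numpy as np

# Toy resource-usage tensor: 6 compute nodes x 3 metrics x 50 time steps.
rng = np.random.default_rng(1)
base = rng.normal(1.0, 0.05, size=(1, 3, 50))        # shared usage pattern
tensor = np.tile(base, (6, 1, 1)) + rng.normal(0, 0.02, size=(6, 3, 50))
tensor[5, 0] += np.sin(np.linspace(0, 6 * np.pi, 50))  # inject a drift on node 5

# Mode-1 (node) unfolding: one row per node, columns = metric x time.
unfolded = tensor.reshape(6, -1)

# Rank-1 approximation via SVD; the residual norm flags outlier nodes.
U, s, Vt = np.linalg.svd(unfolded, full_matrices=False)
approx = s[0] * np.outer(U[:, 0], Vt[0])
residual = np.linalg.norm(unfolded - approx, axis=1)
print(int(residual.argmax()))  # → 5, the node that deviates from the shared pattern
```

Nodes following the shared pattern are well explained by the rank-1 model, so a large reconstruction residual singles out the node whose behavior departs from the rest of the system.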
This paper discusses the motivation for and implementation of Cray's Project Caribou. Project Caribou enables users to correlate HPC job performance with Lustre file systems through collected metrics and events. We discuss use cases, the sources of the metrics that are collected, correlation, and how the data is visualized. Additional topics include the events and alerts that are available, as well as...