Programs that operate over recursive data structures often contain latent opportunities for parallel computation. Writing parallel programs, even with the aid of parallel skeletons, is very challenging: it requires intricate analysis of the underlying algorithm and often relies on inefficient intermediate data structures. Very few automated parallelisation methods exist that address a wide range of programs and data types...
For popular Open Source Software (OSS) projects, a large number of developers worldwide make code contributions, yet most of these developers are casual contributors with very few code commits (typically fixing defects or enhancing features, occasionally). The frequent turnover of such casual developers and the wide variations among...
We study the maximum depth of context tree estimates, i.e., the maximum Markov order attainable by an estimated tree model given an input sequence of length n. We consider two classes of estimators: 1) penalized maximum likelihood (PML) estimators, where a context tree T is obtained by minimizing a cost of the form -log P_T(x^n) + f(n)|S_T|, where P_T(x^n) is the ML probability of the input sequence...
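As an illustration of the penalized cost above, the following sketch evaluates -log P_T(x^n) + f(n)|S_T| for the simplified special case where T is the complete depth-k tree; the names (`pml_cost`, `f_n`) are ours, and real PML estimators minimize this cost over all pruned context trees rather than a single fixed depth.

```python
from collections import Counter
from math import log

def pml_cost(x, k, f_n):
    """Cost -log P_T(x^n) + f(n)*|S_T| for the complete depth-k context tree.

    Simplified sketch: |S_T| is taken as the number of distinct depth-k
    contexts observed in x, and the ML probability uses empirical
    conditional frequencies. Real PML estimation searches over pruned trees.
    """
    n = len(x)
    trans_counts = Counter()   # counts of (context, next symbol) pairs
    ctx_counts = Counter()     # counts of each context alone
    for i in range(k, n):
        ctx = tuple(x[i - k:i])
        trans_counts[(ctx, x[i])] += 1
        ctx_counts[ctx] += 1
    # ML log-probability: sum over transitions of count * log(empirical conditional)
    log_p = sum(c * log(c / ctx_counts[ctx])
                for (ctx, sym), c in trans_counts.items())
    return -log_p + f_n * len(ctx_counts)
```

Comparing `pml_cost(x, k, f_n)` across depths k shows the trade-off the abstract studies: a deeper tree can only increase the ML probability, but each added context pays the f(n) penalty.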
In this paper we provide a general process to create the best K groups based on incomplete profiles and conditional preferences. The proposed approach combines several techniques in a pipeline in which the output of each step is the input of the next. It solves three problems: handling the incompleteness of profiles based on conditional preferences, measuring distances and similarities, and finding the top credible...
A pixel-domain algorithm for low-complexity perceptual image coding is proposed. The algorithm exploits a combination of downsampling, predictive coding and a just-noticeable difference (JND) model. Downsampling is performed adaptively on the input image based on regions of interest (ROI) identified by measuring the downsampling distortions against the visibility thresholds given by the JND model. The...
JPEG 2000's most computationally expensive building block is the Embedded Block Coder with Optimized Truncation (EBCOT). This paper evaluates how encoders targeting a parallel architecture such as a GPU can increase their throughput in use cases with very high data rates. The compression efficiency in the less significant bit-planes is then often poor, and it is beneficial to enable the Selective...
Adaptive transform learning schemes have been extensively studied in the literature with the goal of achieving better compression efficiency than the widely used Discrete Cosine Transform (DCT) inside a video codec. These transforms are learned offline on a large training set and are tested either in competition with or in place of the core transforms, i.e., the DCT. In our previous work, we proposed...
The main difficulty in implementing modern image coding systems on a GPU is that the algorithms at the core of the coding scheme are inherently sequential. We recently proposed bitplane image coding with parallel coefficient processing (BPC-PaCo), a coding scheme that, unlike most systems, permits the processing of multiple coefficients of the image in parallel. This enables the use of...
Chain coding is widely used in image compression to encode the boundaries of objects efficiently. Although chain codes are effective, they still require a large amount of memory to store. Therefore, an efficient encoding technique for chain codes is required. In this paper, we propose an algorithm to encode contours losslessly. First, a morphological operation is applied to shrink the contours...
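The chain coding this abstract builds on can be sketched with the classic 8-connected Freeman code; the helper names below are ours, and this is generic contour coding, not the paper's morphological shrinking algorithm.

```python
# 8-connectivity step directions, indexed 0..7 counter-clockwise from East
# (with the y axis pointing up; image libraries often flip this convention).
DIRS = [(1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]
DIR_INDEX = {d: i for i, d in enumerate(DIRS)}

def freeman_chain(points):
    """Encode an ordered contour (list of (x, y) pixels, each consecutive
    pair 8-connected) as a start point plus Freeman direction symbols."""
    codes = [DIR_INDEX[(x1 - x0, y1 - y0)]
             for (x0, y0), (x1, y1) in zip(points, points[1:])]
    return points[0], codes

def differential_chain(codes):
    """Differential chain code (mod 8): stores turns instead of absolute
    directions, which skews the symbol distribution toward 0 and typically
    helps a subsequent entropy coder reduce the memory the codes need."""
    return [codes[0]] + [(b - a) % 8 for a, b in zip(codes, codes[1:])]
```

The differential step illustrates why chain codes still invite further compression: smooth boundaries turn rarely, so most differential symbols are 0.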
Coarse-grained Reconfigurable Architecture (CGRA) is considered efficient for radar applications due to the performance and flexibility it can provide. However, it has a critical memory problem: caching the large configuration contexts increases silicon area and power consumption. This paper proposes a configuration compression and decompression approach based on dynamic...
In this paper, we analyze the binary arithmetic coding of High Efficiency Video Coding (HEVC) and of the second-generation audio and video coding standard (AVS2). An optimized probability estimation scheme is then proposed for the arithmetic coder. The proposed scheme is incorporated into the HEVC reference software (HM 16.0) and the AVS2 reference software (RD 10.1). Experimental results demonstrate that...
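The kind of probability estimator being optimized in such coders can be illustrated with a minimal shift-based binary model; this is a generic sketch under our own naming, not the paper's scheme nor the exact table-driven CABAC state machine of HM or RD.

```python
class BinaryProbEstimator:
    """Exponential-decay estimate of P(bin == 1), kept as a fixed-point
    integer in [0, 2**PRECISION]. A simplified stand-in for the adaptive
    probability models of CABAC-style arithmetic coders; the window
    exponent w is a hypothetical tuning parameter (larger w = slower
    adaptation, better for stationary sources)."""

    PRECISION = 15

    def __init__(self, w=5):
        self.w = w
        self.p1 = 1 << (self.PRECISION - 1)   # start at P(1) = 0.5

    def prob_one(self):
        return self.p1 / (1 << self.PRECISION)

    def update(self, bin_val):
        # Move the estimate a fraction 2**-w toward the observed bin:
        # p1 += (target - p1) / 2**w, implemented with a hardware-friendly shift.
        target = (1 << self.PRECISION) if bin_val else 0
        self.p1 += (target - self.p1) >> self.w
```

An arithmetic coder would query `prob_one()` to subdivide its interval before coding each bin, then call `update()` with the coded value.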
We propose an Augmented Visual Intelligence (AVI) framework to assist humans in vision- and memory-related tasks. The AVI framework exploits wearable cameras and ambient computing facilities to empower a user's vision and memory functions by answering four types of queries central to visual activities. In particular, the Extended Visual Memory (EVM) model plays a central role in AVI. Learning of EVM...
Proving program termination is key to guaranteeing the absence of undesirable behaviour, such as hanging programs, and even of security vulnerabilities such as denial-of-service attacks. To make termination checks scale to large systems, interprocedural termination analysis seems essential; this is a largely unexplored area of research in termination analysis, where most effort has focused on difficult...
The Minimum Satisfiability Problem (MinSAT) is the problem of finding an assignment that minimizes the number of satisfied clauses of a CNF formula. This NP-hard problem is an extension of the famous SAT problem and has received less attention than its dual, MaxSAT (Maximum Satisfiability). One of the MinSAT variants is the Partial MinSAT problem, where some clauses are hard and the others are soft. This variant is used...
Hierarchical State Transition Matrix (HSTM) is a table-based modeling language that has been broadly used for developing software designs of embedded systems. In this paper, we describe Garakabu2, a model checker that we have been implementing for verifying HSTM designs against LTL properties. The HSTM designs that Garakabu2 takes as input are those developed with an industrial-strength model-based...
An increasing volume of data puts MapReduce data analytics platforms such as Hadoop under constant resource pressure. A new two-phase text compression scheme has been specially designed to accelerate data analysis and reduce cluster resource usage, and it has been implemented for Hadoop. The scheme consists of two levels of compression. The first-level compression allows a Hadoop program to consume...
In this paper, a novel fractal image compression scheme with adaptive quadtree partitioning is proposed. The technique classifies ranges and domains using a different classification method, known as archetype classification, that significantly reduces the mean square error (MSE) computations during encoding. Performance of the proposed technique in terms of compression ratios and fidelity versus encoding...
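The adaptive quadtree partitioning that such fractal coders start from can be sketched generically as follows; the variance-based split criterion and names here are our illustrative assumptions, not the paper's archetype classification (real fractal coders also stop splitting when a domain block matches a range block within tolerance).

```python
def variance(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def quadtree_ranges(img, min_size=2, var_thresh=10.0, x=0, y=0, size=None):
    """Recursively partition a square image (list of lists of pixel values,
    power-of-two side) into range blocks. A block is kept whole when its
    pixel variance is at most var_thresh or it has reached min_size;
    otherwise it is split into four quadrants. Returns (x, y, size) triples."""
    if size is None:
        size = len(img)
    vals = [img[y + j][x + i] for j in range(size) for i in range(size)]
    if size <= min_size or variance(vals) <= var_thresh:
        return [(x, y, size)]
    half = size // 2
    blocks = []
    for dy in (0, half):
        for dx in (0, half):
            blocks += quadtree_ranges(img, min_size, var_thresh,
                                      x + dx, y + dy, half)
    return blocks
```

Smooth regions thus end up as few large range blocks while detailed regions are split finely, which is what makes the partition "adaptive".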
Debugging large-scale parallel applications with long runtimes and high error frequencies has become very problematic. Traditional debugging techniques that locate errors exactly no longer seem appropriate for such applications because of the high overhead of storing trace files; in particular, they are difficult to scale efficiently. An effective solution to these problems...
Hybrid Automatic Repeat reQuest (HARQ) techniques have been widely used in wireless communication systems, especially in recent years, and are employed in the latest cellular systems, including the Long Term Evolution (LTE) standard. Such schemes have been widely studied in the literature; in this paper, however, we are interested in applying the results of Polyanskiy, Poor and Verdú on the finite...
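The Polyanskiy-Poor-Verdú finite-blocklength results are usually applied through the normal approximation R*(n, eps) ≈ C - sqrt(V/n) Q^{-1}(eps) + log2(n)/(2n); for the real AWGN channel this can be sketched as below (an approximation, not a bound, and the function name is ours).

```python
from math import log, log2, sqrt
from statistics import NormalDist

LOG2E = 1 / log(2)  # convert nats to bits

def awgn_normal_approx(n, eps, snr):
    """Normal approximation to the maximal coding rate R*(n, eps) of the
    real AWGN channel at blocklength n, block error probability eps and
    SNR snr (linear scale), in bits per channel use:

        R* ~ C - sqrt(V/n) * Qinv(eps) + log2(n) / (2n)

    with capacity C and channel dispersion V from Polyanskiy, Poor and
    Verdu (2010)."""
    C = 0.5 * log2(1 + snr)
    V = (snr * (snr + 2)) / (2 * (snr + 1) ** 2) * LOG2E ** 2
    q_inv = NormalDist().inv_cdf(1 - eps)   # Q^{-1}(eps)
    return C - sqrt(V / n) * q_inv + log2(n) / (2 * n)
```

The sqrt(V/n) back-off quantifies what HARQ designs exploit at short blocklengths: at moderate n the achievable rate sits noticeably below capacity, so retransmission combining can recover part of the gap.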
This paper examines network-centric warfare (NCW) penetration within the U.S. Army. NCW was intended to be an emerging theory of war for the information age. It was supposed to provide a conceptual framework that would prevent new technology-enabled approaches to warfare from being constrained by outmoded ideas. A thematic analysis of the literature was performed using NVivo 10, a computer-assisted...