Although the fixed scheme from HHI (Fraunhofer Heinrich Hertz Institute) for multi-view video coding achieves very good performance by fully exploiting predictions in both the temporal and view directions, the complexity of this inter-prediction is very high. This paper presents techniques to reduce the complexity while maintaining the coding performance.
The Lempel-Ziv 77 (LZ77) and LZ-Storer-Szymanski (LZSS) text compression algorithms use a sliding window over the sequence of symbols, with two sub-windows: the dictionary (symbols already encoded) and the look-ahead buffer (LAB) (symbols not yet encoded). Binary search trees and suffix trees (ST) have been used to speed up the search for the LAB over the dictionary, at the expense of high memory usage...
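The sliding-window search described above can be sketched as follows. This is a minimal LZSS-style greedy matcher; the window and buffer sizes, the token format, and the minimum match length of 3 are illustrative assumptions, and the brute-force dictionary scan stands in for the tree-based searches the paper discusses:

```python
def lz77_encode(data, window=256, lab=16):
    """Greedy LZSS-style encoder: emit ('M', offset, length) for dictionary
    matches of length >= 3, and ('L', byte) literals otherwise."""
    i, tokens = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        # brute-force scan of the dictionary sub-window for the longest
        # match with the look-ahead buffer (trees would speed this up)
        for j in range(max(0, i - window), i):
            l = 0
            while l < lab and i + l < len(data) and data[j + l] == data[i + l]:
                l += 1
            if l > best_len:
                best_len, best_off = l, i - j
        if best_len >= 3:
            tokens.append(('M', best_off, best_len))
            i += best_len
        else:
            tokens.append(('L', data[i]))
            i += 1
    return tokens

def lz77_decode(tokens):
    out = bytearray()
    for t in tokens:
        if t[0] == 'L':
            out.append(t[1])
        else:                     # copy `length` bytes from `offset` back
            _, off, ln = t
            for _ in range(ln):   # byte-by-byte copy allows overlapping matches
                out.append(out[-off])
    return bytes(out)

data = b"abracadabra abracadabra"
tokens = lz77_encode(data)
restored = lz77_decode(tokens)
```

The decoder copies byte by byte rather than slicing, so matches may overlap the position currently being written, as in real LZ77/LZSS streams.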
This paper presents a fast 15×15 transform for image and video coding applications. We note that our proposed transform is significantly less complex than the DCT-II of the nearest dyadic size N=16, whose best known factorization [2] requires 31 multiplications and 81 additions. In comparison with an H.264-type cascade of 4-point transforms, our proposed algorithm is slightly higher in complexity, but...
In this work, we have designed an efficient arithmetic coder for the non-linear bi-level image coder based on reversible cellular automata transform reported in the work of Cruz-Reyes and Kari (2008). The proposed approach relies on non-linear transform operations based on RmbCA, decomposing the image into four subimages (named LL, LH, HL, HH in lexicographic order) in such a way that the LL subimage...
We use a scalar function θ to describe the complexity of data compression systems based on vector quantizers (VQs). This function is associated with the analog hardware implementation of a VQ, as done for example in focal-plane image compression systems. The rate and distortion of a VQ are represented by a Lagrangian cost function J. In this work we propose an affine model for the relationship...
In this paper a framework is proposed for efficient entropy coding of data that can be represented by a parametric distribution model. Based on the proposed framework, an entropy coder achieves coding efficiency by estimating the parameters of the statistical model for the coded data, via either maximum a posteriori (MAP) or maximum likelihood (ML) parameter estimation techniques. The problem of...
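To illustrate the ML side of such a scheme, the sketch below fits the parameter of an assumed geometric source model (not the paper's actual framework) and evaluates the ideal code length of the data under the fitted model:

```python
import math

def ml_geometric(samples):
    """ML estimate of p for a geometric model P(x) = (1 - p)**x * p, x >= 0."""
    n, s = len(samples), sum(samples)
    return n / (n + s)

def ideal_code_length(samples, p):
    """Total ideal code length in bits, -sum(log2 P(x)), under the fitted model."""
    return -sum(math.log2((1 - p) ** x * p) for x in samples)

samples = [0, 1, 0, 2, 3, 0, 1, 0]   # invented data
p_hat = ml_geometric(samples)        # ML estimate: n / (n + sum(x))
bits = ideal_code_length(samples, p_hat)
```

An actual entropy coder (e.g. arithmetic coding) driven by the fitted distribution would approach this ideal code length; a MAP variant would add a prior term to the estimate.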
An efficient policy allocation algorithm for the transmission of embedded bit streams over noisy channels with feedback is proposed. The transmission is based on the type-II Hybrid ARQ/FEC protocol and uses a nested sequence C of channel codes to protect the packets. There are also constraints on the total bit budget and on the allowed number of retransmissions per packet. The allocation algorithm...
The proposed algorithm is motivated by a detailed analysis of the wavelet coefficient set partitioning schemes found in the SPIHT algorithm. Initially, the image is transformed into the wavelet domain. The threshold of each independent subband is calculated and sent to the decoder. The subband scanning sequence is determined by the magnitudes of the thresholds: subbands with larger thresholds are...
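A minimal sketch of the per-subband threshold computation and the resulting scan order, assuming the standard SPIHT convention of power-of-two significance thresholds (the coefficient values are invented):

```python
import math

def subband_threshold(coeffs):
    """Initial significance threshold: the largest power of two that does
    not exceed the maximum coefficient magnitude in the subband."""
    m = max(abs(c) for c in coeffs)
    return 0 if m < 1 else 2 ** math.floor(math.log2(m))

# invented wavelet coefficients for four subbands
subbands = {'LL': [512, -300, 40], 'LH': [90, -17], 'HL': [33, 5], 'HH': [7, -2]}
thresholds = {name: subband_threshold(c) for name, c in subbands.items()}

# scan subbands in order of decreasing threshold
scan_order = sorted(thresholds, key=thresholds.get, reverse=True)
```

Sending one threshold per subband lets the decoder reproduce this ordering, so the scan sequence itself costs no extra side information beyond the thresholds.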
Summary form only given. This paper studies how to better exploit intra correlation to compress images. In general, edge and texture areas of images exhibit strongly anisotropic properties. The correlation among samples is determined not only by their distance but also by the link orientation. Traditional transforms are not efficient at handling this anisotropic correlation. Therefore, in this paper we propose...
This paper investigates compression of encrypted data. It has been previously shown that data encrypted with Vernam's scheme, also known as the one-time pad, can be compressed without knowledge of the secret key, and this result applies to stream ciphers used in practice. However, it was not known how to compress data encrypted with non-stream ciphers. In this paper, we address the problem...
Given a source file S and two differencing files Δ(S, T) and Δ(T, R), where Δ(X, Y) denotes the delta file of the target file Y with respect to the source file X, the objective is to be able to construct R. This is intended for the scenario of upgrading software where intermediate releases are missing, or for the case of file system backups, where non-consecutive versions must...
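The setting can be sketched with an assumed copy/add delta format (illustrative only, not the paper's actual delta encoding): applying Δ(S, T) to S yields T, and applying Δ(T, R) to that result yields R, even though the intermediate T is not stored:

```python
def apply_delta(source, delta):
    """Apply a delta given as ('copy', offset, length) commands, which copy
    bytes from the source, and ('add', data) commands, which insert literals."""
    out = bytearray()
    for cmd in delta:
        if cmd[0] == 'copy':
            _, off, ln = cmd
            out += source[off:off + ln]
        else:
            out += cmd[1]
    return bytes(out)

S = b"the quick brown fox"
delta_ST = [('copy', 0, 10), ('add', b'red fox')]
T = apply_delta(S, delta_ST)          # intermediate version
delta_TR = [('copy', 0, 4), ('add', b'lazy '), ('copy', 4, 13)]
R = apply_delta(T, delta_TR)          # R reconstructed via S -> T -> R
```

Materializing T only to discard it is the naive approach; composing the two deltas directly avoids reconstructing the missing intermediate version.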
We present a linear time and space suffix array (SA) construction algorithm called the SA-IS algorithm. The SA-IS algorithm is novel in its use of LMS-substrings for the problem reduction and of pure induced sorting (a term coined for this algorithm) to propagate the order of suffixes as well as that of LMS-substrings, which makes the algorithm rely almost purely on induced sorting...
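For reference, the sketch below builds a suffix array by directly sorting suffixes, an O(n² log n) baseline; SA-IS produces the same array in O(n) time and space via induced sorting rather than suffix comparisons:

```python
def naive_suffix_array(s):
    """Reference construction: sort suffix start positions by the suffixes
    themselves. SA-IS computes the identical array in linear time by
    sorting LMS-substrings and inducing the order of the remaining suffixes."""
    return sorted(range(len(s)), key=lambda i: s[i:])

# suffixes of "banana": a, ana, anana, banana, na, nana
sa = naive_suffix_array("banana")
```

(SA-IS, like most linear-time constructions, conceptually appends a sentinel character smaller than all others; the naive version above does not need one.)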
The RKLT is a lossless approximation to the KLT, and has been recently employed for progressive lossy-to-lossless coding of hyperspectral images. Both yield very good coding performance results, but at a high computational price. In this paper we investigate two RKLT clustering approaches to lessen the computational complexity problem: a normal clustering approach, which still yields good performance;...
Motivated by the paradigm of event-based monitoring, which can potentially alleviate the inherent bandwidth and energy constraints associated with wireless sensor networks, we consider the problem of joint coding of correlated sources under a cost criterion that is appropriately conditioned on event occurrences. The underlying premise is that individual sensors only have access to partial information...
Information content and compression are tightly related concepts that can be addressed by classical and algorithmic information theory. Several entities in the latter have been defined relying upon notions from the former, such as entropy and mutual information, since the basic concepts of the two approaches share many common traits. In this work we further expand this parallelism by defining the...
A formal result is stated and proved showing that the bit stream produced by the encoder of a nearly optimal sliding-block source coding of a stationary and ergodic source is close to an equiprobable i.i.d. binary process.
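The phenomenon can be illustrated empirically with a general-purpose compressor standing in for the nearly optimal encoder (an assumption; zlib is not a sliding-block code): the output of a good compressor is close to a fair-coin i.i.d. bit stream, so its 1-bit frequency is near one half:

```python
import random
import zlib

# a compressible i.i.d. source over a 4-letter alphabet (~2 bits/symbol)
random.seed(0)
data = bytes(random.choice(b"abcd") for _ in range(100_000))

compressed = zlib.compress(data, 9)

# fraction of 1 bits in the compressed stream
ones = sum(bin(byte).count("1") for byte in compressed)
frac = ones / (8 * len(compressed))
```

Any persistent bias away from 0.5 in the output bits would itself be compressible redundancy, contradicting near-optimality of the encoder.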