Summary form only given: All existing codec designs for asymmetric Slepian-Wolf coding (SWC) assume that the source sequence is i.i.d. and equiprobable. For more complex source statistics, the encoder must first remove the redundancy within the source, which increases its complexity. In this paper, we propose an asymmetric SWC scheme which explores...
We propose an improvement to VQ by applying data- and dimensionality-reduction techniques to the original dataset and using the different resolutions generated by these techniques as input to the GLA with splitting at different stages of codebook generation. The different resolutions are pre-computed (a one-time cost) and are used instead of the original dataset in each iteration of the GLA. We...
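For context, the GLA with splitting referenced above is the standard codebook-growing procedure. A minimal sketch in plain NumPy (illustrative parameter names; this shows the baseline algorithm, not the paper's multi-resolution variant):

```python
import numpy as np

def gla(data, codebook, iters=10):
    """Generalized Lloyd Algorithm: alternate nearest-codeword
    assignment and centroid update."""
    for _ in range(iters):
        # assign each training vector to its nearest codeword
        d = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # move each codeword to the centroid of its cell
        for k in range(len(codebook)):
            members = data[labels == k]
            if len(members):
                codebook[k] = members.mean(0)
    return codebook

def gla_with_splitting(data, target_size, eps=1e-3):
    """Grow the codebook 1 -> 2 -> 4 -> ... by perturbed splitting,
    refining with GLA after each split."""
    codebook = data.mean(0, keepdims=True)
    while len(codebook) < target_size:
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        codebook = gla(data, codebook)
    return codebook
```

The paper's idea, as described, is to feed `gla` a pre-computed coarser resolution of `data` at the early splitting stages instead of the full dataset.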
Summary form only given: Dithered quantization has useful properties, such as producing quantization noise independent of the source and continuous reconstruction at the decoder side. Dithered quantizers have traditionally been considered within their natural setting, the uniform quantization framework. A uniformly distributed (with step size matched to the quantization interval) dither signal is added...
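The two properties mentioned, error independent of the source and uniform over one step, follow from subtractive dithering with a dither shared by encoder and decoder. A minimal sketch (assumed uniform-quantizer setting, as in the abstract):

```python
import numpy as np

def dithered_quantize(x, step, rng):
    """Subtractive dithered uniform quantization: add a dither drawn
    uniformly over one quantization interval, quantize, then subtract
    the same dither at the decoder. The resulting error is uniform on
    [-step/2, step/2] and statistically independent of the source x."""
    dither = rng.uniform(-step / 2, step / 2, size=np.shape(x))
    q = step * np.round((x + dither) / step)  # transmitted codeword
    return q - dither                          # decoder output
```

The shared random generator stands in for the pseudo-random dither sequence that both ends would derive from a common seed.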
The spread-spectrum image steganography (SSIS) method offers high payload and robustness to additive noise in the transmission channel, but it distorts the visual quality of the image and may not guarantee exact data recovery. DCT-domain message-hiding steganographic techniques provide high image imperceptibility and exact data recovery in the absence of noise. In this paper, we combine the best of SSIS and...
Distortion estimation techniques are often employed in bitplane coding engines to minimize the computational load, or the memory requirements, of the encoder. A common approach is to determine distortion estimators that approximate the decrease in mean squared error as data are successively coded and transmitted. Such estimators usually assume that coefficients are uniformly distributed within the quantization...
In this paper, we propose a new compression/decompression algorithm called LZB, which belongs to the class of algorithms related to Lempel-Ziv (LZ). The distinguishing characteristic of LZB is that it allows decompression from arbitrary points in the compressed data: decompressing a portion of the data does not require decompressing from the beginning. This is accomplished by setting a limit on...
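The abstract is truncated, but the general mechanism it gestures at, capping how far back a match may reach so that decoding becomes local, can be illustrated with a toy LZ77 variant (illustrative only; this is not the actual LZB format):

```python
def lz_compress(data, window=16):
    """Toy LZ77 with a bounded back-reference window: every match
    refers at most `window` bytes back, so decoding any position
    depends only on a bounded amount of preceding output."""
    out, i = [], 0
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            l = 0
            while i + l < len(data) and data[j + l] == data[i + l]:
                l += 1
                if j + l >= i:  # keep the match non-overlapping
                    break
            if l > best_len:
                best_len, best_off = l, i - j
        if best_len >= 3:
            out.append(('M', best_off, best_len))  # (match, offset, length)
            i += best_len
        else:
            out.append(('L', data[i]))             # literal byte
            i += 1
    return out

def lz_decompress(tokens):
    buf = bytearray()
    for t in tokens:
        if t[0] == 'L':
            buf.append(t[1])
        else:
            _, off, length = t
            for _ in range(length):
                buf.append(buf[-off])
    return bytes(buf)
```

With the window bounded, a decoder that also stores periodic restart points only ever needs a small suffix of previously decoded data, which is the property random-access schemes exploit.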
A new method to compress electrocardiogram (ECG) signals, whose novelty lies in the choice of an appropriate representation basis for each ECG, is proposed in this paper. The proposed ZPSVD method is compared to a previously presented ECG compression method based on the application of the SVD, which achieves good results. The experiments have shown that the ZPSVD method yields better compression...
In this paper we propose a novel out-of-core technique for progressive lossless compression and selective decompression of 3D triangle meshes larger than main memory. Most existing compression methods, in order to optimize compression ratios, only allow sequential decompression. We develop an integrated approach that resolves the issue of so-called prefix dependency to support selective decompression,...
In this paper we study how to encode N-long vectors, with N in the range of hundreds, at low bit rates of 0.5 bit per sample or lower. We adopt a vector quantization structure in which an overall gain is encoded with a scalar quantizer and the remaining scaled vector is encoded with a vector quantizer built by combining smaller (length-L) binary codes known to be efficient in filling the space,...
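The gain/shape split described, a scalar quantizer for the norm and a vector quantizer for the normalized vector, can be sketched as follows (toy codebooks for illustration; the paper's structured binary codes are not reproduced here):

```python
import numpy as np

def gain_shape_encode(x, gain_levels, shape_codebook):
    """Gain-shape VQ: scalar-quantize the overall gain (Euclidean
    norm), then vector-quantize the unit-norm shape."""
    gain = np.linalg.norm(x)
    gi = int(np.abs(gain_levels - gain).argmin())       # scalar quantizer
    shape = x / gain if gain > 0 else x
    si = int(((shape_codebook - shape) ** 2).sum(1).argmin())  # shape VQ
    return gi, si

def gain_shape_decode(gi, si, gain_levels, shape_codebook):
    """Reconstruct as quantized gain times quantized shape."""
    return gain_levels[gi] * shape_codebook[si]
```

Separating gain from shape lets the shape codebook live on the unit sphere, which is what makes space-filling binary codes of small length L attractive building blocks at such low rates.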
In 3D video applications, the virtual view is generally rendered by the compressed texture and depth. The texture and depth compression with different bit-rate overheads can lead to different virtual view rendering qualities. In this paper, we analyze the compression-induced rendering distortion for the virtual view. Based on the 3D warping principle, we first address how the texture and depth compression...
Multiple description (MD) source coding is a method to overcome unexpected information loss in a diversity system such as the Internet or a wireless network. While classic MD coding handles the situation where the rate in some channels drops temporarily to zero, causing unexpected packet loss, it fails to accommodate more subtle changes in link rate, such as rate reduction. In such a case, a...
This paper describes and evaluates pFPC, a parallel implementation of the lossless FPC compression algorithm for 64-bit floating-point data. pFPC can trade off compression ratio for throughput. For example, on a 4-core 3 GHz Xeon system, it compresses our nine datasets by 18% at a throughput of 1.36 gigabytes per second and by 41% at a throughput of 570 megabytes per second. Decompression is even...
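pFPC's ratio-versus-throughput trade-off comes from splitting the stream into independently compressed chunks across threads: smaller chunks mean more parallelism but less history per chunk. A generic sketch of that idea, using zlib as a stand-in compressor (FPC itself is a predictor-based floating-point coder and is not reproduced here):

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def parallel_compress(data: bytes, chunk_size: int, workers: int = 4):
    """Split the input into independent chunks and compress them
    concurrently. Shrinking chunk_size raises throughput but lowers
    the compression ratio, since each chunk starts with no history."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(zlib.compress, chunks))

def parallel_decompress(blocks):
    """Decompress and reassemble the independent chunks in order."""
    return b"".join(zlib.decompress(b) for b in blocks)
```

Threads suffice here because zlib releases the GIL during compression; a process pool would serve the same purpose for pure-Python coders.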
This paper proposes to use a bipartite graph to represent compressive sensing (CS). The evolution of nodes and edges in the bipartite graph, which is equivalent to the decoding process of compressive sensing, is characterized by a set of differential equations. One of the main contributions of this paper is that we derive the closed-form formulation of the evolution in statistics, which enables us to more...
Rapid development of DNA sequencing technologies exponentially increases the amount of publicly available genomic data. Whole genome multiple sequence alignments represent a particularly voluminous, frequently downloaded static dataset. In this work we propose an asymmetric source coding scheme for such alignments using evolutionary prediction in combination with lossless black and white image compression...
The quantitative underpinning of the information content of biosequences represents an elusive goal and yet also an obvious prerequisite to the quantitative modeling and study of biological function and evolution. Previous studies have consistently exposed a tenacious lack of compressibility on the part of biosequences. This leaves open the question of what distinguishes them from random strings,...
Volume visualization with random data access poses significant challenges. While tiling techniques lead to simple implementations, they are not well suited for cases where the goal is to access arbitrarily located subdimensional datasets (e.g., being able to display an arbitrary 2D planar "cut" from a 3D volume). Significant effort has been devoted to volumetric data compression, with most...
The integration of arithmetic codes (AC) in the most recent coding standards for media applications motivated the development of joint source/channel (JSC) techniques for AC-encoded data. In this paper, we propose a low-complexity scheme which enables iterative decoding for serially concatenated AC and convolutional codes. Iterations are performed between soft input soft output (SISO) component decoders.
In this work, we attempt to capitalize more fully on the source's residual redundancy and develop an MD-ISCD scheme that permits its two constituent decoders to exchange complete symbol extrinsic information. The first step toward realization is to derive a modified BCJR algorithm, based on sectionalized code trellises, that provides reliability information on each transmitted symbol...