This paper presents a GPU-based implementation for constructing edge-preserving multiscale image decompositions. An input image is decomposed into a piecewise-smooth base layer and multiple detail layers: the base layer captures large-scale variations in the image, while the detail layers contain the small-scale details. The detail layers are obtained progressively through edge-preserving weighted least squares (WLS) optimizations. Performance is improved by introducing a Jacobi-like GPU solver that converges to the correct solution much faster than the standard Jacobi iteration. The whole pipeline is highly parallel, enabling a real-time implementation. Experiments on edge-preserving tonal adjustment and image abstraction demonstrate the feasibility of the proposed method.
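To make the underlying machinery concrete, the sketch below shows edge-preserving WLS smoothing of a 1D signal solved with plain Jacobi iteration, the baseline the paper's accelerated solver improves on. This is a minimal illustration, not the paper's GPU solver: the function name `wls_smooth_1d` and the parameters `lam`, `alpha`, `eps`, and `iters` are assumptions chosen for this example, and the weight formula is one common choice for WLS-style smoothing.

```python
import numpy as np

def wls_smooth_1d(g, lam=0.1, alpha=1.2, eps=1e-4, iters=300):
    """Edge-preserving WLS smoothing of a 1D signal (illustrative sketch).

    Minimizes  sum_i (u_i - g_i)^2 + lam * sum_i w_i (u_{i+1} - u_i)^2,
    where the smoothness weights w_i are small across large gradients of g,
    so strong edges are preserved while flat regions are smoothed.
    The normal equations (I + lam * D^T W D) u = g are solved with
    standard Jacobi iteration; the matrix is strictly diagonally
    dominant, so the iteration converges (if slowly for large weights).
    """
    g = np.asarray(g, dtype=float)
    n = len(g)
    grad = np.abs(np.diff(g))
    w = 1.0 / (grad ** alpha + eps)      # n-1 edge weights (assumed form)
    # Diagonal of A = I + lam * D^T W D, and its sub/super-diagonal:
    diag = np.ones(n)
    diag[:-1] += lam * w
    diag[1:] += lam * w
    off = -lam * w
    u = g.copy()
    for _ in range(iters):
        # Jacobi update: u_i <- (g_i - sum_{j != i} A_ij u_j) / A_ii
        rhs = g.copy()
        rhs[:-1] -= off * u[1:]
        rhs[1:] -= off * u[:-1]
        u = rhs / diag
    return u
```

Each Jacobi update touches only a point's immediate neighbors, which is what makes the iteration map naturally onto one GPU thread per pixel; the paper's contribution is a Jacobi-like variant that reaches the solution in far fewer such sweeps.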