Randomized (dithered) quantization is a method capable of achieving white reconstruction error independent of the source. Dithered quantizers have traditionally been considered within their natural setting of uniform quantization. In this paper we extend conventional dithered quantization to nonuniform quantization via a subterfuge: dithering is performed in the companded domain. Closed-form necessary conditions for optimality of the compressor and expander mappings are derived for both fixed-rate and variable-rate randomized quantization. Numerically, the mappings are optimized by iteratively imposing these necessary conditions. The resulting quantizer renders the reconstruction error white, with negligible performance loss compared to the optimal quantizer. The framework is then extended to include an explicit constraint that deterministic or randomized quantizers yield reconstruction error uncorrelated with the source. Surprisingly, the theoretical results show a direct and simple connection between the optimal constrained quantizers and their unconstrained counterparts. Numerical results for the Gaussian source provide strong evidence that the proposed constrained randomized quantizer outperforms both the conventional dithered quantizer and the constrained deterministic quantizer.
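To make the companded-domain construction concrete, the following is a minimal sketch (not the paper's optimized mappings): the source sample is passed through a compressor, subtractive uniform dither is added before uniform quantization, the dither is subtracted, and the result is passed through the expander. The mu-law compressor/expander pair used here is an illustrative, hypothetical choice standing in for the optimized mappings derived in the paper.

```python
import numpy as np

def companded_dithered_quantize(x, compressor, expander, step, rng):
    """Subtractive dithered uniform quantization applied in the companded domain."""
    dither = rng.uniform(-step / 2, step / 2, size=np.shape(x))
    y = compressor(x) + dither          # compand, then add dither
    q = step * np.round(y / step)       # uniform quantizer with cell width `step`
    return expander(q - dither)         # subtract dither, then expand

# Illustrative mu-law compressor/expander pair on [-1, 1] (hypothetical choice).
mu = 255.0
compress = lambda x: np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)
expand = lambda y: np.sign(y) * ((1.0 + mu) ** np.abs(y) - 1.0) / mu

rng = np.random.default_rng(0)
x = np.clip(rng.standard_normal(10_000) * 0.2, -1.0, 1.0)
x_hat = companded_dithered_quantize(x, compress, expand, step=0.05, rng=rng)
```

With subtractive dither, the error in the companded domain is uniform over one quantization cell and independent of the input; after expansion the error magnitude scales with the local expander slope, which is what the optimized compressor/expander conditions in the paper account for.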