Consider the estimation of a signal \({\mathbf {x}}\in \mathbb {R}^{N}\) from noisy observations \({\mathbf {r}}={\mathbf {x}}+{\mathbf {z}}\), where the input \({\mathbf {x}}\) is generated by an independent and identically distributed (i.i.d.) Gaussian mixture source, and \({\mathbf {z}}\) is additive white Gaussian noise in parallel Gaussian channels. Typically, the \(\ell _{2}\)-norm error (squared error) is used to quantify the performance of the estimation process. In contrast, we consider the \(\ell _\infty \)-norm error (worst-case error). For this error metric, we prove that, in an asymptotic setting where the signal dimension \(N\to \infty \), the \(\ell _\infty \)-norm error always comes from the Gaussian component with the largest variance, and the Wiener filter asymptotically achieves the optimal expected \(\ell _\infty \)-norm error. The i.i.d. Gaussian mixture case can be extended to i.i.d. Bernoulli-Gaussian distributions, which are often used to model sparse signals. Finally, our results extend to linear mixing systems with i.i.d. Gaussian mixture inputs, in settings where the linear mixing system can be decoupled into parallel Gaussian channels.
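The setting above can be illustrated numerically. The following sketch (not from the paper; mixture weights, component variances, and the noise variance are all assumed values chosen for illustration) draws an i.i.d. Gaussian mixture signal, observes it through parallel Gaussian channels, applies a Wiener filter tuned to the largest-variance mixture component, and reports the resulting \(\ell _\infty \)-norm error:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100_000                      # signal dimension (illustrative)
weights = np.array([0.7, 0.3])   # mixture weights (assumed values)
variances = np.array([0.5, 2.0]) # component variances (assumed values)
sigma_z2 = 1.0                   # noise variance (assumed value)

# Draw an i.i.d. Gaussian mixture signal x and the noisy observation r = x + z.
comp = rng.choice(len(weights), size=N, p=weights)
x = rng.normal(0.0, np.sqrt(variances[comp]))
r = x + rng.normal(0.0, np.sqrt(sigma_z2), size=N)

# Wiener (linear MMSE) filter tuned to the largest-variance component:
# the abstract states that, asymptotically, the worst-case error is
# driven by this component.
s2 = variances.max()
x_hat = (s2 / (s2 + sigma_z2)) * r

# Worst-case (ell_infinity) estimation error over all N coordinates.
linf_error = np.max(np.abs(x_hat - x))
print(f"ell_infinity error: {linf_error:.3f}")
```

For the largest-variance component, the coordinate-wise error behaves like a zero-mean Gaussian, so the maximum over \(N\) coordinates grows on the order of \(\sqrt{2\log N}\) times its standard deviation; rerunning with larger \(N\) shows this slow growth.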