Many financial applications, such as risk analysis and derivatives pricing, depend on time scaling of risk. A common method for this purpose is the square-root-of-time rule, whereby an estimated quantile of a return distribution is scaled to a lower frequency by the square root of the time horizon. This paper examines the time scaling of quantiles when returns follow a jump-diffusion process. We demonstrate that when jumps represent losses, the square-root-of-time rule leads to a systematic underestimation of risk, with the degree of underestimation worsening with the time horizon, the jump intensity and the confidence level.
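
As a rough illustration of this claim, the following Monte Carlo sketch compares the square-root-of-time extrapolation of a one-day quantile with the quantile of the simulated multi-day return distribution under a simplified jump diffusion (Gaussian diffusion plus Poisson-arriving loss jumps of fixed size). This is not the paper's model or calibration; all parameter values (`mu`, `sigma`, `lam`, `jump`, `p`, `horizon`) are illustrative assumptions.

```python
import numpy as np

# Illustrative parameters only -- assumptions for this sketch, not the paper's calibration.
mu, sigma = 0.0, 0.01      # daily drift and diffusion volatility
lam = 0.002                # jump intensity (expected jumps per day)
jump = -0.10               # jump size: a fixed loss
p = 0.01                   # quantile (confidence) level, e.g. 1% VaR
horizon = 10               # scaling horizon in days
n = 1_000_000              # Monte Carlo sample size

rng = np.random.default_rng(0)

def simulate_returns(days: int) -> np.ndarray:
    """Cumulative return over `days` days: Gaussian diffusion plus Poisson loss jumps."""
    diffusion = rng.normal(mu * days, sigma * np.sqrt(days), size=n)
    jumps = jump * rng.poisson(lam * days, size=n)
    return diffusion + jumps

q_one_day = np.quantile(simulate_returns(1), p)      # one-day p-quantile
scaled = np.sqrt(horizon) * q_one_day                # square-root-of-time extrapolation
true_q = np.quantile(simulate_returns(horizon), p)   # p-quantile of the h-day distribution

print(f"sqrt-of-time estimate of the {horizon}-day quantile: {scaled:.4f}")
print(f"simulated {horizon}-day quantile:                    {true_q:.4f}")
# With loss jumps that are rare at the daily horizon (lam * 1 < p) but likely over the
# longer horizon (lam * horizon > p), the simulated horizon quantile is more negative
# than the scaled estimate, i.e. the square-root-of-time rule understates the loss.
```

In this setup the daily quantile is dominated by diffusion risk because a jump is too rare to enter the 1% tail of the one-day distribution, while over ten days the jump probability exceeds the quantile level, so the directly simulated quantile reflects the jump and the scaled estimate does not.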