A common simplifying assumption in wireless simulation and modeling is that the world is flat, i.e., that the terrain through which the wireless signal propagates can be ignored. In this paper, we use empirical measurements from an urban wireless network testbed to show how terrain affects the spatial and temporal correlation of the wireless signal, and in turn the distance or duration over which the signal remains consistent. We further argue that this effect has practical implications for systems that assume wireless signal quality stays roughly constant over some duration, such as adaptive transmission schemes or applications that buffer data to smooth over variations in signal quality.