Motion estimation is a major computational task in real-time vision circuits and artificial retinas, which require energy-efficient, high-speed, and microminiaturized circuitry. Traditionally, motion estimation is performed by velocity-tuned filters (VTFs), a class of spatiotemporal signal-processing circuits. However, conventional VTFs are limited in area, power, and speed for real-time motion computation because they rely on bulky and slow analog circuitry. In this paper, we propose a nanoscale VTF that employs quantum-dot arrays to perform temporal filtering and track both moving and stationary objects. The new velocity-tuned filter is not only amenable to nanocomputing but also superior to existing VTFs in area, power, and speed. We also show that the proposed VTF nanoarchitecture is asymptotically stable in the region where f'(S_{n,m}) > 0.
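To make the notion of velocity tuning concrete, the following is a minimal software sketch (an illustration, not the paper's quantum-dot implementation) of a 1-D velocity-tuned filter built as a quadrature pair of drifting spatiotemporal gratings; the squared outputs are summed to give a phase-invariant energy response that peaks when the stimulus velocity matches the filter's tuned velocity. All function names and parameter values here are hypothetical.

```python
import numpy as np

def moving_stimulus(v, k=0.5, nx=64, nt=64):
    # 1-D sinusoidal grating drifting at velocity v (pixels per frame).
    x = np.arange(nx)
    t = np.arange(nt)[:, None]
    return np.sin(k * (x - v * t))

def vtf_response(stimulus, v_tuned, k=0.5):
    # Quadrature pair of spatiotemporal filters tuned to velocity v_tuned;
    # summing the squared inner products gives a phase-invariant energy.
    nt, nx = stimulus.shape
    x = np.arange(nx)
    t = np.arange(nt)[:, None]
    h_even = np.cos(k * (x - v_tuned * t))
    h_odd = np.sin(k * (x - v_tuned * t))
    return np.sum(stimulus * h_even) ** 2 + np.sum(stimulus * h_odd) ** 2

# A stimulus moving at v = 1.0 excites the v = 1.0 filter most strongly.
s = moving_stimulus(v=1.0)
responses = {v: vtf_response(s, v) for v in (-1.0, 0.0, 1.0, 2.0)}
best = max(responses, key=responses.get)  # -> 1.0
```

A bank of such filters, each tuned to a different velocity, yields a motion estimate by reading out which filter responds most strongly; the hardware VTF proposed in the paper realizes the temporal part of this filtering with quantum-dot dynamics rather than software correlation.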