This paper presents a sensor fusion framework that combines a monocular vision system, a laser range finder, and an inertial navigation system (INS) for localization of mobile robots and manipulators. The proposed method is particularly useful in applications where a feature-rich wall or floor lies in proximity to the robot; examples include unmanned ground vehicles (UGVs), unmanned aerial vehicles (UAVs), and field inspection robots. In essence, the method fuses image mosaicing and dead reckoning: the sequential images of a digital camera are stitched together with the help of the INS. In the proposed scheme, the processing load of image mosaicing is reduced significantly while, at the same time, the accumulation of INS error is prevented. The localization method was developed as a standalone module and tested using a hardware-in-the-loop simulator described in this paper. This module will eventually be used for autonomous navigation of a pipe inspection robot that carries nondestructive testing (NDT) sensors and visual inspection instruments into large water mains.
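The complementary roles described above (the INS bounds the registration search, the camera removes INS drift) can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes pure in-plane translation, uses phase correlation as a stand-in for the mosaicing registration step, and uses a hypothetical gating rule (`fuse_step`) to decide when to trust the vision measurement over the dead-reckoned delta.

```python
import numpy as np

def estimate_shift(prev_img, curr_img):
    """Estimate the integer (dy, dx) translation between two frames by
    phase correlation -- a stand-in for the mosaicing registration step."""
    cross = np.fft.fft2(curr_img) * np.conj(np.fft.fft2(prev_img))
    cross /= np.abs(cross) + 1e-12           # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    # map wrap-around peak indices to signed shifts
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)

def fuse_step(pose, ins_delta, vision_delta, gate=3.0):
    """Dead-reckon with the INS delta, but replace it with the drift-free
    vision delta whenever the two agree (hypothetical gating threshold)."""
    ins_delta = np.asarray(ins_delta, float)
    vision_delta = np.asarray(vision_delta, float)
    if np.linalg.norm(ins_delta - vision_delta) < gate:
        return pose + vision_delta           # camera confirms the motion
    return pose + ins_delta                  # fall back on dead reckoning

# Synthetic demo: a textured surface viewed by a moving camera.
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
pose_fused = np.zeros(2)                     # fused estimate
pose_ins = np.zeros(2)                       # INS-only dead reckoning
truth = np.zeros(2)
for true_step in [(3, 1), (2, -2), (0, 4)]:
    nxt = np.roll(frame, true_step, axis=(0, 1))        # simulated motion
    ins_step = np.array(true_step) + rng.normal(0, 0.5, 2)  # noisy INS delta
    vis_step = estimate_shift(frame, nxt)
    pose_fused = fuse_step(pose_fused, ins_step, vis_step)
    pose_ins += ins_step                     # noise accumulates unchecked
    truth += true_step
    frame = nxt
print(pose_fused, pose_ins, truth)
```

Running the loop, the INS-only pose drifts away from the true pose as its per-step noise integrates, while the fused pose stays locked to the vision-corrected trajectory; conversely, the INS prediction would let a real mosaicing step search only a small window around the expected offset, which is the source of the processing-load reduction claimed above.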