Ground target tracking using electro-optical and infrared video sensors onboard unmanned aerial vehicles has drawn a great deal of interest in recent years due to the evolution of inexpensive video sensors and platforms. We present algorithms for geolocation using pixel location measurements, based on the perspective transformation and including radial and tangential lens distortions. The covariance of the geolocation error accounts for errors in pixel location, intrinsic and extrinsic camera parameters, and terrain height. The intrinsic camera parameters comprise the pixel coordinates of the optical center, the focal distances along the X and Y axes, the lens distortion parameters, and the skew parameter. The extrinsic camera parameters comprise the sensor position and the sensor attitude relative to a local east-north-up coordinate frame. Numerical results are presented using simulated data. Our results show that the errors in the Euler angles used to represent the sensor attitude are the dominant source of geolocation error.
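The measurement model described above can be illustrated with a minimal sketch of a pinhole perspective projection with radial and tangential lens distortion (the standard Brown-Conrady form). This is not the paper's exact formulation; the function and parameter names (`k1`, `k2`, `p1`, `p2`, etc.) are illustrative and assume the common two-term radial, two-term tangential distortion model.

```python
def project_point(p_cam, fx, fy, cx, cy, skew=0.0,
                  k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    """Map a 3-D point in the camera frame to pixel coordinates.

    Intrinsics: focal distances (fx, fy), optical center (cx, cy),
    skew, and Brown-Conrady distortion parameters (k1, k2, p1, p2).
    The point is assumed to lie in front of the camera (Z > 0).
    """
    X, Y, Z = p_cam
    x, y = X / Z, Y / Z                      # normalized image coordinates
    r2 = x * x + y * y                       # squared radial distance
    radial = 1.0 + k1 * r2 + k2 * r2 * r2    # radial distortion factor
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    u = fx * xd + skew * yd + cx             # pixel column
    v = fy * yd + cy                         # pixel row
    return u, v


# Usage: an undistorted, zero-skew camera with fx = fy = 1000 px and
# optical center (500, 500) maps the point (0.1, 0, 1) m to (600, 500).
u, v = project_point((0.1, 0.0, 1.0), 1000.0, 1000.0, 500.0, 500.0)
```

Geolocation inverts this mapping: given measured pixel coordinates, the sensor pose, and the terrain height, the ray through the pixel is intersected with the terrain, so errors in any of these inputs propagate into the geolocation estimate.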