The average ridge distance of fingerprint images is used in many applications, such as fingerprint filter design and identification and classification procedures. This paper addresses the problem of computing the local average ridge distance. The computation is based on a two-step procedure: first, the average distance is estimated in each significant region of the image; then this information is propagated to the remaining regions to complete the map. Two methods are considered for the first step: a geometric method and a spectral method. In the geometric approach, the central points of the ridges are estimated on a regular grid, and straight lines passing through these points, parallel to the ridge directions, are used to measure the distance. The spectral method computes harmonic coefficients that yield effective estimates of the average ridge period. To complete the average distance map, a diffusion equation is used so that maps with minimal variation are favored. Finally, experimental results on the NIST SDB4 database are reported.
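The spectral step described above can be illustrated with a minimal sketch: the dominant ridge frequency of a local intensity profile is taken from its Fourier spectrum, and its reciprocal gives the ridge period. The function name, the 1-D simplification, and the synthetic cosine profile below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def spectral_ridge_period(window):
    """Estimate the dominant ridge period (in pixels) of a 1-D
    intensity profile from the peak of its Fourier spectrum."""
    signal = window - window.mean()           # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size)
    k = np.argmax(spectrum[1:]) + 1           # skip the zero-frequency bin
    return 1.0 / freqs[k]

# Synthetic profile: ridges repeating every 8 pixels.
x = np.arange(256)
profile = np.cos(2 * np.pi * x / 8.0)
print(spectral_ridge_period(profile))  # close to 8.0
```

A real fingerprint window would be projected along the local ridge orientation before this 1-D analysis, and the estimate would only be trusted in regions where the spectral peak is sufficiently pronounced.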