Image inpainting is an active research field in image processing. Previous inpainting methods often require long computation times to produce acceptable results, largely due to the extensive search process of exemplar-based approaches. This work improves on a previous fast inpainting method based on local similarity, which achieves runtimes measured in tens of milliseconds per image but often produces unacceptable artifacts. We improve the resulting image quality by allowing pixels to be filled at any angle, determining that angle solely from the vicinity of the target region, and cross-fading between source pixels from opposite sides of the target region. The proposed method is shown to eliminate these artifacts while retaining a fast runtime.
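The cross-fading step mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the 1-D setting, and the linear weighting are assumptions made for illustration: each filled pixel is a blend of the two source pixels on opposite sides of the hole, weighted by its normalized position across the target region.

```python
import numpy as np

def cross_fade(src_a, src_b, t):
    # Illustrative helper (not from the paper): blend two source pixels
    # taken from opposite sides of the target region.
    # t in [0, 1] is the normalized position across the hole:
    # t = 0 sits at side A, t = 1 at side B.
    return (1.0 - t) * src_a + t * src_b

# 1-D toy example: fill a 5-pixel gap between boundary intensities 10 and 250.
positions = np.linspace(0.0, 1.0, 7)[1:-1]  # interior positions only
gap = np.array([cross_fade(10.0, 250.0, t) for t in positions])
# The filled values ramp smoothly from one boundary to the other.
```

In 2-D, the same idea applies along the chosen fill angle: the weight would be derived from the pixel's distance to each boundary along that direction, so the seam where the two fill fronts meet is smoothed rather than abrupt.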