The matching of oriented local image feature descriptors such as SIFT, SURF, or ORB often includes refining and filtering matches based on the relative orientation of the features. This is important because the computational cost of subsequent tasks such as camera pose estimation or object detection increases dramatically with the number of outliers. Simple 2D orientation descriptions are unsuitable for omnidirectional images because of image distortions and the non-monotonic mapping from camera rotations to image rotations. In this work we introduce 3D orientation descriptors which, unlike 2D descriptors, are suitable for match refinement on omnidirectional images and improve matching results on images from cameras and camera rigs with a wide field of view. We evaluate different match refinement strategies based on 2D and 3D orientations and show the fundamental advantages of our approach.
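To make the baseline concrete, the following is a minimal sketch of the conventional 2D relative-orientation filtering referred to above (the strategy that breaks down on omnidirectional images), not the paper's 3D method. It assumes each match is given by the keypoint orientation angle (in degrees) in each image, as provided by detectors such as SIFT or ORB; the function name and bin parameters are illustrative.

```python
import numpy as np

def filter_matches_by_relative_orientation(angles_a, angles_b,
                                           num_bins=36, tol_bins=1):
    """Keep only matches whose relative 2D orientation lies near the
    dominant orientation difference (a rotation-consistency vote).
    angles_a, angles_b: keypoint orientations in degrees for each match."""
    diff = (np.asarray(angles_b, dtype=float)
            - np.asarray(angles_a, dtype=float)) % 360.0
    # vote each match into an orientation-difference histogram
    bins = (diff / (360.0 / num_bins)).astype(int) % num_bins
    hist = np.bincount(bins, minlength=num_bins)
    peak = int(np.argmax(hist))
    # accept matches within +/- tol_bins of the peak (circular distance)
    dist = np.minimum((bins - peak) % num_bins, (peak - bins) % num_bins)
    return dist <= tol_bins

# toy example: five consistent matches rotated by roughly 30 degrees,
# plus two outliers with unrelated orientation differences
a = [10, 50, 90, 130, 170, 20, 60]
b = [40, 81, 119, 160, 201, 200, 310]
mask = filter_matches_by_relative_orientation(a, b)
# mask marks the first five matches as inliers, the last two as outliers
```

On a perspective image a single camera roll maps to one global image rotation, so one histogram peak captures the inliers; on omnidirectional images that mapping is non-monotonic, which is exactly why the paper moves to 3D orientation descriptors.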