Object tracking in conventional rectangular videos has been addressed by many researchers in recent years, with each method targeting a specific challenge. Handling the variety of challenging situations that arise in 360-degree videos, however, remains an open problem and deserves further attention. In real-world settings, these challenges include camera motion, high-resolution video, background clutter, the need for fast processing, in-plane and out-of-plane rotation, small, articulated, and rigid objects, illumination variation, and partial and full occlusion. In this paper, a new SURF-based tracking structure is proposed that uses train-based matching to address these challenges in 360-degree videos. The proposed tracker estimates out-of-plane rotation and occlusion during tracking and adapts itself to handle them. Our experiments demonstrate the robustness of the proposed tracker.