Distributed tracking in wireless smart-camera networks is affected by varying local processing delays, which generally depend on the current scene complexity. As a consequence, each camera makes target information available to the network at a different time instant. These unknown delays compound the drifts caused by local clocks and may induce tracking failures when target information is fused. To address this problem, we propose a distributed batch asynchronous tracker for fully connected wireless smart-camera networks. Each camera uses an information filter to estimate the target state and to predict the corresponding information of the other cameras from the asynchronous data received from them. Finally, the temporally aligned information is fused. We show that the proposed approach achieves higher tracking accuracy than the state of the art under varying degrees of asynchronism.
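The temporal-alignment-then-fusion idea above can be sketched in information form: each camera's delayed measurement becomes an information contribution, which is predicted forward to the common fusion instant and then fused by information addition. A minimal sketch in Python/NumPy, assuming a constant-velocity motion model and position-only measurements; the matrices, delays, and fusion-by-addition rule are illustrative and not the paper's actual algorithm:

```python
import numpy as np

def predict_info(y, Y, A, Qinv):
    """Information-filter time update: propagate the information pair
    (y, Y) one step through dynamics A with process-noise precision
    Qinv. Works even when Y is singular, e.g. a rank-deficient
    single-measurement contribution."""
    Ainv = np.linalg.inv(A)
    M = Ainv.T @ Y @ Ainv
    C = M @ np.linalg.inv(M + Qinv)
    return (np.eye(len(y)) - C) @ Ainv.T @ y, M - C @ M

def measurement_info(z, H, Rinv):
    """Information contribution of one camera's measurement z."""
    return H.T @ Rinv @ z, H.T @ Rinv @ H

# Illustrative constant-velocity model: state = [position, velocity].
dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])
Qinv = np.linalg.inv(0.01 * np.eye(2))
H = np.array([[1.0, 0.0]])            # cameras observe position only
Rinv = np.linalg.inv(np.array([[0.5]]))

# Prior information at the common fusion instant.
Y = np.eye(2)
y = Y @ np.array([0.0, 1.0])          # prior mean: position 0, velocity 1

# Two cameras report position measurements taken 1 and 2 steps earlier
# (asynchronous arrivals with different delays).
for z, delay in [(np.array([2.9]), 1), (np.array([1.8]), 2)]:
    yc, Yc = measurement_info(z, H, Rinv)
    for _ in range(delay):            # temporally align the contribution
        yc, Yc = predict_info(yc, Yc, A, Qinv)
    y, Y = y + yc, Y + Yc             # fuse the aligned information

x_fused = np.linalg.solve(Y, y)       # recover the fused state estimate
```

The information form is convenient here because delayed contributions combine by simple addition once aligned, and a camera's partial (rank-deficient) observation needs no matrix inversion before fusion.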