We present a Bayesian framework for absolute sensor calibration that naturally integrates scene maps as priors with observations of moving objects. In the proposed framework, we design a utility function that measures the statistical fitness of joint hypotheses for sensor positions and object trajectories. Because the object trajectories are unknown, such a utility function is defined over a space of varying dimensionality, which must be handled directly to exploit its discriminative properties for accurate position estimation. To this end, we design a hierarchical algorithm that, starting from an initial approximate sensor position, infers the most plausible paths followed by the observed moving objects through the scene. Then, a trans-dimensional Monte Carlo optimization is applied to iteratively improve the initially provided sensor estimate according to the MAP criterion. This paper instantiates the proposed general model for sensors providing relative 2-D object positions (distance and bearing), such as video cameras, sonar, and radar. The proposed model is applicable to different sensor types and can be used either as an alternative to active positioning methods or as a refinement step for other calibration approaches.
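To make the MAP criterion mentioned above concrete, the joint estimation can be sketched as follows; the notation here is illustrative (with $s$ a sensor-position hypothesis, $T$ a trajectory hypothesis of varying dimension, $Z$ the object observations, and $M$ the scene map), not the paper's own symbols:

\[
(\hat{s}, \hat{T}) \;=\; \arg\max_{s,\, T} \; p(s, T \mid Z, M)
\;\propto\; \arg\max_{s,\, T} \; p(Z \mid s, T)\; p(T \mid M)\; p(s),
\]

where the map-conditioned trajectory prior $p(T \mid M)$ plays the role of the scene-map prior, and the varying dimensionality of $T$ is what motivates a trans-dimensional Monte Carlo search over this posterior.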