Bayesian decentralized data fusion (DDF) is challenging to implement for problems modeled by arbitrarily complex non-Gaussian probability density functions (pdfs), especially those with hierarchical or hybrid uncertainties. Furthermore, in ad hoc communication topologies, ‘rumor-robust’ fusion approximations for handling unknown dependencies are often too conservative and lossy. This work exploits novel insights about Bayesian DDF to address these issues. First, it is shown that DDF naturally factorizes into semi-parallelizable conditional DDF updates, which leads to an efficient and generalizable hierarchical Bayesian partial information-sharing scheme for multi-agent networks. Second, it is shown that conditional factorizations can significantly extend the capabilities of conservative weighted exponential product (WEP) DDF approximations for ad hoc networks, enabling convex information-theoretic optimization of implicit conditional common information factors. Simulation results for a multi-robot target search mission show that the proposed methods lead to significant improvements in computation time and information gain over traditional monolithic DDF techniques.
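To make the WEP fusion rule concrete, the sketch below shows its standard Gaussian special case, covariance intersection: two agents' estimates are fused as p_f ∝ p_a^ω · p_b^(1−ω) in information form, with the weight ω chosen by an information-theoretic criterion (here, minimizing the determinant of the fused covariance). This is a minimal illustration of the generic rumor-robust WEP baseline, not the conditionally factorized scheme proposed in the paper; the function names and the simple grid search over ω are illustrative assumptions.

```python
import numpy as np

def wep_fuse(mu_a, P_a, mu_b, P_b, omega):
    """Weighted exponential product of two Gaussians, p_f ∝ p_a^ω · p_b^(1-ω).

    In information form this is covariance intersection: the fused
    information matrix is a convex combination of the agents' matrices,
    which is conservative under unknown inter-agent dependencies.
    """
    Ia, Ib = np.linalg.inv(P_a), np.linalg.inv(P_b)
    I_f = omega * Ia + (1.0 - omega) * Ib
    P_f = np.linalg.inv(I_f)
    mu_f = P_f @ (omega * Ia @ mu_a + (1.0 - omega) * Ib @ mu_b)
    return mu_f, P_f

def fuse_ci(mu_a, P_a, mu_b, P_b, n_grid=101):
    """Choose ω by minimizing det(P_f), one common information-theoretic
    cost (a convex problem in ω), via a simple grid search."""
    _, best_w = min(
        (np.linalg.det(wep_fuse(mu_a, P_a, mu_b, P_b, w)[1]), w)
        for w in np.linspace(0.0, 1.0, n_grid)
    )
    return wep_fuse(mu_a, P_a, mu_b, P_b, best_w)

# Two agents with complementary uncertainty in orthogonal directions.
mu_a, P_a = np.array([0.0, 0.0]), np.diag([1.0, 4.0])
mu_b, P_b = np.array([1.0, 1.0]), np.diag([4.0, 1.0])
mu_f, P_f = fuse_ci(mu_a, P_a, mu_b, P_b)
# By symmetry the optimizer selects ω = 0.5, giving P_f = diag(1.6, 1.6).
```

The conservatism is visible in the result: unlike a naive independent-information fusion, the fused covariance never claims more information than a convex combination of the inputs, which is exactly the property that makes WEP robust to rumor propagation in ad hoc topologies.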