Quadratic optimization problems arise in many estimation, learning and control tasks. Solving these problems in peer-to-peer networks requires distributed optimization algorithms that support directed, asynchronous and unreliable communication. This paper addresses this requirement by extending a promising distributed convex optimization algorithm, known as Newton-Raphson consensus, originally designed for static and undirected communication. Specifically, we modify this algorithm so that it can cope with asynchronous, broadcast and unreliable lossy links, and we prove that the resulting optimization strategy correctly converges to the global optimum when the local cost functions are quadratic. We then use dedicated numerical simulations to support the intuition that this robustified algorithm also converges to the true optimum for general convex problems.
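To illustrate the underlying idea, the following is a minimal sketch (not the paper's asynchronous, lossy-link algorithm) of a synchronous Newton-Raphson consensus iteration for quadratic local costs f_i(x) = 0.5*a_i*(x - c_i)^2 on an undirected ring; the network size, weights and cost parameters are illustrative assumptions. Each node runs average consensus on g_i = f_i''(x)x - f_i'(x) (which equals a_i*c_i for quadratics) and on h_i = f_i'', and estimates the optimum as the ratio of the two averages.

```python
import numpy as np

# Illustrative setup: 5 nodes, quadratic costs f_i(x) = 0.5*a_i*(x - c_i)^2.
rng = np.random.default_rng(0)
n = 5
a = rng.uniform(1.0, 2.0, size=n)   # curvatures f_i''(x) = a_i
c = rng.uniform(-1.0, 1.0, size=n)  # local minimizers
x_star = (a * c).sum() / a.sum()    # global minimizer of sum_i f_i

# Doubly stochastic averaging matrix on an undirected ring (uniform weights).
P = np.zeros((n, n))
for i in range(n):
    for j in (i, (i - 1) % n, (i + 1) % n):
        P[i, j] = 1.0 / 3.0

# Consensus on g_i = f_i''*x - f_i' (= a_i*c_i for quadratics) and h_i = f_i'';
# each node's estimate is the ratio y_i / z_i.
y = a * c
z = a.copy()
for _ in range(200):
    y = P @ y
    z = P @ z
x = y / z  # every node's estimate approaches x_star
```

For quadratic costs the Newton direction is exact, so this synchronous sketch reduces to plain average consensus on the pairs (a_i*c_i, a_i); the paper's contribution is making such a scheme robust to asynchronous, broadcast and lossy communication, which this simplified example does not model.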