Multicast join delay is an important metric for evaluating the performance of multicast services, yet it has received little analytic study. In this paper, we recast the multicast join delay as the average node-to-tree distance problem in a graph, and propose an area-overlay method to analyze the join delay under different topology sizes and multicast densities. We set up a simulation platform to evaluate the accuracy of the analytic result; the simulation results show that it represents the multicast join delay well at lower multicast densities and larger topology sizes. The analytic result can be used to determine the number of servers (such as multicast routers) in a network to achieve a good trade-off between delay and load.
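The reduction from join delay to average node-to-tree distance can be sketched as follows. This is an illustrative computation only, not the paper's analytic method: assuming unit-cost edges, a joining node's delay is proportional to its hop distance to the nearest node already on the multicast tree, and a multi-source BFS from all tree nodes yields every such distance at once. The graph, tree membership, and function name below are hypothetical.

```python
from collections import deque

def avg_node_to_tree_distance(adj, tree_nodes):
    """Average shortest-path distance (in hops) from each non-tree node
    to the nearest node of the multicast tree, computed with a
    multi-source BFS seeded by all tree nodes simultaneously."""
    dist = {v: 0 for v in tree_nodes}   # tree nodes are at distance 0
    queue = deque(tree_nodes)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:           # first visit gives shortest hop count
                dist[v] = dist[u] + 1
                queue.append(v)
    non_tree = [v for v in adj if v not in tree_nodes]
    if not non_tree:
        return 0.0
    return sum(dist[v] for v in non_tree) / len(non_tree)

# Example: a path topology 0-1-2-3-4 with the tree covering nodes {0, 1}.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(avg_node_to_tree_distance(adj, {0, 1}))  # distances 1, 2, 3 -> 2.0
```

Averaging these distances over all non-tree nodes gives the quantity the abstract refers to; the paper's area-overlay analysis approximates it in closed form rather than by graph traversal.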