In this paper, the problem of determining approximate stability regions of large-scale time-delay systems (LS TDS) is addressed using model approximation techniques. To this end, an ℋ2-oriented approximation algorithm, TF-IRKA [1], is considered; it has been shown to be well suited for approximating infinite-dimensional systems by finite-dimensional ones. We show how model reduction can be used to approximate time-delay systems with multiple delays and to estimate their stability regions. We also discuss how existing algorithms can be adapted to the problem at hand. Several numerical examples illustrate the efficiency and accuracy of the approach.
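To illustrate the underlying idea of estimating a delay system's stability region through a finite-dimensional approximation, the following sketch uses a much simpler surrogate than the TF-IRKA algorithm considered in the paper: a first-order Padé approximation of the delay term in a scalar system. The function names, the example system x'(t) = -x(t) + a·x(t - τ), and the parameter sweep are all illustrative assumptions, not the paper's method; the sketch only conveys the principle that stability of the approximate (polynomial) characteristic equation serves as an estimate of stability of the original (transcendental) one.

```python
import numpy as np

def pade_poles(a, tau):
    """Poles of a first-order Pade approximation of the scalar
    delay system  x'(t) = -x(t) + a*x(t - tau).
    Replacing e^{-s*tau} by (1 - s*tau/2)/(1 + s*tau/2) turns the
    transcendental characteristic equation  s + 1 - a*e^{-s*tau} = 0
    into the quadratic  (s + 1)(1 + s*tau/2) - a*(1 - s*tau/2) = 0."""
    coeffs = [tau / 2.0, 1.0 + tau / 2.0 + a * tau / 2.0, 1.0 - a]
    return np.roots(coeffs)

def is_stable(a, tau):
    """Stability estimate: all approximate poles lie in the open left half-plane."""
    return np.max(pade_poles(a, tau).real) < 0

# Sweep the delayed-feedback gain a at a fixed delay tau = 1 to sketch
# a one-dimensional estimate of the stability region.
tau = 1.0
gains = np.linspace(-4.0, 2.0, 61)
stable_gains = [a for a in gains if is_stable(a, tau)]
print(f"estimated stable range of a: "
      f"[{min(stable_gains):.2f}, {max(stable_gains):.2f}]")
```

A higher-fidelity approximation (higher Padé order, or an ℋ2-optimal reduced model as in the paper) tightens this estimate toward the true stability boundary of the delay system.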