Lifetime reliability and the resulting temporal performance degradation due to Negative Bias Temperature Instability (NBTI) have emerged as critical challenges in the design and test of integrated circuits at nanometer technology nodes. In this work, we develop a model that self-consistently estimates NBTI degradation by simultaneously considering the impact of inter-dependent parameters, such as Vdd and temperature, on circuit lifetime. Using the proposed model, we observe that a circuit operating at a lower Vdd can achieve better lifetime performance than one operating at a higher Vdd. This counter-intuitive observation is attributed to the reduced electric field across the transistor, together with the accompanying reduction in circuit power and temperature, both of which lead to less NBTI degradation. Based on this observation, we develop an online detection and mitigation scheme that exploits Vdd scaling to enhance system lifetime. The proposed scheme was applied to various arithmetic units; results in a 45nm IBM process technology show an 18% lifetime improvement with a 57% reduction in power compared to conventional mitigation techniques. We also show that existing NBTI estimation models can incur delay-estimation errors as large as 7.6%.