A method for measuring gas entrapment in granular iron (Fe⁰) was developed and used to estimate the impact of gas production on porosity loss during treatment of a high-NO₃⁻ groundwater (up to ∼10 mM). Over the 400-d study period, the volume of trapped gas in the laboratory columns remained small, reaching a maximum of 1.3% of pore volume. Low levels of dissolved H₂(g) were measured (up to 0.07 ± 0.02 mM), and free-moving gas bubbles were not observed. Thus, the porosity loss of 25–30% determined by tracer tests is not accounted for by residual gas trapped in the iron. The removal of aqueous species (i.e., NO₃⁻, Ca, and carbonate alkalinity) indicates that mineral precipitation contributed more to porosity loss than did the trapped gases. From the stoichiometric reactions between Fe⁰ and NO₃⁻, an average corrosion rate of 1.7 mmol kg⁻¹ d⁻¹ was derived for the test granular iron. This rate is 10 times greater than that of Fe⁰ oxidation by H₂O alone, based on H₂ gas production; NO₃⁻, rather than H₂O, was therefore the major oxidant in the groundwater in the absence of molecular O₂. The N mass balance [i.e., N₂(g), NH₄⁺, and NO₃⁻] suggests that abiotic reduction of NO₃⁻ to NH₄⁺ dominated at the start of Fe⁰ treatment, whereas N₂ production became more important once microbial activity began. These laboratory results closely predicted N₂ gas production in a separate large-column experiment operated for ∼2 yr in the field, where gas volumes of up to ∼600 mL d⁻¹ were detected, of which 99.5% (v/v) was N₂. We conclude that NO₃⁻ suppressed the production of H₂(g) by competing with water for Fe⁰ oxidation, especially at the beginning of treatment, when Fe⁰ is most reactive. Depending on the groundwater composition, gas venting may be necessary to maintain PRB performance in the field.
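The corrosion-rate estimate above rests on the stoichiometry of NO₃⁻ reduction by Fe⁰. A minimal sketch of that arithmetic, assuming the 4:1 abiotic pathway to ammonium (4 Fe⁰ + NO₃⁻ + 10 H⁺ → 4 Fe²⁺ + NH₄⁺ + 3 H₂O) and purely hypothetical column parameters (the flow rate and iron mass below are illustrative, not the study's actual conditions):

```python
# Sketch: Fe0 corrosion rate inferred from NO3- removal stoichiometry.
# Assumes abiotic reduction to ammonium, i.e. 4 mol Fe oxidized per mol
# NO3- reduced. All column parameters here are hypothetical.

FE_PER_NO3 = 4.0  # mol Fe0 per mol NO3- (abiotic reduction to NH4+)

def corrosion_rate(no3_removed_mM, flow_L_per_d, iron_mass_kg):
    """Average Fe0 corrosion rate in mmol Fe per kg iron per day."""
    fe_mmol_per_d = FE_PER_NO3 * no3_removed_mM * flow_L_per_d
    return fe_mmol_per_d / iron_mass_kg

# Hypothetical example: 10 mM NO3- fully removed at 1.0 L/d
# through a column packed with 25 kg of granular iron.
rate = corrosion_rate(no3_removed_mM=10.0, flow_L_per_d=1.0,
                      iron_mass_kg=25.0)
print(rate)  # mmol Fe kg-1 d-1
```

The same bookkeeping, run with the study's measured influent/effluent NO₃⁻ and column dimensions, is what yields a rate on the order of the reported 1.7 mmol kg⁻¹ d⁻¹.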