In this paper, we analyze the symbol-error rate (SER) of a two-hop fixed-gain amplify-and-forward OFDM relay network when nonlinear distortion caused by amplifier saturation at the relay is taken into account. We begin by constructing a theoretical model of the system. Because OFDM waveforms are approximately Gaussian, we are able to employ Bussgang's theorem in our model, which enables the derivation of a closed-form expression for the SER. From this theoretical analysis, we calculate the optimal relay gain that minimizes the SER. We demonstrate the resulting performance gain through both numerical simulations and real-world experiments on a USRP relay-network testbed.
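The Bussgang decomposition invoked above can be illustrated numerically. The following is a minimal sketch, not the paper's model: it assumes a real-valued unit-power Gaussian input as a stand-in for the OFDM waveform and a hypothetical soft-limiter as the saturating amplifier, then verifies that the output splits into a scaled replica of the input plus a distortion term uncorrelated with it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian input standing in for the OFDM time-domain waveform
# (assumption: real-valued, unit power, for simplicity).
x = rng.normal(0.0, 1.0, 1_000_000)

# Hypothetical saturating amplifier: a soft limiter with clipping level A.
A = 1.0
y = np.clip(x, -A, A)

# Bussgang's theorem: for a Gaussian input, y = alpha * x + d, where d is
# uncorrelated with x and alpha = E[x*y] / E[x^2].
alpha = np.mean(x * y) / np.mean(x ** 2)
d = y - alpha * x

# Correlation coefficient between the input and the distortion term;
# it should be close to zero.
corr = np.mean(x * d) / np.sqrt(np.mean(x ** 2) * np.mean(d ** 2))
print(alpha, corr)
```

For this clipping level the empirical gain `alpha` lands near 0.68 (the analytical value erf(A/√2) for a unit-variance input), and the measured correlation between `x` and `d` is negligible, which is what licenses treating the distortion as an additive uncorrelated noise term in the SER analysis.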