The application of a genetic algorithm (GA) to the selection of principal components (PCs) is proposed as an efficient method for determining the optimal multivariate regression model. This stochastic method was compared with several deterministic methods: exhaustive search (taken here as a validation procedure), forward- and backward-stepwise variable selection, and correlation principal components regression (CPCR). It is shown that, for the range of data sets used, the GA gives the same results as those obtained by exhaustive search and by CPCR, whereas the stepwise procedures do not. These results also show that, in order to build optimal predictive models using principal components regression (PCR), one needs to select the best subset of PCs rather than simply use those with the highest eigenvalues.
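The final point can be illustrated with a minimal synthetic sketch (not taken from the paper; the data and variable names are hypothetical): when the response depends on a low-variance direction of the predictors, the highest-eigenvalue PC is a poor regressor, and selecting the best-fitting PC subset gives a much better model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y depends mainly on a LOW-variance direction,
# so the highest-eigenvalue PC carries little predictive information.
n = 200
z1 = rng.normal(scale=5.0, size=n)   # high-variance latent direction
z2 = rng.normal(scale=0.5, size=n)   # low-variance latent direction
X = np.column_stack([z1 + rng.normal(scale=0.1, size=n),
                     z1 - rng.normal(scale=0.1, size=n),
                     z2])
y = 3.0 * z2 + rng.normal(scale=0.05, size=n)

# PCA via SVD of the mean-centred predictor matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                   # PC scores, columns ordered by eigenvalue

def pcr_rss(pc_idx):
    """Residual sum of squares of OLS of (centred) y on the chosen PC scores."""
    T = scores[:, pc_idx]
    beta, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
    resid = (y - y.mean()) - T @ beta
    return float(resid @ resid)

rss_top = pcr_rss([0])                            # conventional PCR: top-eigenvalue PC
rss_best = min(pcr_rss([k]) for k in range(3))    # best single PC (exhaustive search)
print(rss_top > rss_best)   # selecting the best PC beats taking the largest eigenvalue
```

On realistic data sets the exhaustive search over subsets becomes infeasible, which is where the GA (or CPCR) replaces the brute-force `min` over candidate subsets.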