The effects of the minor allele frequency of single nucleotide variants and the degree of departure from normality of a quantitative trait on type I error rates were evaluated using Genetic Analysis Workshop 17 mini-exome sequence data. Four simulated traits were generated: a standard normal trait, a gamma-distributed trait, and two transformations of the gamma-distributed trait, by log10 and by a rank-based inverse normal function. Tiled regression was compared with simple linear regression, and average type I error rates were obtained within minor allele frequency classes. The distribution of type I error rates for tiled regression followed a pattern similar to that for simple linear regression, but with much lower type I error rates overall.
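The sketch below illustrates the two ingredients of this design that translate directly into code: generating the four trait distributions (including a rank-based inverse normal transform with the Blom offset) and estimating an empirical type I error rate by regressing a null trait on a null genotype at a given minor allele frequency. It is a minimal illustration, not the study's pipeline: the gamma parameters, sample size, replicate count, and significance level are hypothetical choices, and tiled regression itself is not reproduced here, only the simple linear regression comparator.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
n = 1000  # illustrative sample size, not the GAW17 value

# Four simulated traits: standard normal, gamma, and two
# transformations of the gamma trait (log10 and inverse normal).
# Gamma shape/scale are illustrative, not the GAW17 settings.
normal_trait = rng.standard_normal(n)
gamma_trait = rng.gamma(shape=1.0, scale=2.0, size=n)
log10_trait = np.log10(gamma_trait)

def rank_inverse_normal(x, c=3.0 / 8.0):
    """Rank-based inverse normal transform (Blom offset c = 3/8)."""
    ranks = stats.rankdata(x)
    return stats.norm.ppf((ranks - c) / (len(x) - 2.0 * c + 1.0))

int_trait = rank_inverse_normal(gamma_trait)

def type1_error(trait_sampler, maf, n_reps=1000, alpha=0.05, n=1000):
    """Empirical type I error for simple linear regression: the trait
    is independent of the genotype, so any rejection is a false
    positive. Genotypes are drawn under HWE with additive 0/1/2 coding."""
    rejections = 0
    for _ in range(n_reps):
        y = trait_sampler(n)
        g = rng.binomial(2, maf, size=n)
        if g.std() == 0:  # monomorphic draw at low MAF; skip replicate
            continue
        _, _, _, pvalue, _ = stats.linregress(g, y)
        rejections += pvalue < alpha
    return rejections / n_reps

# Average type I error within a few illustrative MAF classes.
for maf in (0.005, 0.01, 0.05, 0.25):
    rate = type1_error(lambda m: rng.standard_normal(m), maf)
    print(f"MAF {maf:0.3f}: empirical type I error {rate:0.3f}")
```

Passing `lambda m: rng.gamma(1.0, 2.0, size=m)` or a sampler wrapping `rank_inverse_normal` in place of the normal sampler repeats the estimate for the non-normal and transformed traits, which is the comparison the abstract describes across MAF classes.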