At the first United States biotechnology conference in 1975, molecular biologists unanimously agreed to follow regulations established by the Recombinant DNA Advisory Committee (RAC) of the National Institutes of Health (NIH), which would review, monitor, and approve all research projects in the brave new field of genetic manipulation. Notwithstanding the considerable economic potential of biotechnology for pharmaceutical and agricultural companies, the RAC recommended that industry experts refrain from some experiments altogether and observe strict restrictions on others. For example, in 1984, when the advisory committee introduced “non-scientific and non-objective considerations into the guidelines about genetic transfer experiments,” it declared research involving the transfer of genetic traits between animal and human germ lines “morally and ethically unacceptable” (Naik, 2000, 38). In 1986, at the RAC’s suggestion, the United States government created a coordinated framework of governmental agencies and industry experts to ensure that biotechnology-derived food products were substantially equivalent to their non-biotech counterparts. By maximizing many of the political benefits while minimizing most of the societal risks, the RAC successfully monitored the ethical impact of genetically modified organism research and development (Naik, 2000, 59). According to some risk scholars, strict governmental scrutiny of biotechnology research, correspondingly deliberate industry development, and the transparent public actions of the RAC represent a now-lamented past that is conspicuously different from the present (Krimsky, 1991, 183–184).