Conservation voltage reduction (CVR), the principle that reducing distribution feeder voltages reduces demand, has recently been reintroduced with the aim of improving overall energy efficiency. Although CVR is a simple and cost-effective scheme, reducing voltages is not always the most beneficial choice for lowering demand; for certain types of load, reducing voltages may actually increase demand by increasing losses. This paper uses accurate ZIP load models together with a three-phase optimal power flow methodology to determine optimal tap settings for a test feeder. Three objectives are examined and compared to determine which is the most advantageous for that particular feeder. The results show that significant savings can be achieved through this straightforward strategy.
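For context, the ZIP model referenced above represents each load as a voltage-dependent mix of constant-impedance (Z), constant-current (I), and constant-power (P) components. A common form of the active-power model (the coefficient symbols here are illustrative notation, not taken from this paper) is

\[
P(V) = P_0 \left[ a_Z \left(\frac{V}{V_0}\right)^2 + a_I \left(\frac{V}{V_0}\right) + a_P \right],
\qquad a_Z + a_I + a_P = 1,
\]

with an analogous expression for reactive power. Under this model, lowering the voltage $V$ below the nominal value $V_0$ reduces the constant-impedance and constant-current contributions but leaves the constant-power contribution unchanged; a constant-power load instead draws more current at reduced voltage, which can raise feeder $I^2R$ losses. This is consistent with the observation above that voltage reduction does not always reduce overall demand, and it motivates the use of load-composition-aware optimization rather than blanket voltage reduction.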