The number of choice sets respondents will answer is a critical determinant of how much data a researcher has for analysis. We used 66 separate surveys that asked respondents from an opt-in web panel sequences of preference questions about consumer products, in order to study the design factors that influence the rate of completing the entire sequence of questions comprising a discrete choice experiment. We did this by systematically varying the number of choice sets, the number of alternatives respondents were asked to consider, the nature of the list of attributes describing each alternative, and the type of statistical design. Completion rates varied systematically with the factors explored, but perhaps the key finding is that completion rates were reasonably high in all cases. We found that completion rates were relatively unaffected by asking more questions (choice sets), but declined as the number of alternatives increased. Because the expected time to complete a survey often plays a key role in the cost of web-based panels, we also examine how these factors affect completion times. Practical implications for applied research using opt-in web panels are discussed.