A complicated learning problem can be decomposed into multiple simple two-class problems. A single-output classifier that separates its represented class from the others in effect solves a two-class problem, and can be trained on all samples from the represented class together with a small portion of the samples from its neighboring classes. In this way the two sides of each two-class problem can be given effectively equal sample sizes. Two ways of achieving this are: a) adding virtual samples to the smaller classes; b) multiplying the weight increments on the smaller sides by enlargement factors. If the decision boundaries of the single-output perceptrons are open, their effective regions must be limited by adding correction coefficients, which are derived from the class means and variances. Results on the two-spirals problem show that the above methods are effective.
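As a minimal sketch of idea b) above, the following trains a single-output perceptron on an imbalanced two-class set and multiplies the weight increments on the smaller side by an enlargement factor. The factor is set here to the size ratio between the two classes, and the data, learning rate, and epoch count are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Majority class (label -1): 100 points; minority class (label +1): 10 points.
X_neg = rng.normal(loc=-2.0, scale=0.5, size=(100, 2))
X_pos = rng.normal(loc=+2.0, scale=0.5, size=(10, 2))
X = np.vstack([X_neg, X_pos])
y = np.concatenate([-np.ones(100), np.ones(10)])

# Enlargement factor for the smaller side (assumed: the class-size ratio).
factor = len(X_neg) / len(X_pos)

w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(50):                          # training epochs
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:           # misclassified sample
            scale = factor if yi > 0 else 1.0  # boost minority-side increments
            w += lr * scale * yi * xi
            b += lr * scale * yi

accuracy = np.mean(np.sign(X @ w + b) == y)
```

Scaling the increments has the same effect on the accumulated weight updates as replicating each minority sample `factor` times, which is why it serves as a cheap stand-in for the virtual-sample approach of idea a).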