Multi-label classification (MLC), which allows each instance to have multiple labels, has attracted a surge of interest in recent years due to its wide range of applications, such as image annotation and document tagging. One of the simplest ways to solve MLC problems is the label powerset (LP) method, which regards every possible label subset as a distinct class. LP makes conventional multi-class classifiers, such as multi-class SVMs, directly applicable, but it suffers from the rapidly increasing number of classes. Several improvements have therefore been proposed to scale LP to large problems with many labels. RAndom k-labELsets (RAkEL), proposed by Tsoumakas et al., addresses this problem by randomly sampling small labelsets and taking an ensemble over them. However, RAkEL uses all training instances to construct each model and thus suffers from high computational cost. In this paper, we propose a new fast algorithm for RAkEL. First, we assign each training instance to a small number of models. Then LP is applied to each model using only the assigned instances. Experiments on twelve benchmark datasets demonstrate that the proposed algorithm runs faster than the conventional methods while maintaining accuracy. In the best case, it was 100 times faster than the baseline method (LP) and 30 times faster than the original RAkEL.
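The two ingredients mentioned above (the LP transformation and RAkEL's random labelset sampling) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names `label_powerset_transform` and `random_k_labelsets` are hypothetical, and the base classifier is omitted.

```python
import random

def label_powerset_transform(Y):
    """LP method: map each label subset (a tuple of 0/1 indicators)
    to a single class id, so a standard multi-class classifier applies."""
    classes = {}   # label subset -> class id
    y_lp = []
    for labels in Y:
        key = tuple(labels)
        if key not in classes:
            classes[key] = len(classes)
        y_lp.append(classes[key])
    return y_lp, classes

def random_k_labelsets(num_labels, k, num_models, seed=0):
    """RAkEL-style sampling: draw `num_models` random labelsets of size k.
    Each ensemble member would then apply LP restricted to its labelset."""
    rng = random.Random(seed)
    return [sorted(rng.sample(range(num_labels), k))
            for _ in range(num_models)]

# Toy usage: three instances, three labels.
Y = [[1, 0, 1], [0, 1, 0], [1, 0, 1]]
y_lp, classes = label_powerset_transform(Y)   # y_lp = [0, 1, 0]
labelsets = random_k_labelsets(num_labels=3, k=2, num_models=4)
```

Note that plain RAkEL trains every ensemble member on all instances; the speedup proposed in the paper comes from assigning each instance to only a few of these models before the LP step.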