Min, F., Zhang, S., Ciucci, D., & Wang, M. (2020). Three-way active learning through clustering selection. International Journal of Machine Learning and Cybernetics, 11(5), 1033-1046. https://doi.org/10.1007/s13042-020-01099-2
Three-way active learning through clustering selection
Ciucci, D.
2020
Abstract
In clustering-based active learning, the performance of the learner relies heavily on the quality of the clustering results. Empirical studies have shown that different clustering techniques suit different data. In this paper, we propose the three-way active learning through clustering selection (TACS) algorithm to dynamically select the appropriate technique during the learning process. The algorithm follows the coarse-to-fine scheme of granular computing coupled with three-way instance processing. For label query, we select both representative instances via density peaks and informative instances with the maximal total distance. For block partition, we revise six popular clustering techniques to speed up learning and accommodate binary splitting. For clustering evaluation, we define a weighted entropy with 1-nearest-neighbor. For insufficient labels, we design tree-pruning techniques that use a block queue. Experiments are undertaken on twelve UCI datasets. The results show that TACS is superior to algorithms based on a single clustering technique and to other state-of-the-art active learning algorithms.
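As an illustration of the two query criteria named in the abstract, the sketch below picks a representative instance within a block by a simple density-peak score and an informative instance by the maximal total distance to the instances already labeled in that block. This is a minimal, hypothetical reading of those criteria, not the TACS implementation described in the paper: the function names, the density cutoff, and the interpretation of "total distance" as distance to the labeled instances are all assumptions made for the example.

```python
import numpy as np

def density_peak_representative(block, cutoff=1.0):
    """Return the index of the instance with the highest local density.

    Illustrative density-peak-style score: count neighbours within a
    cutoff radius. Not the authors' implementation.
    """
    # Pairwise Euclidean distances within the block.
    d = np.linalg.norm(block[:, None, :] - block[None, :, :], axis=-1)
    # Local density: neighbours within the cutoff, excluding the instance itself.
    density = (d < cutoff).sum(axis=1) - 1
    return int(np.argmax(density))

def max_total_distance_informative(block, labeled_idx):
    """Return the index of the unlabeled instance with the largest total
    distance to the already-labeled instances (assumed reading of the
    'maximal total distance' criterion)."""
    d = np.linalg.norm(block[:, None, :] - block[labeled_idx][None, :, :], axis=-1)
    total = d.sum(axis=1)
    total[labeled_idx] = -np.inf  # never re-query a labeled instance
    return int(np.argmax(total))

# Hypothetical usage on a random block of 50 four-dimensional instances.
rng = np.random.default_rng(0)
block = rng.normal(size=(50, 4))
rep = density_peak_representative(block, cutoff=1.0)
inf = max_total_distance_informative(block, labeled_idx=[rep])
print("representative:", rep, "informative:", inf)
```

In a clustering-based loop, the representative instance would be queried first to label a block, and the informative instance would be queried when the block's labels disagree and the block is split further; those control-flow details follow the general coarse-to-fine scheme sketched in the abstract rather than the paper's exact procedure.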