Botho University Repository

Efficient Active Learning Constrains for Improved Semi-Supervised Clustering Performance


dc.contributor.author Eswaraprasad, Ramkumar
dc.contributor.author Vengidusamy, Shanmugam
dc.date.accessioned 2016-08-23T09:54:55Z
dc.date.accessioned 2020-10-28T07:06:53Z
dc.date.available 2016-08-23T09:54:55Z
dc.date.available 2020-10-28T07:06:53Z
dc.date.issued 2015
dc.identifier.citation Eswaraprasad, R. (2015). Efficient Active Learning Constrains for Improved Semi-Supervised Clustering Performance. International Journal of Computer Science and Electronics Engineering, 3(4), en_US
dc.identifier.issn 2320-4028
dc.identifier.uri http://localhost:8080/xmlui/handle/123456789/160
dc.description.abstract This paper presents a semi-supervised clustering technique with incremental and decremental affinity propagation (ID-AP) that incorporates labeled exemplars into the AP algorithm, together with a new method for actively selecting informative constraints to improve clustering performance. Both the clustering and active learning methods are scalable to large data sets and can handle very high-dimensional data. The paper examines the active learning problem of choosing must-link and cannot-link constraints for semi-supervised clustering. The proposed active learning approach grows the neighborhoods by selecting informative points and querying their relationship to the existing neighborhoods. The classic uncertainty-based principle is adopted, and a novel approach is presented for computing the uncertainty associated with each data point. Further, a selection criterion is introduced that trades off the uncertainty of each data point against the expected number of queries (the cost) needed to resolve that uncertainty, which allows the queries with the maximum information rate to be selected. Experimental results demonstrate that the proposed ID-AP technique adequately captures and takes full advantage of the intrinsic relationship between the labeled samples and the unlabeled data, and outperforms the other methods considered. The proposed method is empirically evaluated on eight benchmark data sets against a number of competing methods, and the evaluation results indicate that it achieves consistent and substantial improvements over its competitors. en_US
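The selection criterion described in the abstract (trading per-point uncertainty against the expected number of pairwise queries needed to resolve it) can be illustrated with a minimal sketch. The concrete formulas below (entropy as the uncertainty measure, expected rank of the matching neighborhood as the query cost) and all function names are assumptions chosen for illustration, not the authors' exact formulation of ID-AP.

import numpy as np

# Illustrative sketch: pick the point with the highest "information rate",
# i.e. uncertainty resolved per pairwise query spent. The uncertainty and
# cost formulas here are assumptions, not the paper's exact definitions.

def point_uncertainty(memberships):
    # Uncertainty of each point, taken as the entropy of its soft
    # membership probabilities over the current neighborhoods.
    p = np.clip(memberships, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=1)

def expected_query_cost(memberships):
    # Expected number of must-link/cannot-link queries needed to place a
    # point, assuming neighborhoods are queried in decreasing order of
    # membership probability: the k-th query is reached with probability
    # equal to the k-th largest membership.
    p_sorted = np.sort(memberships, axis=1)[:, ::-1]
    ranks = np.arange(1, p_sorted.shape[1] + 1)
    return (p_sorted * ranks).sum(axis=1)

def select_query_point(memberships):
    # Point with the best uncertainty-to-cost trade-off.
    rate = point_uncertainty(memberships) / expected_query_cost(memberships)
    return int(np.argmax(rate))

# memberships: (n_points, n_neighborhoods) soft assignments from the
# current clustering; the most ambiguous, cheapest-to-resolve point wins.
memberships = np.array([[0.9, 0.1], [0.5, 0.5], [0.8, 0.2]])
print(select_query_point(memberships))  # -> 1

Under these assumptions, the selected point would then be queried against one representative of each existing neighborhood until a must-link answer is obtained; if every answer is cannot-link, the point starts a new neighborhood, which is how the neighborhoods grow during active learning.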
dc.description.sponsorship Botho University en_US
dc.publisher International Journal of Computer Science and Electronics Engineering en_US
dc.subject Affinity propagation (AP) en_US
dc.subject Decremental learning en_US
dc.subject Incremental learning en_US
dc.subject Clustering en_US
dc.subject Semi-supervised learning en_US
dc.title Efficient Active Learning Constrains for Improved Semi-Supervised Clustering Performance en_US
dc.type Article en_US


