Asset Details
Distributed Semi-Supervised Partial Multi-Label Learning over Networks
by Chen, Weibin; Xu, Zhen
in Algorithms / Analysis / Classification / Confidence / Consumption / Datasets / Discriminant analysis / Distributed memory / Errors / Feature maps / Information storage and retrieval / Kernel functions / Labeling / Labels / Machine learning / Performance evaluation / Ranking / Supervision
2024
Journal Article
Overview
In this paper, a distributed semi-supervised partial multi-label learning (dS2PML) algorithm is proposed to address the problem of distributed classification of partially multi-labeled and unlabeled data. In this algorithm, we utilize a multi-kernel function together with a label correlation term to construct the discriminant function. In addition, to obtain a decentralized implementation, we design a reconstruction error on the labeling confidence based on globally common basic data that are selected by a distributed strategy. By exploiting the similarity structure among the feature and label spaces under a sparsity constraint, the labeling confidences of partially multi-labeled and unlabeled data are estimated in a decentralized manner. Meanwhile, by using a sparse random feature map to approximate the kernel feature map, the multi-label classifier can be trained under the supervision of the estimated labeling confidence. Experiments on multiple real datasets are conducted to evaluate the learning performance of the proposed approach. From the experimental results, the average ranks of all compared algorithms on five evaluation metrics are computed. The average ranks of our algorithm in terms of Hamming loss, one-error, average precision, ranking loss, and coverage are 3.16, 2.27, 2.15, 2.38, and 2.18, respectively; on all five metrics, dS2PML is second only to its centralized counterpart (cS2PML) and better than the other existing comparison algorithms. The average rank differences in terms of Hamming loss, one-error, average precision, ranking loss, and coverage between our algorithm and the closest comparison algorithm are 0.28, 1.67, 1.80, 1.15, and 1.62, respectively. Additionally, owing to the distributed storage and decentralized processing of the training data, the proposed dS2PML algorithm reduces CPU time by more than 65% and memory consumption by more than 6% compared with the centralized comparison algorithms. These results indicate that the proposed algorithm outperforms the other state-of-the-art algorithms in classification accuracy, CPU time, and memory consumption.
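The abstract's key computational step is replacing the kernel feature map with an explicit random feature map, so that the classifier becomes linear and each node can train on its local data alone. The paper's sparse construction is not reproduced here; the sketch below uses classical (dense) random Fourier features for an RBF kernel as a stand-in. The function name and the `gamma` and `n_features` parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def random_fourier_features(X, n_features=200, gamma=1.0, seed=None):
    """Approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    with an explicit map z such that z(x) @ z(y) ~= k(x, y).
    (Stand-in for the paper's sparse random feature map.)
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies are sampled from the Fourier transform of the RBF
    # kernel, a Gaussian with standard deviation sqrt(2 * gamma).
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy check on one node's local data: inner products of the mapped
# features approximate the RBF Gram matrix without ever forming it
# from other nodes' samples.
X_local = np.random.default_rng(0).normal(size=(8, 5))
Z = random_fourier_features(X_local, n_features=512, gamma=0.5, seed=0)
K_approx = Z @ Z.T     # ~= exp(-0.5 * ||x_i - x_j||^2) entrywise
print(K_approx.shape)  # (8, 8)
```

With such an explicit finite-dimensional map, a kernel multi-label classifier reduces to a linear model in z(x), which is what makes a decentralized implementation practical: nodes exchange fixed-size parameter estimates with neighbors instead of kernel matrices over the pooled training data.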
Publisher
MDPI AG