FedPFT: Federated Proxy Fine-Tuning of Foundation Models

File version

Version of Record (VoR)

Author(s)
Peng, Z
Fan, X
Chen, Y
Wang, Z
Pan, S
Wen, C
Zhang, R
Wang, C
Editor(s)

Larson, Kate

Date
2024
Location

Jeju, Korea

Abstract

Adapting Foundation Models (FMs) to downstream tasks through Federated Learning (FL) is emerging as a promising strategy for protecting both data privacy and valuable FMs. Existing methods fine-tune the FM by allocating a sub-FM to each client in FL; however, this leads to suboptimal performance due to insufficient tuning and the inevitable accumulation of gradient errors. In this paper, we propose Federated Proxy Fine-Tuning (FedPFT), a novel method that enhances FM adaptation to downstream tasks through FL via two key modules. First, the sub-FM construction module employs a layer-wise compression approach, facilitating comprehensive fine-tuning of the FM across all layers by emphasizing crucial neurons. Second, the sub-FM alignment module conducts two-step distillation, at the layer level and the neuron level, before and during FL fine-tuning respectively, to reduce gradient error by accurately aligning the sub-FM with the FM under theoretical guarantees. Experimental results on seven commonly used datasets (four text and three vision) demonstrate the superiority of FedPFT. Our code is available at https://github.com/pzp-dzd/FedPFT.
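To make the workflow in the abstract concrete, below is a minimal, illustrative sketch in PyTorch of the proxy fine-tuning loop: build a compressed sub-FM, align it to the full FM by distillation, then fine-tune it across clients with FedAvg-style aggregation. This is not the authors' implementation (see the linked GitHub repository for that); all function names here are hypothetical, a toy MLP stands in for a real foundation model, and simple layer subsampling plus output-matching distillation stand in for the paper's layer-wise compression and two-level (layer- and neuron-level) distillation.

# Hypothetical sketch of the FedPFT workflow; not the authors' code.
import copy
import torch
import torch.nn as nn

def build_sub_fm(fm: nn.Sequential, keep_every: int = 2) -> nn.Sequential:
    # Layer-wise compression, simplified here to layer subsampling.
    # (FedPFT compresses each layer while emphasizing crucial neurons.)
    return nn.Sequential(*[copy.deepcopy(layer) for i, layer in enumerate(fm)
                           if i % keep_every == 0])

def distill_align(fm, sub_fm, data, steps=100, lr=1e-3):
    # Alignment by distillation: match sub-FM outputs to the frozen FM's.
    # (A stand-in for the paper's layer-level and neuron-level distillation.)
    opt = torch.optim.Adam(sub_fm.parameters(), lr=lr)
    for _ in range(steps):
        x = data[torch.randint(len(data), (8,))]  # random mini-batch
        loss = nn.functional.mse_loss(sub_fm(x), fm(x).detach())
        opt.zero_grad()
        loss.backward()
        opt.step()

def federated_round(sub_fm, client_batches, lr=1e-2):
    # One FL round: each client fine-tunes a local copy of the sub-FM (the
    # proxy), then the server averages their weights (FedAvg-style).
    states = []
    for x, y in client_batches:
        local = copy.deepcopy(sub_fm)
        opt = torch.optim.SGD(local.parameters(), lr=lr)
        loss = nn.functional.mse_loss(local(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        states.append(local.state_dict())
    avg = {k: torch.stack([s[k] for s in states]).mean(dim=0)
           for k in states[0]}
    sub_fm.load_state_dict(avg)

# Toy usage: a 4-layer "FM", a 2-layer proxy, alignment, then one FL round.
fm = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 16), nn.ReLU())
sub_fm = build_sub_fm(fm)                        # keeps layers 0 and 2
distill_align(fm, sub_fm, torch.randn(256, 16))
clients = [(torch.randn(32, 16), torch.randn(32, 16)) for _ in range(3)]
federated_round(sub_fm, clients)

In FedPFT the aligned sub-FM acts as a proxy: gradients computed on it approximate those of the full FM, so the adaptation learned in FL can be carried back to the FM. That transfer step, and the theoretical alignment guarantees, are omitted from this sketch.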

Conference Title

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence

Rights Statement

This work is covered by copyright. You must assume that re-use is limited to personal use and that permission from the copyright owner must be obtained for all other uses. If the document is available under a specified licence, refer to the licence for details of permitted re-use. If you believe that this work infringes copyright please make a copyright takedown request using the form at https://www.griffith.edu.au/copyright-matters.

Citation

Peng, Z; Fan, X; Chen, Y; Wang, Z; Pan, S; Wen, C; Zhang, R; Wang, C, FedPFT: Federated Proxy Fine-Tuning of Foundation Models, Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence, 2024, pp. 4806-4814