TransFusion: Covariate-Shift Robust Transfer Learning for High-Dimensional Regression

The main challenge that sets transfer learning apart from traditional supervised learning is distribution shift, which manifests both as a shift between the source and target models and as a shift between their marginal covariate distributions. In this work, we tackle model shifts in the presence of covariate shifts in the high-dimensional regression setting. Specifically, we propose a two-step method with a novel fused regularizer that effectively leverages samples from source tasks to improve learning performance on a target task with limited samples. A nonasymptotic bound is provided for the estimation error of the target model, demonstrating the robustness of the proposed method to covariate shifts. We further establish conditions under which the estimator is minimax-optimal. Additionally, we extend the method to a distributed setting, allowing for a pretraining-finetuning strategy that requires only one round of communication while retaining the estimation rate of the centralized version. Numerical tests validate our theory, highlighting the method's robustness to covariate shifts.
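The two-step idea in the abstract can be illustrated with a minimal sketch: fit a rough estimate on the plentiful source data, then learn a sparse correction on the scarce target data. This is a deliberately simplified, generic two-step transfer estimator with a plain lasso correction; it is not the paper's TransFusion fused regularizer or its covariate-shift correction, and all variable names, penalty levels, and the simulated data are illustrative assumptions.

```python
import numpy as np

def ista_lasso(X, y, lam, n_iters=500):
    """Solve min_w 0.5/n * ||y - Xw||^2 + lam * ||w||_1 by proximal gradient (ISTA)."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the smooth part
    w = np.zeros(p)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n
        z = w - grad / L
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding prox
    return w

rng = np.random.default_rng(0)
p, n_src, n_tgt = 50, 500, 40                  # many source samples, few target samples
beta_tgt = np.zeros(p); beta_tgt[:5] = 1.0     # sparse target model
beta_src = beta_tgt.copy(); beta_src[0] += 0.3 # small model shift between tasks

X_src = rng.standard_normal((n_src, p))
y_src = X_src @ beta_src + 0.3 * rng.standard_normal(n_src)
X_tgt = rng.standard_normal((n_tgt, p))
y_tgt = X_tgt @ beta_tgt + 0.3 * rng.standard_normal(n_tgt)

# Step 1: rough estimate from the large source sample.
w_src = ista_lasso(X_src, y_src, lam=0.05)
# Step 2: sparse correction fitted on the small target sample.
delta = ista_lasso(X_tgt, y_tgt - X_tgt @ w_src, lam=0.1)
beta_hat = w_src + delta

err_transfer = np.linalg.norm(beta_hat - beta_tgt)
err_target_only = np.linalg.norm(ista_lasso(X_tgt, y_tgt, lam=0.1) - beta_tgt)
```

Because the source and target models differ only in a sparse shift, the step-2 correction needs far fewer samples than estimating the full model from target data alone, which is the intuition behind transfer gains in this regime.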

Metadata

Work Title TransFusion: Covariate-Shift Robust Transfer Learning for High-Dimensional Regression
Access
Open Access
Creators
  1. Zelin He
  2. Ying Sun
  3. Jingyuan Liu
  4. Runze Li
License In Copyright (Rights Reserved)
Work Type Article
Publisher
  1. Proceedings of Machine Learning Research
Publication Date 2024
Related URLs https://proceedings.mlr.press/v238/he24a.html
Deposited October 07, 2024


Work History

Version 1
published

  • Created
  • Added he24a-2.pdf
  • Added Creator Zelin He
  • Added Creator Ying Sun
  • Added Creator Jingyuan Liu
  • Added Creator Runze Li
  • Published
  • Updated
  • Updated Related URLs, Publication Date
    Related URLs
    • https://proceedings.mlr.press/v238/he24a.html
    Publication Date
    • 2024-01-01 → 2024