Anke Tang

ORCID: 0000-0002-0576-8153

According to our database, Anke Tang authored at least 17 papers between 2023 and 2025.

Bibliography

2025
Data-Adaptive Weight-Ensembling for Multi-task Model Fusion.
Int. J. Comput. Vis., August, 2025

Unsupervised deep learning model for fast energy layer pre-selection of delivery-efficient proton arc therapy plan optimization of nasopharyngeal carcinoma.
CoRR, June, 2025

Mix Data or Merge Models? Balancing the Helpfulness, Honesty, and Harmlessness of Large Language Model via Model Merging.
CoRR, February, 2025

Merging Models on the Fly Without Retraining: A Sequential Approach to Scalable Continual Model Merging.
CoRR, January, 2025

Modeling Multi-Task Model Merging as Adaptive Projective Gradient Descent.
CoRR, January, 2025

Learning from models beyond fine-tuning.
Nat. Mac. Intell., 2025

Mitigating the Backdoor Effect for Multi-Task Model Merging via Safety-Aware Subspace.
Proceedings of the Thirteenth International Conference on Learning Representations, 2025

2024
Efficient and Effective Weight-Ensembling Mixture of Experts for Multi-Task Model Merging.
CoRR, 2024

Mitigating the Backdoor Effect for Multi-Task Model Merging via Safety-Aware Subspace.
CoRR, 2024

SMILE: Zero-Shot Sparse Mixture of Low-Rank Experts Construction From Pre-Trained Foundation Models.
CoRR, 2024

Towards Efficient Pareto Set Approximation via Mixture of Experts Based Model Fusion.
CoRR, 2024

FusionBench: A Comprehensive Benchmark of Deep Model Fusion.
CoRR, 2024

Merging Multi-Task Models via Weight-Ensembling Mixture of Experts.
Proceedings of the Forty-first International Conference on Machine Learning, 2024

Parameter-Efficient Multi-Task Model Fusion with Partial Linearization.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

2023
Concrete Subspace Learning based Interference Elimination for Multi-task Model Fusion.
CoRR, 2023

Learn From Model Beyond Fine-Tuning: A Survey.
CoRR, 2023

Improving Heterogeneous Model Reuse by Density Estimation.
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence, 2023