Gongfan Fang

According to our database, Gongfan Fang authored at least 17 papers between 2019 and 2023.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.


Bibliography

2023
Deep semantic image compression via cooperative network pruning.
J. Vis. Commun. Image Represent., September, 2023

Knowledge Amalgamation for Object Detection With Transformers.
IEEE Trans. Image Process., 2023

0.1% Data Makes Segment Anything Slim.
CoRR, 2023

DeepCache: Accelerating Diffusion Models for Free.
CoRR, 2023

LLM-Pruner: On the Structural Pruning of Large Language Models.
Proceedings of the Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023

Structural Pruning for Diffusion Models.
Proceedings of the Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023

DepGraph: Towards Any Structural Pruning.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023

2022
Federated Selective Aggregation for Knowledge Amalgamation.
CoRR, 2022

Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt.
Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence, 2022

Up to 100x Faster Data-Free Knowledge Distillation.
Proceedings of the Thirty-Sixth AAAI Conference on Artificial Intelligence, 2022

2021
Contrastive Model Inversion for Data-Free Knowledge Distillation.
CoRR, 2021

Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

Contrastive Model Inversion for Data-Free Knowledge Distillation.
Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence, 2021

2020
Impression Space from Deep Template Network.
CoRR, 2020

Adversarial Self-Supervised Data-Free Distillation for Text Classification.
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020

2019
Data-Free Adversarial Distillation.
CoRR, 2019

Knowledge Amalgamation from Heterogeneous Networks by Common Feature Learning.
Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, 2019
