Dian Jiao

This is a disambiguation page: it lists papers from multiple persons who share the same or a similar name.

Bibliography

2025
EOC-Bench: Can MLLMs Identify, Recall, and Forecast Objects in an Egocentric World?
CoRR, June, 2025

Hunyuan-TurboS: Advancing Large Language Models through Mamba-Transformer Synergy and Adaptive Chain-of-Thought.
CoRR, May, 2025

Truth Discovery for Multiple Judgments With Crowdsourced Sparse Data.
IEEE Internet Things J., March, 2025

MEMO: Fine-grained Tensor Management For Ultra-long Context LLM Training.
Proc. ACM Manag. Data, February, 2025

Advancing personalized digital therapeutics: integrating music therapy, brainwave entrainment methods, and AI-driven biofeedback.
Frontiers Digit. Health, 2025

PM-SRCANet: A Privacy-Preserving Multimodal Stress Recognition Convolutional Attention Network Model.
Proceedings of the Wireless Artificial Intelligent Computing Systems and Applications, 2025

Neo: Towards Efficient Fully Homomorphic Encryption Acceleration using Tensor Core.
Proceedings of the 52nd Annual International Symposium on Computer Architecture, 2025

Align²LLaVA: Cascaded Human and Large Language Model Preference Alignment for Multi-modal Instruction Curation.
Proceedings of the Findings of the Association for Computational Linguistics, 2025

2024
Hypergraph-based Truth Discovery for Sparse Data in Mobile Crowdsensing.
ACM Trans. Sens. Networks, May, 2024

SymSwin: Multi-Scale-Aware Super-Resolution of Remote Sensing Images Based on Swin Transformers.
Remote. Sens., 2024

Enhancing privacy-preserving machine learning with self-learnable activation functions in fully homomorphic encryption.
J. Inf. Secur. Appl., 2024

Hunyuan-Large: An Open-Source MoE Model with 52 Billion Activated Parameters by Tencent.
CoRR, 2024

Align²LLaVA: Cascaded Human and Large Language Model Preference Alignment for Multi-modal Instruction Curation.
CoRR, 2024

Efficiently Training 7B LLM with 1 Million Sequence Length on 8 GPUs.
CoRR, 2024

IDEAL: Leveraging Infinite and Dynamic Characterizations of Large Language Models for Query-focused Summarization.
CoRR, 2024

DuetRAG: Collaborative Retrieval-Augmented Generation.
CoRR, 2024

GraphControl: Adding Conditional Control to Universal Graph Pre-trained Models for Graph Domain Transfer Learning.
Proceedings of the ACM on Web Conference 2024, 2024

Surge Phenomenon in Optimal Learning Rate and Batch Size Scaling.
Proceedings of the Advances in Neural Information Processing Systems 38: Annual Conference on Neural Information Processing Systems 2024, 2024

FedTAIL: A Federated Learning Approach with Trans-Architecture Intermediate Links.
Proceedings of the International Joint Conference on Neural Networks, 2024

A Privacy-Preserving Computer-Aided Diagnosis Framework for Medical Applications Using Federated Learning and Homomorphic Encryption.
Proceedings of the Attacks and Defenses for the Internet-of-Things, 2024

2023
Angel-PTM: A Scalable and Economical Large-scale Pre-training System in Tencent.
Proc. VLDB Endow., 2023

2018
A Method of Lunar Mapping SAR Baseline Estimation Without GCPs.
Proceedings of the IEEE International Conference on Information and Automation, 2018

2009
Trypsin-ligand binding free energies from explicit and implicit solvent simulations with polarizable potential.
J. Comput. Chem., 2009
