Hi, I am Hongduan Tian, a first-year Ph.D. student in the Trustworthy Machine Learning and Reasoning (TMLR) Group, Department of Computer Science, Hong Kong Baptist University, advised by Dr. Bo Han and Dr. Feng Liu.

Before that, I received my master's degree from Nanjing University of Information Science and Technology (NUIST), where I was fortunate to be supervised by Prof. Xiao-Tong Yuan and Prof. Qingshan Liu.

My research interests mainly include few-shot/meta-learning, transfer learning, and LLM agents.

Please feel free to email me about research, collaborations, or just a casual chat.

πŸ“£ News

  • 2024.05: One paper is accepted by ICML 2024.

πŸ“– Education

  • 2023.09 - present, Hong Kong Baptist University (HKBU), Ph.D. in Computer Science.
  • 2018.09 - 2021.06, Nanjing University of Information Science and Technology (NUIST), M.E. in Control Engineering.
  • 2014.09 - 2018.06, Nanjing University of Information Science and Technology (NUIST), B.E. in Automation.

πŸ“ Publications

βœ‰οΈ Corresponding author.


MOKD: Cross-domain Finetuning for Few-shot Classification via Maximizing Optimized Kernel Dependence.
[paper] [code] [slides] [poster] [CN-video] [EN-video]
Hongduan Tian, Feng Liu, Tongliang Liu, Bo Du, Yiu-ming Cheung, Bo Hanβœ‰οΈ.

Quick Introduction: In cross-domain few-shot classification, the nearest centroid classifier (NCC) aims to learn representations that form a metric space where few-shot classification can be performed by measuring the similarities between samples and each class prototype. The intuition behind NCC is that each sample is pulled closer to the centroid of its own class while being pushed away from the centroids of other classes. However, in this paper, we find that NCC-learned representations of samples from different classes can still exhibit high similarities.
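A minimal sketch of NCC-style classification for one episode, assuming PyTorch; the helper name `ncc_logits` and the choice of cosine similarity are illustrative, not the exact metric used in the paper:

```python
import torch
import torch.nn.functional as F

def ncc_logits(support_feats, support_labels, query_feats, n_classes):
    """Score queries by similarity to class prototypes (the mean of each
    class's support embeddings), as in a nearest centroid classifier."""
    # support_feats: [N_s, D], support_labels: [N_s], query_feats: [N_q, D]
    prototypes = torch.stack([
        support_feats[support_labels == c].mean(dim=0) for c in range(n_classes)
    ])                                               # [C, D]
    query_feats = F.normalize(query_feats, dim=-1)   # unit-norm embeddings
    prototypes = F.normalize(prototypes, dim=-1)
    return query_feats @ prototypes.t()              # cosine similarities [N_q, C]
```

Training with a softmax cross-entropy over these similarities is what pulls each sample toward its own prototype and away from the others.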

To address this problem, we propose a bi-level optimization framework, maximizing optimized kernel dependence (MOKD), to learn a set of class-specific representations that match the cluster structures indicated by the labeled data of the given task. Specifically, MOKD first optimizes the kernel adopted in the Hilbert-Schmidt independence criterion (HSIC) to obtain the optimized kernel HSIC (opt-HSIC), which captures dependence more precisely. Then, an optimization problem over the opt-HSIC is solved to simultaneously maximize the dependence between representations and labels and minimize the dependence among all samples.
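A minimal sketch of the biased HSIC estimator underlying this objective, assuming PyTorch and a Gaussian kernel; the bandwidth `sigma` and the trade-off weight `gamma` in the closing comment are hypothetical placeholders, not the paper's exact formulation:

```python
import torch

def gaussian_kernel(x, sigma=1.0):
    """Gaussian (RBF) kernel matrix over the rows of x; sigma is the bandwidth."""
    d2 = torch.cdist(x, x).pow(2)
    return torch.exp(-d2 / (2 * sigma ** 2))

def hsic(K, L):
    """Biased HSIC estimator: tr(K H L H) / (n - 1)^2, with H the centering matrix."""
    n = K.size(0)
    H = torch.eye(n, device=K.device) - torch.ones(n, n, device=K.device) / n
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2

# Outer-level objective (sketch): K_z is a kernel over representations and
# K_y a label kernel (e.g., one-hot inner products). Maximize dependence on
# labels while minimizing dependence among all samples:
#   loss = -(hsic(K_z, K_y) - gamma * hsic(K_z, K_z))   # gamma: hypothetical weight
```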

Extensive experiments on Meta-Dataset demonstrate that MOKD not only achieves better generalization performance on unseen domains in most cases but also learns better-clustered data representations.

Meta-learning with network pruning.
[paper] [code]
Hongduan Tianβœ‰οΈ, Bo Liu, Xiao-Tong Yuanβœ‰οΈ, Qingshan Liu.

Quick Introduction: Meta-learning is a powerful paradigm for few-shot learning. Despite its remarkable success in many applications, existing optimization-based meta-learning models with over-parameterized neural networks have been shown to overfit on training tasks.

To remedy this deficiency, we propose a network-pruning-based meta-learning approach that reduces overfitting by explicitly controlling the capacity of the network. A uniform concentration analysis reveals the benefit of the network capacity constraint for reducing the generalization gap of the proposed meta-learner. We implement our approach on top of Reptile with two network pruning routines: Dense-Sparse-Dense (DSD) and Iterative Hard Thresholding (IHT).
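A minimal sketch of one IHT-style hard-thresholding step, assuming PyTorch; the sparsity level and the decision to skip 1-D parameters are illustrative choices, and in practice such pruning alternates with continued (meta-)training:

```python
import torch

@torch.no_grad()
def iht_prune(model: torch.nn.Module, sparsity: float = 0.9) -> None:
    """One hard-thresholding step: in each weight matrix, keep the top
    (1 - sparsity) fraction of weights by magnitude and zero out the rest."""
    for p in model.parameters():
        if p.dim() < 2:                  # skip biases / normalization parameters
            continue
        k = max(1, int(p.numel() * (1 - sparsity)))
        thresh = p.abs().flatten().topk(k).values.min()
        p.mul_((p.abs() >= thresh).to(p.dtype))
```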

Extensive experimental results on benchmark datasets with different over-parameterized deep networks demonstrate that our method not only effectively alleviates meta-overfitting but also, in many cases, improves the overall generalization performance on few-shot classification tasks.

πŸ’» Services

  • Conference Reviewer for ICML (2022-2024), NeurIPS (2022-2024), ICLR (2022-2024).
  • Journal Reviewer for TMLR, NEUNET, TNNLS, TPAMI.

🏫 Teaching

  • 2024 Spring, TA for COMP7940: Cloud Computing.

πŸ“– Experiences

  • 2024.06 - 2024.08, Research Intern @WeChat.
  • 2023.09 - present, Ph.D. student @HKBU-TMLR Group, advised by Dr. Bo Han.
  • 2023.07 - 2023.08, Research Intern @Alibaba.
  • 2022.07 - present, Research Intern @NVIDIA NVAITC, advised by Charles Cheung.
  • 2022.07 - 2023.05, Research Intern @HKBU-TMLR Group, advised by Dr. Bo Han and Dr. Feng Liu.
  • 2021.07 - 2022.07, Algorithm Engineer @ZTE Nanjing Research and Development Center.