Bio. I am an assistant professor in the School of Science and Engineering at the Chinese University of Hong Kong, Shenzhen (CUHK-Shenzhen). Before joining CUHK-Shenzhen, I received my Ph.D. from Texas A&M University (2019–2024) and conducted postdoctoral research at the California Institute of Technology (2024–2025). Please refer to my CV for more information.

I am starting the AI for Life Sciences Laboratory to develop machine-learning algorithms for mining structural data (e.g. graphs, point clouds, and fields) and to leverage them in building simulators of living organisms at multiple scales (e.g. virtual tissues). [join us]

News.
2025/03. Receive the Distinguished Graduate Student Award for Excellence in Research from the Association of Former Students @ TAMU. [news]
2025/01. Excited to release the Cellular Interaction Foundation Model (CIFM), an AI foundation model that simulates living systems the way DeepSeek/ChatGPT model language: it takes cellular microenvironments (like language contexts) and query locations as input, and outputs gene expressions (like next-word tokens) at those locations (2025/03: accepted @ MLGenX Workshop, ICLR’25). [huggingface] 🎉
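The context-in, expression-out interface described above can be sketched with a toy kernel smoother; all names here (`predict_expression`, `length_scale`, the data shapes) are hypothetical illustrations of the idea, not the released CIFM API or model.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_expression(context_pos, context_expr, query_pos, length_scale=1.0):
    """Toy stand-in for a cellular-interaction model: predict expression
    at each query location as a distance-weighted average of neighboring
    cells' expressions (a Gaussian-kernel smoother, NOT the actual CIFM)."""
    # Pairwise squared distances between queries and context cells:
    # shape (n_query, n_context).
    d2 = ((query_pos[:, None, :] - context_pos[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * length_scale ** 2))
    w /= w.sum(axis=1, keepdims=True)  # normalize weights per query location
    return w @ context_expr            # shape (n_query, n_genes)

# A microenvironment of 5 context cells in 2D, each with a 3-gene profile,
# queried at 2 new locations.
context_pos = rng.normal(size=(5, 2))
context_expr = rng.random(size=(5, 3))
query_pos = np.array([[0.0, 0.0], [1.0, 1.0]])

pred = predict_expression(context_pos, context_expr, query_pos)
print(pred.shape)  # (2, 3): one predicted expression vector per query
```

The point of the sketch is only the shape of the problem: neighboring cells' states plus a query coordinate in, a gene-expression vector out.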

2024/12. Co-organize AI Bootcamp VIII on Graph Machine Learning @ Caltech.
2024/10. “Correlational Lagrangian Schrödinger Bridge: Learning Dynamics with Population-Level Regularization” (biology-inspired diffusion models under correlation conservation) is accepted @ AIDrugX Workshop, NeurIPS’24. [poster] 🎉
2024/06. Pass the final defense of my Ph.D. dissertation, “Generalizable Graph AI for Biomedicine: Data-Driven Self-Supervision and Principled Regularization”, and become a Ph.D. 🎓🎉
2024/04. Participate in the community effort of CAGI6 Rare Genomes Project with the outcome accepted @ Human Genomics’24.
2024/03. “Multi-Modal Contrastive Learning for Proteins by Combining Domain-Informed Views” (multi-modal protein representation learning) is accepted @ MLGenX Workshop, ICLR’24. [poster]
2024/01. “Latent 3D Graph Diffusion” (latent diffusion models for 3D graphs) is accepted @ ICLR’24. [poster]

2023/04. Receive the Quality Graduate Student Award from ECEN @ TAMU.
2023/01. “Graph Domain Adaptation via Theory-Grounded Spectral Regularization” (model-based risk bound analysis of GDA) is accepted @ ICLR’23. [poster] 🎉

2022/10. “Does Inter-Protein Contact Prediction Benefit from Multi-Modal Data and Auxiliary Tasks?” (multi-modal/task protein-protein interface prediction) is accepted @ MLSB Workshop, NeurIPS’22. [poster]
2022/09. “Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative” (contrastive learning on hypergraphs) is accepted @ NeurIPS’22. [poster]
2022/06. “Cross-Modality and Self-Supervised Protein Embedding for Compound-Protein Affinity and Contact Prediction” (multi-modal self-supervision in CPAC) is accepted @ Bioinformatics’22 (MoML’22, ECCB’22). [poster]
2022/01. “Bayesian Modeling and Uncertainty Quantification for Learning to Optimize: What, Why, and How” (Bayesian learning to optimize) is accepted @ ICLR’22. [poster]

2021/12. Receive the NSF Student Travel Awards from WSDM’22.
2021/10. “Bringing Your Own View: Graph Contrastive Learning without Prefabricated Data Augmentations” (generative augmentations in GraphCL) is accepted @ WSDM’22. [poster]
2021/09. Receive the Chevron Scholarship from ECEN @ TAMU.
2021/07. Serve as the session chair of Semisupervised and Unsupervised Learning @ ICML’21 and give a talk.
2021/05. “Graph Contrastive Learning Automated” (long presentation, automatic augmentation selection in GraphCL) is accepted @ ICML’21. [video] 🎉
2021/03. “Probabilistic Constructive Interference Precoding for Imperfect CSIT” (robust CI precoding in wireless communication) is accepted @ TVT’21. 🎉

2020/09. “Cross-Modality Protein Embedding for Compound-Protein Affinity and Contact Prediction” (cross-modality learning in CPAC) is accepted @ MLSB Workshop, NeurIPS’20. [poster]
2020/09. “Graph Contrastive Learning with Augmentations” (contrastive learning in GNN pre-training) is accepted @ NeurIPS’20. [poster]
2020/06. “When Does Self-Supervision Help Graph Convolutional Networks?” (self-supervision in GCNs) is accepted @ ICML’20.
2020/02. “L2-GCN: Layer-Wise and Learned Efficient Training of Graph Convolutional Networks” (efficient GCN training) is accepted @ CVPR’20.

2019/08 – 2024/08. Attend Texas A&M University for the Ph.D. degree in Electrical Engineering, advised by Prof. Yang Shen.
2019/02. Receive the Electrical and Computer Engineering PhD Merit Fellowship from ECEN @ TAMU.