About me

I am currently a third-year Ph.D. student in Electrical and Computer Engineering (ECE) at Purdue University, advised by Prof. Jing Gao. Before coming to Purdue, I spent two wonderful years at the University of Michigan, where I earned my M.S. degree in Applied Statistics. Prior to that, I received my B.S. degree from Xiamen University.

I am broadly interested in trustworthy machine learning, with a particular focus on understanding and enhancing the trustworthiness of machine learning in adverse scenarios. Topics I have explored or am currently exploring include algorithmic fairness in data-scarce regimes, robustness of fair machine learning, and uncertainty quantification for deep networks.

Education

  • 2021 - 2026 (Expected), Ph.D. in Electrical and Computer Engineering, Purdue University
  • 2019 - 2021, M.S. in Applied Statistics, University of Michigan
  • 2013 - 2017, B.S. in Statistics, School of Economics, Xiamen University

Publications

Conference Publications

[ICLR ‘24] Tianci Liu, Haoyu Wang, Feijie Wu, Hengtong Zhang, Pan Li, Lu Su, Jing Gao, “Towards Poisoning Fair Representations.” The 12th International Conference on Learning Representations (ICLR), Austria, May 2024.

[EMNLP ‘23] Haoyu Wang, Yaqing Wang, Tianci Liu, Tuo Zhao, Jing Gao, “HadSkip: Homotopic and Adaptive Layer Skipping of Pre-trained Language Models for Efficient Inference.” Findings of the 2023 Conference on Empirical Methods in Natural Language Processing (Findings of EMNLP 2023), Singapore, Dec 2023.

[ICML ‘23] Tianci Liu*, Tong Yang*, Quan Zhang, Qi Lei, “Optimization for Amortized Inverse Problems.” The 40th International Conference on Machine Learning (ICML), Honolulu, Hawaii, Jul 2023.

[AAAI ‘23] Tianci Liu, Haoyu Wang, Yaqing Wang, Xiaoqian Wang, Lu Su, Jing Gao, “SimFair: A Unified Framework for Fairness-Aware Multi-Label Classification.” The 37th AAAI Conference on Artificial Intelligence (AAAI), Washington, D.C., Feb 2023. (Distinguished Paper Award)

[DeepInverse@NeurIPS ‘21] Tianci Liu, Quan Zhang, Qi Lei, “PANOM: Automatic Hyper-parameter Tuning for Inverse Problems.” NeurIPS Workshop on Deep Learning and Inverse Problems, Virtual, Dec 2021.

[BDL@NeurIPS ‘21] Tianci Liu, Jeffrey Regier, “An Empirical Comparison of GANs and Normalizing Flows for Density Estimation.” NeurIPS Workshop on Bayesian Deep Learning, Virtual, Dec 2021.

Journal Publications

Daiwei Zhang, Tianci Liu, Jian Kang (2023) “Density Regression and Uncertainty Quantification with Bayesian Deep Noise Neural Networks.” Stat, 12(1), e604.

Laura Zichi, Tianci Liu, Elizabeth Drueke, Liuyan Zhao, Gongjun Xu (2023) “Physically informed machine-learning algorithms for the identification of two-dimensional atomic crystals.” Scientific Reports, 13(1), 6143.

Tianci Liu, Chun Wang, Gongjun Xu (2022) “Estimating three- and four-parameter MIRT models with importance-weighted sampling enhanced variational auto-encoder.” Frontiers in Psychology, 13, 935419.

(* = equal contribution)