Welcome to my homepage!
I’m a Computer Science MSc student at The University of British Columbia, working with Prof. Ivan Beschastnikh and Prof. Mathias Lécuyer. I completed my Bachelor’s in Computing at The Hong Kong Polytechnic University.
My research interests include distributed systems, cloud computing, and machine learning. In 2019, I interned at Microsoft Research Asia (MSRA), working with Dr. Yang Chen on Forerunner, a novel technique that accelerates the block validation process in Ethereum. In 2020, I completed my undergraduate thesis under the supervision of Prof. Song Guo, developing new compression techniques for distributed machine learning. Currently, at UBC, we are collaborating with Huawei on evaluating compression techniques in federated learning.
I love solving algorithmic problems. Over the years, I have participated in a number of programming contests, including the ACM ICPC Regional Contest, IEEEXtreme, and the National Olympiad in Informatics in Provinces. I am also an experienced Codeforces user :).
July 2022 - Join our competition at NeurIPS 2022!
NL4Opt: Formulating Optimization Problems Based on Their Natural Language Descriptions
Publications & Pre-prints
GlueFL: Reconciling Client Sampling and Model Masking for Bandwidth Efficient Federated Learning
Shiqi He, Qifan Yan, Feijie Wu, Lanjun Wang, Mathias Lécuyer, Ivan Beschastnikh
Submitted to Conference on Machine Learning and Systems (MLSys 2023)
Anchor Sampling for Federated Learning with Partial Client Participation
Feijie Wu, Song Guo, Zhihao Qu, Shiqi He, Ziming Liu
Submitted to International Conference on Learning Representations (ICLR 2023) [.pdf]
Augmenting Operations Research with Auto-Formulation of Optimization Models from Problem Descriptions
Rindranirina Ramamonjison, Haley Li, Timothy T. Yu, Shiqi He, Vishnu Rengan, Amin Banitalebi-Dehkordi, Zirui Zhou, Yong Zhang
To appear in Empirical Methods in Natural Language Processing (EMNLP 2022) [.pdf]
Sign Bit is Enough: A Learning Synchronization Framework for Multi-hop All-reduce with Ultimate Compression
Feijie Wu*, Shiqi He*, Song Guo, Zhihao Qu, Haozhao Wang, Weihua Zhuang, Jie Zhang
Design Automation Conference (DAC 2022) [.pdf]
On the Convergence of Quantized Parallel Restarted SGD for Serverless Learning
Feijie Wu, Shiqi He, Yutong Yang, Haozhao Wang, Zhihao Qu, Song Guo
arXiv preprint arXiv:2004.09125 [.pdf]