About me

I am a fifth-year PhD student in the Department of Computer Science and Technology at Tsinghua University, advised by Prof. Zhiyuan Liu. My research focuses on pre-trained models for natural language processing.

Main Publications

* indicates equal contribution.

  • Zhengyan Zhang*, Zhiyuan Zeng*, Yankai Lin, Huadong Wang, Deming Ye, Chaojun Xiao, et al. Plug-and-Play Knowledge Injection for Pre-trained Language Models. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023). [arxiv]

  • Zhengyan Zhang*, Zhiyuan Zeng*, Yankai Lin, Chaojun Xiao, Xiaozhi Wang, Xu Han, Zhiyuan Liu, et al. Emergent Modularity in Pre-trained Transformers. Findings of ACL 2023. [arxiv]

  • Chenglei Si*, Zhengyan Zhang*, Yingfa Chen*, Xiaozhi Wang, Zhiyuan Liu, Maosong Sun. READIN: A Chinese Multi-Task Benchmark with Realistic and Diverse Input Noises. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023). [arxiv]

  • Chenglei Si*, Zhengyan Zhang*, Yingfa Chen*, Fanchao Qi, Xiaozhi Wang, Zhiyuan Liu, Yasheng Wang, Qun Liu, Maosong Sun. Sub-Character Tokenization for Chinese Pretrained Language Models. Transactions of the Association for Computational Linguistics. [arxiv]

  • Zhengyan Zhang, Baitao Gong, Yingfa Chen, Xu Han, Guoyang Zeng, Weilin Zhao, Yanxu Chen, Zhiyuan Liu, Maosong Sun. BMCook: A Task-agnostic Compression Toolkit for Big Models. Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: System Demonstrations (EMNLP 2022 Demo). [pdf]

  • Zhengyan Zhang, Yankai Lin, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou. MoEfication: Transformer Feed-forward Layers are Mixtures of Experts. Findings of ACL 2022. [pdf]

  • Zhengyan Zhang, Xu Han, Hao Zhou, Pei Ke, Yuxian Gu, Deming Ye, Yujia Qin, Yusheng Su, Haozhe Ji, Jian Guan, Fanchao Qi, Xiaozhi Wang, Yanan Zheng, Guoyang Zeng, Huanqi Cao, Shengqi Chen, Daixuan Li, Zhenbo Sun, Zhiyuan Liu, Minlie Huang, Wentao Han, Jie Tang, Juanzi Li, Xiaoyan Zhu, Maosong Sun. CPM: A Large-scale Generative Chinese Pre-trained Language Model. AI Open. [arxiv] [code] [homepage]

  • Yuan Yao, Haoxi Zhong, Zhengyan Zhang, Xu Han, Xiaozhi Wang, Chaojun Xiao, Guoyang Zeng, Zhiyuan Liu, Maosong Sun. Adversarial Language Games for Advanced Natural Language Intelligence. AAAI Conference on Artificial Intelligence (AAAI 2021). [arxiv]

  • Xiaozhi Wang, Tianyu Gao, Zhaocheng Zhu, Zhengyan Zhang, Zhiyuan Liu, Juanzi Li, Jian Tang. KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation. Transactions of the Association for Computational Linguistics. [pdf] [dataset]

  • Yuxian Gu, Zhengyan Zhang, Xiaozhi Wang, Zhiyuan Liu, Maosong Sun. Train No Evil: Selective Masking for Task-guided Pre-training. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020). [pdf] [code] (short)

  • Zhengyan Zhang*, Xu Han*, Zhiyuan Liu, Xin Jiang, Maosong Sun, Qun Liu. ERNIE: Enhanced Language Representation with Informative Entities. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019). [arxiv] [code]

  • Zhengyan Zhang, Cheng Yang, Zhiyuan Liu, Maosong Sun, Zhichong Fang, Bo Zhang, Leyu Lin. COSINE: Compressive Network Embedding on Large-Scale Information Networks. IEEE Transactions on Knowledge and Data Engineering (TKDE). [pdf]

  • Cunchao Tu, Zhengyan Zhang, Zhiyuan Liu, Maosong Sun. TransNet: Translation-Based Network Representation Learning for Social Relation Extraction. International Joint Conference on Artificial Intelligence (IJCAI 2017). [pdf] [code]

  • Cunchao Tu, Xiangkai Zeng, Hao Wang, Zhengyan Zhang, Zhiyuan Liu, Maosong Sun, Bo Zhang, Leyu Lin. A Unified Framework for Community Detection and Network Representation Learning. IEEE Transactions on Knowledge and Data Engineering (TKDE). [pdf]