Working Drafts
Please send us an email if you have any suggestions regarding the following working drafts. As a thank-you for helping improve the work, we will buy you a cup of Naixue (奈雪) or Hey Tea (喜茶) :)
- Chang Ma, Haiteng Zhao, Lin Zheng, Jiayi Xin, Qintong Li, Lijun Wu, Zhihong Deng, Yang Lu, Qi Liu, and Lingpeng Kong, Retrieved Sequence Augmentation for Protein Representation Learning, arXiv:2302.12563, 2023.2
- Lin Zheng, Jianbo Yuan, Lei Yu, and Lingpeng Kong, A Reparameterized Discrete Diffusion Model for Text Generation, arXiv:2302.04542, 2023.2
Publications
2023
- Zhiyong Wu*, Yaoxiang Wang*, Jiacheng Ye*, and Lingpeng Kong, Self-adaptive In-context Learning, In Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL 2023), Toronto, Canada, 2023.7
- Qintong Li, Zhiyong Wu, Lingpeng Kong, and Wei Bi, Explanation Regeneration via Information Bottleneck, In Findings of the Annual Meeting of the Association for Computational Linguistics (ACL 2023 Findings), Toronto, Canada, 2023.7
- Fei Yuan, Yinquan Lu, Wenhao Zhu, Lingpeng Kong, Lei Li, and Jingjing Xu, Lego-MT: Learning Detachable Models for Massively Multilingual Machine Translation, In Findings of the Annual Meeting of the Association for Computational Linguistics (ACL 2023 Findings), Toronto, Canada, 2023.7
- Wenhao Zhu, Jingjing Xu, Shujian Huang, Lingpeng Kong, and Jiajun Chen, INK: Injecting kNN Knowledge in Nearest Neighbor Machine Translation, In Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL 2023), Toronto, Canada, 2023.7
- Jiyue Jiang, Sheng Wang, Qintong Li, Lingpeng Kong, and Chuan Wu, A Cognitive Stimulation Therapy Dialogue System with Multi-Source Knowledge Fusion for Elders with Cognitive Impairment, In Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL 2023), Toronto, Canada, 2023.7
- Xueliang Zhao, Tingchen Fu, Lemao Liu, Lingpeng Kong, Shuming Shi, and Rui Yan, SORTIE: Dependency-Aware Symbolic Reasoning for Logical Data-to-text Generation, In Findings of the Annual Meeting of the Association for Computational Linguistics (ACL 2023 Findings), Toronto, Canada, 2023.7
- Jiacheng Ye, Zhiyong Wu, Jiangtao Feng, Tao Yu, and Lingpeng Kong, Compositional Exemplars for In-context Learning, In Proceedings of the International Conference on Machine Learning (ICML 2023), Honolulu, Hawaii, 2023.7
- Jun Zhang*, Shuyang Jiang*, Jiangtao Feng, Lin Zheng, and Lingpeng Kong, CAB: Comprehensive Attention Benchmarking on Long Sequence Modeling, In Proceedings of the International Conference on Machine Learning (ICML 2023), Honolulu, Hawaii, 2023.7
- Xuyang Shen, Dong Li, Jinxing Zhou, Zhen Qin, Bowen He, Xiaodong Han, Aixuan Li, Yuchao Dai, Lingpeng Kong, Meng Wang, and Yiran Zhong, Fine-grained Audible Video Description, In Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR 2023), Vancouver, Canada, 2023.6
- Lin Zheng, Jianbo Yuan, Chong Wang, and Lingpeng Kong, Efficient Attention via Control Variates, In International Conference on Learning Representations (ICLR 2023), Kigali, Rwanda, 2023.5 [Oral]
- Jiahui Gao, Renjie Pi, Lin Yong, Hang Xu, Jiacheng Ye, Zhiyong Wu, Weizhong Zhang, Xiaodan Liang, Zhenguo Li, and Lingpeng Kong, Self-Guided Noise-Free Data Generation for Efficient Zero-Shot Learning, In International Conference on Learning Representations (ICLR 2023), Kigali, Rwanda, 2023.5 [Spotlight]
- Zhen Qin, Xiaodong Han, Weixuan Sun, Bowen He, Dong Li, Dongxu Li, Yuchao Dai, Lingpeng Kong, and Yiran Zhong, Toeplitz Neural Network for Sequence Modeling, In International Conference on Learning Representations (ICLR 2023), Kigali, Rwanda, 2023.5 [Spotlight]
- Shansan Gong, Mukai Li, Jiangtao Feng, Zhiyong Wu, and Lingpeng Kong, DiffuSeq: Sequence to Sequence Text Generation with Diffusion Models, In International Conference on Learning Representations (ICLR 2023), Kigali, Rwanda, 2023.5
- Sijie Chen, Zhiyong Wu, Jiangjie Chen, Zhixing Li, Yang Liu, and Lingpeng Kong, Unsupervised Explanation Generation via Correct Instantiations, In Proceedings of AAAI Conference on Artificial Intelligence (AAAI 2023), Washington, DC, 2023.2
2022
- Jiacheng Ye, Jiahui Gao, Zhiyong Wu, Jiangtao Feng, Tao Yu, and Lingpeng Kong, ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback, In Findings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2022 Findings), Abu Dhabi, 2022.12
- Jiacheng Ye*, Jiahui Gao*, Qintong Li, Hang Xu, Jiangtao Feng, Zhiyong Wu, Tao Yu, and Lingpeng Kong, ZeroGen: Efficient Zero-shot Learning via Dataset Generation, In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2022), Abu Dhabi, 2022.12
- Tianbao Xie*, Chen Henry Wu*, Peng Shi, Ruiqi Zhong, Torsten Scholak, Michihiro Yasunaga, Chien-Sheng Wu, Ming Zhong, Pengcheng Yin, Sida Wang, Victor Zhong, Bailin Wang, Chengzu Li, Connor Boyle, Ansong Ni, Ziyu Yao, Dragomir Radev, Caiming Xiong, Lingpeng Kong, Rui Zhang, Noah A. Smith, Luke Zettlemoyer, and Tao Yu, UnifiedSKG: Unifying and Multi-Tasking Structured Knowledge Grounding with Text-to-Text Language Models, In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2022), Abu Dhabi, 2022.12
- Zhen Qin, Xiaodong Han, Weixuan Sun, Dongxu Li, Lingpeng Kong, Nick Barnes, and Yiran Zhong, The Devil in Linear Transformer, In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2022), Abu Dhabi, 2022.12
- Changlong Yu, Tianyi Xiao, Lingpeng Kong, Yangqiu Song, and Wilfred Ng, An Empirical Revisiting of Linguistic Knowledge Fusion in Language Understanding Tasks, In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2022), Abu Dhabi, 2022.12
- Chenxin An, Jiangtao Feng, Kai Lv, Lingpeng Kong, Xipeng Qiu, and Xuanjing Huang, CoNT: Contrastive Neural Text Generation, In Advances in Neural Information Processing Systems (NeurIPS 2022), New Orleans, Louisiana, 2022.11 [Spotlight]
- Yixuan Su, Tian Lan, Yan Wang, Dani Yogatama, Lingpeng Kong, and Nigel Collier, A Contrastive Framework for Neural Text Generation, In Advances in Neural Information Processing Systems (NeurIPS 2022), New Orleans, Louisiana, 2022.11 [Spotlight]
- Jinxing Zhou, Jianyuan Wang, Jiayi Zhang, Weixuan Sun, Jing Zhang, Stan Birchfield, Dan Guo, Lingpeng Kong, Meng Wang, and Yiran Zhong, Audio-Visual Segmentation, In Proceedings of the European Conference on Computer Vision (ECCV 2022), 2022.10
- Lin Zheng, Chong Wang, and Lingpeng Kong, Linear Complexity Randomized Self-attention Mechanism, In Proceedings of the International Conference on Machine Learning (ICML 2022), 2022.7
- Lin Zheng, Huijie Pan, and Lingpeng Kong, Ripple Attention for Visual Perception with Sub-quadratic Complexity, In Proceedings of the International Conference on Machine Learning (ICML 2022), 2022.7
- Jakob Prange, Nathan Schneider, and Lingpeng Kong, Linguistic Frameworks Go Toe-to-Toe at Neuro-Symbolic Language Modeling, In Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2022), 2022.7
- Qintong Li, Piji Li, Wei Bi, Zhaochun Ren, Yuxuan Lai, and Lingpeng Kong, Event Transition Planning for Open-ended Text Generation, In Findings of the Annual Meeting of the Association for Computational Linguistics (ACL 2022 Findings), 2022.5
- Zhiyong Wu, Wei Bi, Xiang Li, Lingpeng Kong, and Ben Kao, Lexical Knowledge Internalization for Neural Dialog Generation, In Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL 2022), 2022.5
- Hao Peng, Jungo Kasai, Nikolaos Pappas, Dani Yogatama, Zhaofeng Wu, Lingpeng Kong, Roy Schwartz, and Noah A. Smith, ABC: Attention with Bounded-memory Control, In Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL 2022), 2022.5
- Zhen Qin, Weixuan Sun, Hui Deng, Dongxu Li, Yunshen Wei, Baohong Lv, Junjie Yan, Lingpeng Kong, and Yiran Zhong, cosFormer: Rethinking Softmax In Attention, In International Conference on Learning Representations (ICLR 2022), 2022.4
- Han Shi*, Jiahui Gao*, Hang Xu, Xiaodan Liang, Zhenguo Li, Lingpeng Kong, Stephen M. S. Lee, and James Kwok, Revisiting Over-smoothing in BERT from the Perspective of Graph, In International Conference on Learning Representations (ICLR 2022), 2022.4 [Spotlight]
2021
- Lin Zheng, Zhiyong Wu, and Lingpeng Kong, Cascaded Head-colliding Attention, In Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL 2021), 2021.8
- Zhiyong Wu, Lingpeng Kong, Wei Bi, Xiang Li, and Ben Kao, Good for Misconceived Reasons: An Empirical Revisiting on the Need for Visual Context in Multimodal Machine Translation, In Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL 2021), 2021.8
- Dani Yogatama, Cyprien de Masson d'Autume, and Lingpeng Kong, Adaptive Semiparametric Language Models, Transactions of the Association for Computational Linguistics (TACL), 2021
- Hao Peng, Nikolaos Pappas, Dani Yogatama, Roy Schwartz, Noah A. Smith, and Lingpeng Kong, Random Feature Attention, In International Conference on Learning Representations (ICLR 2021), Vienna, Austria, 2021.5 [Spotlight]
2020
- Adhiguna Kuncoro*, Lingpeng Kong*, Daniel Fried*, Dani Yogatama, Laura Rimell, Chris Dyer, and Phil Blunsom, Syntactic Structure Distillation Pretraining For Bidirectional Encoders, Transactions of the Association for Computational Linguistics (TACL), 2020.9
- Lei Yu, Laurent Sartran, Wojciech Stokowiec, Wang Ling, Lingpeng Kong, Phil Blunsom, and Chris Dyer, Better Document-level Machine Translation with Bayes' Rule, Transactions of the Association for Computational Linguistics (TACL), 2020.4
- Lingpeng Kong, Cyprien de Masson d'Autume, Wang Ling, Lei Yu, Zihang Dai, and Dani Yogatama, A Mutual Information Maximization Perspective of Language Representation Learning, In International Conference on Learning Representations (ICLR 2020), Ethiopia, 2020.4 [Spotlight]
2019
- Cyprien de Masson d'Autume, Sebastian Ruder, Lingpeng Kong, and Dani Yogatama, Episodic Memory in Lifelong Language Learning, In Advances in Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada, 2019.11
- Lingpeng Kong, Gabor Melis, Wang Ling, Lei Yu, and Dani Yogatama, Variational Smoothing in Recurrent Neural Network Language Models, In International Conference on Learning Representations (ICLR 2019), New Orleans, Louisiana, 2019.5
- Dani Yogatama, Cyprien de Masson d'Autume, Jerome Connor, Tomas Kocisky, Mike Chrzanowski, Lingpeng Kong, Angeliki Lazaridou, Wang Ling, Lei Yu, Chris Dyer, and Phil Blunsom, Learning and Evaluating General Linguistic Intelligence, arXiv:1901.11373, 2019.2
2018
- Jiangtao Feng, Lingpeng Kong, Po-Sen Huang, Chong Wang, Da Huang, Jiayuan Mao, Kan Qiao, and Dengyong Zhou, Neural Phrase-to-Phrase Machine Translation, arXiv:1811.02172, 2018.11
- Lei Yu, Cyprien de Masson d'Autume, Chris Dyer, Phil Blunsom, Lingpeng Kong, and Wang Ling, Sentence Encoding with Tree-constrained Relation Networks, arXiv:1811.10475, 2018.11
2017
- Lingpeng Kong, Neural Representation Learning in Linguistic Structured Prediction, Ph.D. thesis, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, 2017.9
- Hao Tang, Liang Lu, Lingpeng Kong, Kevin Gimpel, Karen Livescu, Chris Dyer, Noah A. Smith, and Steve Renals, End-to-End Neural Segmental Models for Speech Recognition, IEEE Journal of Selected Topics in Signal Processing, 2017.8
- Liang Lu, Lingpeng Kong, Chris Dyer, and Noah A. Smith, Multi-task Learning with CTC and Segmental CRF for Speech Recognition, In Proceedings of the Annual Conference of the International Speech Communication Association (INTERSPEECH 2017), Stockholm, Sweden, 2017.8
- Adhiguna Kuncoro, Miguel Ballesteros, Lingpeng Kong, Chris Dyer, Graham Neubig, and Noah A. Smith, What Do Recurrent Neural Network Grammars Learn About Syntax?, In Proceedings of the Conference of the European Chapter of the Association for Computational Linguistics (EACL 2017), Valencia, Spain, 2017.4 [Outstanding Paper Award]
- Chris Alberti, Daniel Andor, Ivan Bogatyy, Michael Collins, Dan Gillick, Lingpeng Kong, Terry Koo, Ji Ma, Mark Omernick, Slav Petrov, Chayut Thanapirom, Zora Tung, and David Weiss, SyntaxNet Models for the CoNLL 2017 Shared Task, arXiv:1703.04929, 2017.3
- Lingpeng Kong, Chris Alberti, Daniel Andor, Ivan Bogatyy, and David Weiss, DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks, arXiv:1703.04474, 2017.3
- Graham Neubig, Chris Dyer, Yoav Goldberg, Austin Matthews, Waleed Ammar, Antonios Anastasopoulos, Miguel Ballesteros, David Chiang, Daniel Clothiaux, Trevor Cohn, Kevin Duh, Manaal Faruqui, Cynthia Gan, Dan Garrette, Yangfeng Ji, Lingpeng Kong, Adhiguna Kuncoro, Gaurav Kumar, Chaitanya Malaviya, Paul Michel, Yusuke Oda, Matthew Richardson, Naomi Saphra, Swabha Swayamdipta, and Pengcheng Yin, DyNet: The Dynamic Neural Network Toolkit, arXiv:1701.03980, 2017.1
2016
- Adhiguna Kuncoro, Miguel Ballesteros, Lingpeng Kong, Chris Dyer, and Noah A. Smith, Distilling an Ensemble of Greedy Dependency Parsers into One MST Parser, In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2016), Austin, TX, 2016.11
- Liang Lu*, Lingpeng Kong*, Chris Dyer, Noah A. Smith, and Steve Renals, Segmental Recurrent Neural Networks for End-to-end Speech Recognition, In Proceedings of the Annual Conference of the International Speech Communication Association (INTERSPEECH 2016), San Francisco, California, 2016.9 (*equal contribution)
- Lingpeng Kong, Chris Dyer, and Noah A. Smith, Segmental Recurrent Neural Networks, In International Conference on Learning Representations (ICLR 2016), San Juan, Puerto Rico, 2016.5
- Yangfeng Ji, Trevor Cohn, Lingpeng Kong, Chris Dyer, and Jacob Eisenstein, Document Context Language Models, In International Conference on Learning Representations (ICLR 2016 Workshop Track), San Juan, Puerto Rico, 2016.5
2015
- Dani Yogatama, Lingpeng Kong, and Noah A. Smith, Bayesian Optimization of Text Representations, In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2015), Lisboa, Portugal, 2015.9
- Ting-Hao (Kenneth) Huang, Yun-Nung Chen, and Lingpeng Kong, ACBiMA: Advanced Chinese Bi-Character Word Morphological Analyzer, In Proceedings of The 8th SIGHAN Workshop on Chinese Language Processing (SIGHAN-8), Beijing, China, 2015.7 [software]
- Lingpeng Kong, Alexander M. Rush, and Noah A. Smith, Transforming Dependencies into Phrase Structures, In Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics - Human Language Technologies (NAACL-HLT 2015), Denver, CO, 2015.5 [software]
2014
- Lingpeng Kong, Nathan Schneider, Swabha Swayamdipta, Archna Bhatia, Chris Dyer, and Noah A. Smith, A Dependency Parser for Tweets, In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2014), Doha, Qatar, 2014.10 [slides, software]
- William Yang Wang, Lingpeng Kong, Kathryn Mazaitis, and William W. Cohen, Dependency Parsing for Weibo: An Efficient Probabilistic Logic Programming Approach, In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2014), Doha, Qatar, 2014.10
- Lingpeng Kong and Noah A. Smith, An Empirical Comparison of Parsing Methods for Stanford Dependencies, arXiv:1404.4314, 2014.4
2011
- Lingpeng Kong and Likun Qiu, Formalization and Rules for Recognition of Satirical Irony, In Proceedings of the International Conference on Asian Language Processing, Penang, Malaysia, 2011.11
- Likun Qiu, Lei Wu, Changjian Hu, Kai Zhao, and Lingpeng Kong, Improving Chinese Dependency Parsing with Self-Disambiguating Patterns, In Proceedings of the International Conference on Asian Language Processing, Penang, Malaysia, 2011.11