Hi! I'm Linqing Liu, a PhD student in the NLP group at University College London, co-supervised by Prof. Pontus Stenetorp and Prof. Sebastian Riedel. Before that, I received my thesis-based master's degree from the University of Waterloo, supervised by Prof. Jimmy Lin. I received my bachelor's degree from the School of Software Engineering at Tongji University, Shanghai.
My primary research interest lies at the intersection of Deep Learning and Natural Language Processing (NLP). This is my CV.
MKD: a Multi-Task Knowledge Distillation Approach for Pretrained Language Models
Linqing Liu, Huan Wang, Jimmy Lin, Richard Socher and Caiming Xiong
Incorporating Contextual and Syntactic Structures Improves Semantic Similarity Modeling   [slides]
Linqing Liu, Wei Yang, Jinfeng Rao, Raphael Tang and Jimmy Lin
Bridging the Gap between Relevance Matching and Semantic Matching with Hierarchical Co-Attention Network
Jinfeng Rao, Linqing Liu, Yi Tay, Wei Yang, Peng Shi and Jimmy Lin
Distilling Task-Specific Knowledge from BERT into Simple Neural Networks
Raphael Tang*, Yao Lu*, Linqing Liu*, Lili Mou, Olga Vechtomova, Jimmy Lin
Generative Adversarial Network for Abstractive Text Summarization
Linqing Liu, Yao Lu, Min Yang, Qiang Qu, and Jia Zhu
The 32nd AAAI Conference on Artificial Intelligence (AAAI, student poster), 2018
[supplementary file][output summary]
A Multi-task Learning Framework for Abstractive Text Summarization
Yao Lu, Linqing Liu, Zhile Jiang, Min Yang and Randy Goebel
The 33rd AAAI Conference on Artificial Intelligence (AAAI, student poster), 2019
Detecting "Smart" Spammers on Social Network: A Topic Model Approach
Linqing Liu, Yao Lu, Ye Luo, Renxian Zhang, Laurent Itti, and Jianwei Lu
The Conference of the North American Chapter of the Association for Computational Linguistics (NAACL, student session), 2016
- 2020.9 — Started my Ph.D. at University College London.
- 2020.8 — Successfully defended my master's thesis: Towards Effective Utilization of Pretrained Language Models — Knowledge Distillation from BERT.