
AACL-IJCNLP2022_Efficient_Robust_KGC_Tutorial

Materials for the AACL-IJCNLP 2022 tutorial: Efficient and Robust Knowledge Graph Construction

Tutorial abstract [PDF]

Knowledge graph construction, which aims to extract knowledge from text corpora, has long appealed to researchers in the NLP community. Previous decades have witnessed remarkable progress in knowledge graph construction based on neural models; however, these models often require massive computation or labeled data and suffer from unstable inference on biased or adversarial samples. Recently, numerous approaches have been explored to mitigate the efficiency and robustness issues of knowledge graph construction, such as prompt learning and adversarial training. In this tutorial, we aim to bring interested NLP researchers up to speed on the recent and ongoing techniques for efficient and robust knowledge graph construction. Additionally, our goal is to provide a systematic and up-to-date overview of these methods and reveal new research opportunities to the audience.
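To make the prompt-learning idea above concrete, here is a minimal sketch (not taken from the tutorial materials; the model, template, and label words are assumptions for illustration) that casts relation extraction, a core knowledge graph construction task, as cloze-style prediction with a masked language model:

```python
# Illustrative sketch only: prompt-based relation extraction as cloze filling.
# The model name, template, and label-word mapping below are assumptions,
# not the tutorial's actual implementation.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "Steve Jobs co-founded Apple in 1976."
head, tail = "Steve Jobs", "Apple"

# Cloze template: the masked language model fills [MASK] with a relation word.
prompt = f"{sentence} {head} is the [MASK] of {tail}."

# Hypothetical label words mapping predicted tokens to KG relation types.
label_words = {"founder": "org:founded_by", "ceo": "org:top_members"}

for pred in fill_mask(prompt, top_k=10):
    word = pred["token_str"].strip()
    if word in label_words:
        print(f"{label_words[word]}\t{pred['score']:.3f}")
```

Because only a template and a small label-word map are added on top of a pretrained model, such prompt-based formulations can work with far less labeled data than training a task-specific classifier from scratch, which is the efficiency angle discussed in the tutorial.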

If you find this tutorial helpful for your work, please kindly cite our paper.

@inproceedings{zhang-etal-2022-efficient-robust,
    title = "Efficient and Robust Knowledge Graph Construction",
    author = "Zhang, Ningyu  and
      Gui, Tao  and
      Nan, Guoshun",
    booktitle = "Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing: Tutorial Abstracts",
    month = nov,
    year = "2022",
    address = "Taipei",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.aacl-tutorials.1",
    pages = "1--7",
    abstract = "Knowledge graph construction which aims to extract knowledge from the text corpus, has appealed to the NLP community researchers. Previous decades have witnessed the remarkable progress of knowledge graph construction on the basis of neural models; however, those models often cost massive computation or labeled data resources and suffer from unstable inference accounting for biased or adversarial samples. Recently, numerous approaches have been explored to mitigate the efficiency and robustness issues for knowledge graph construction, such as prompt learning and adversarial training. In this tutorial, we aim to bring interested NLP researchers up to speed on the recent and ongoing techniques for efficient and robust knowledge graph construction. Additionally, our goal is to provide a systematic and up-to-date overview of these methods and reveal new research opportunities to the audience.",
}

Tutorial Materials

1. Slides [Introduction] [EfficientKGC] [RobustKGC] [Conclusion]

2. Video [AllParts]

3. Related Tutorials:

4. Survey:

   - Knowledge Graph Construction
   - Efficient NLP
   - Low-resource Learning

5. Reading list:

Tutorial schedule

| Local time (GMT) | Content                                | Presenter    | Slides   |
| ---------------- | -------------------------------------- | ------------ | -------- |
| 09:00-10:00      | Introduction and Applications          | Guoshun Nan  | [Slides] |
| 10:00-11:00      | Efficient Knowledge Graph Construction | Ningyu Zhang | [Slides] |
| 11:00-11:50      | Robust Knowledge Graph Construction    | Tao Gui      | [Slides] |
| 11:50-12:00      | Summary                                | Ningyu Zhang | [Slides] |

Presenters

  <img src="imgs/ningyu.jpg" width="120" height="150" align="center">   <img src="imgs/guitao.jpg" width="120" height="150" align="center">   <img src="imgs/guoshun.jpg" width="120" height="150" align="center">

Ningyu Zhang           Tao Gui               Guoshun Nan

Ningyu Zhang is an associate professor at Zhejiang University. His main research interests are knowledge graphs, NLP, etc. He has published papers in top international academic conferences and journals such as NeurIPS/ICLR/WWW/KDD/WSDM/AAAI/IJCAI/ACL/EMNLP/NAACL/COLING/SIGIR/TASLP/ESWA/KBS/Journal of Software/Nature Communications. Three of his papers have been selected as Paper Digest Most Influential Papers (KnowPrompt at WWW 2022, DocuNet at IJCAI 2021, and AliCG at KDD 2021). He has served as a PC member for NeurIPS/ICLR/ICML/KDD/AAAI/IJCAI/ACL/EMNLP/NAACL and as a reviewer for TKDE/WWWJ/JWS/TALLIP/IEEE Transactions on Cybernetics/ESWA.

Tao Gui is an associate professor at the Institute of Modern Languages and Linguistics of Fudan University and a key member of the FudanNLP group. He is a member of ACL, a member of the Youth Working Committee of the Chinese Information Processing Society of China, and a member of the Language and Knowledge Computing Professional Committee of the Chinese Information Processing Society of China. He has published more than 30 papers in top international academic conferences and journals such as ACL, EMNLP, AAAI, IJCAI, and SIGIR. He has served as Editor-in-Chief of the NLPR Information Extraction Special Issue, as a PC member for SIGIR, AAAI, and IJCAI, and as a reviewer for TPAMI and ARR. He has received the Outstanding Doctoral Dissertation Award of the Chinese Information Processing Society of China, the Area Chair Favorite Award of COLING 2018, and the Outstanding Paper Award of NLPCC 2019, and he is a scholar of the Young Talent Promoting Project of CAST.

Guoshun Nan is a tenure-track professor in the School of Cyber Science and Engineering, Beijing University of Posts and Telecommunications (BUPT). He is a key member of the National Engineering Research Center of Mobile Network Security and a member of the Wireless Technology Innovation Institute of BUPT. Before starting his academic career, he worked at Hewlett-Packard (China) for more than four years as an engineer. He is a member of ACL. He has broad interests in information extraction, model robustness, multimodal retrieval, cyber security, and next-generation wireless networks. He has published more than 10 papers in top-tier conferences such as ACL, CVPR, EMNLP, SIGIR, IJCAI, CIKM, and SIGCOMM. He has served as a reviewer for ACL, EMNLP, AAAI, IJCAI, Neurocomputing, and IEEE Transactions on Image Processing.