Zae Myung Kim

University of Minnesota Twin Cities, Minneapolis, Minnesota, USA


200 Union St SE

Minneapolis, MN 55455

I am a machine learning researcher interested in natural language processing (NLP).
I will be joining the Minnesota NLP group, led by Prof. Dongyeop Kang at the University of Minnesota Twin Cities, as a PhD student in fall 2022.

My research lies at the intersection of discourse analysis, multilinguality, and conditional language generation. In particular, I am currently interested in improving document-level multilingual discourse analysis through self-supervision and multi-task learning, and in applying the learned discourse information to generative tasks such as interactive and iterative text generation.

Previously, I worked as a researcher at NAVER LABS Europe and on the Papago team at NAVER Korea, where I studied various topics in neural machine translation (NMT), such as analysis of language-pair-specific multilingual representations, document-level NMT with discourse information, cross-attention-based website translation, and quality estimation for evaluating NMT models.

I received a B.Eng. degree in Computer Science from Imperial College London in 2011. From 2012 to 2013, I served as an army interpreter in the Republic of Korea Army Special Forces. In 2016, I received an M.S. degree in Computer Science from the Korea Advanced Institute of Science and Technology (KAIST).


May 26, 2022 Our system demonstration of interactive and iterative text revision was presented at the In2Writing workshop at ACL 2022 and received the Best Paper Award 🎉.
May 23, 2022 Our work on iterative text revision was accepted at ACL 2022.
Apr 22, 2022 I am excited about my summer internship at Grammarly, where I will be working on interactive and iterative text revision.
Nov 11, 2021 Our work on visualizing cross-lingual discourse relations was presented at the CODI workshop at EMNLP 2021.
Aug 1, 2021 Our analysis of multilingual representations in NMT was accepted to Findings of ACL 2021.
Nov 20, 2020 Our work on quality estimation for NMT models was accepted to WMT 2020.
Nov 20, 2020 Our work on a multilingual NMT model was presented at the NLP-COVID19 workshop at EMNLP 2020.