Professor Aji explores efficient Natural Language Processing (NLP) through model compression and distillation, and NLP for under-resourced languages. This work involves dataset curation and construction, data-efficient learning and adaptation, zero-shot approaches, and building multilingual language models. He is currently active in the Indonesian and South-East Asian NLP research communities.
Prior to joining MBZUAI, Professor Aji was an applied research scientist at Amazon and a postdoctoral fellow at the Institute for Language, Cognition and Computation at the University of Edinburgh. During his postdoctoral and Ph.D. work, he contributed to efficient NMT projects such as Marian, a fast NMT framework, and browser-based translation that runs without the cloud.
Aside from efficient NLP, he is now interested in developing datasets and systems for multilingual NLP, especially for under-resourced languages. Aji also co-initiated IndoNLP, a community-based movement to enable and advance NLP research for Indonesian languages.
Before entering the world of AI and NLP, Professor Aji was active in competitive programming, winning a silver medal for Indonesia at the 2010 International Olympiad in Informatics. He has worked at several well-known companies, including Apple (language engineer, 2015), Google (intern, 2017), and Amazon (applied scientist, 2021), as well as at an Indonesian start-up focused on conversational AI.
Aji's fields of interest include deep learning, computational linguistics, machine translation, efficient and distributed machine learning, low-resource NLP, multilingual NLP, and dataset construction.