Representation Learning for Conversational AI + Auxiliary Tasks
Abstract
As the first part of my investigation into representation learning for conversational AI, I plan to work on pretrained language models (PLMs) for open-domain dialog (ODD) systems. My research will investigate whether incorporating auxiliary tasks can improve the contextual consistency of ODD systems. I need access to powerful computational resources, since I plan to work with GPT, BERT, and BART models at various architectural scales.
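A common way to incorporate auxiliary tasks into PLM fine-tuning is to optimize a weighted sum of the primary objective and one or more auxiliary objectives. The following is a minimal illustrative sketch of that loss combination; the specific auxiliary task, loss values, and weight are hypothetical placeholders, not part of the proposal itself.

```python
def combined_loss(lm_loss: float, aux_loss: float, aux_weight: float = 0.5) -> float:
    """Weighted sum of a primary language-modeling loss and an
    auxiliary-task loss (e.g., a hypothetical consistency classifier)."""
    return lm_loss + aux_weight * aux_loss

# Example with placeholder values: a response-generation loss of 2.0
# and an auxiliary loss of 1.0, weighted at 0.5.
total = combined_loss(2.0, 1.0, aux_weight=0.5)
print(total)  # 2.5
```

In practice, the weight on the auxiliary loss is a hyperparameter that balances the auxiliary signal against the primary dialog objective.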