  1. Article

    Fine-tuning pretrained transformer encoders for sequence-to-sequence learning

    In this paper, we introduce s2s-ft, a method for adapting pretrained bidirectional Transformer encoders, such as BERT and RoBERTa, to sequence-to-sequence tasks like abstractive summarization and question generation...

    Hangbo Bao, Li Dong, Wenhui Wang, Nan Yang in International Journal of Machine Learning and Cybernetics (2024)
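
    The snippet above only names the approach; as a loose illustration of the general idea (adapting a bidirectional encoder to sequence-to-sequence learning), the sketch below builds a UniLM-style self-attention mask over a concatenated source-target sequence: source tokens attend bidirectionally within the source, while target tokens attend to the source and causally to earlier target positions. This is a minimal sketch under those assumptions, not the s2s-ft implementation, and the function name is hypothetical.

    ```python
    import torch

    def s2s_attention_mask(src_len: int, tgt_len: int) -> torch.Tensor:
        """Hypothetical sketch of a seq2seq fine-tuning attention mask
        for a bidirectional encoder (not the actual s2s-ft code).

        Positions 0..src_len-1 hold the source, the rest the target.
        Returns an (n, n) boolean mask; mask[i, j] is True when
        position i may attend to position j.
        """
        n = src_len + tgt_len
        mask = torch.zeros(n, n, dtype=torch.bool)
        # Source tokens attend bidirectionally, but only within the source.
        mask[:src_len, :src_len] = True
        # Target tokens attend to the full source...
        mask[src_len:, :src_len] = True
        # ...and causally to themselves and earlier target tokens.
        mask[src_len:, src_len:] = torch.tril(
            torch.ones(tgt_len, tgt_len)
        ).bool()
        return mask

    print(s2s_attention_mask(3, 2).int())
    ```

    In setups of this kind, target tokens are masked and predicted under such a mask during fine-tuning, so a single set of encoder weights plays both the encoder and decoder roles.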

  2. Chapter and Conference Paper

    Neural Melody Composition from Lyrics

    In this paper, we study a novel task that learns to compose music from natural language. Given the lyrics as input, we propose a melody composition model that generates lyrics-conditional melody as well as the...

    Hangbo Bao, Shaohan Huang, Furu Wei, Lei Cui in Natural Language Processing and Chinese Computing (2019)

  3. Chapter and Conference Paper

    Neural Question Generation from Text: A Preliminary Study

    Automatic question generation aims to generate questions from a text passage where the generated questions can be answered by certain sub-spans of the given passage. Traditional methods mainly use rigid heuristic rules...

    Qingyu Zhou, Nan Yang, Furu Wei, Chuanqi Tan in Natural Language Processing and Chinese Computing (2018)
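
    Answer-aware question generation models of this kind condition on both the passage and the answer span. One common input encoding (an assumption here, not necessarily this paper's exact feature scheme) tags each passage token with a BIO-style answer-position feature before the sequence is fed to a seq2seq encoder; a minimal sketch:

    ```python
    def answer_position_features(passage_tokens, answer_start, answer_len):
        """Tag passage tokens with BIO answer-position features, a common
        input encoding for answer-aware question generation (illustrative
        sketch; the exact feature scheme is an assumption).

        B = first answer token, I = inside the answer, O = outside.
        """
        tags = []
        for i, _ in enumerate(passage_tokens):
            if i == answer_start:
                tags.append("B")
            elif answer_start < i < answer_start + answer_len:
                tags.append("I")
            else:
                tags.append("O")
        return list(zip(passage_tokens, tags))

    tokens = "the eiffel tower was completed in 1889 .".split()
    # Answer span "1889" (index 6, length 1); a generated question for
    # this span might be "When was the Eiffel Tower completed?".
    print(answer_position_features(tokens, 6, 1))
    ```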