Haiku Generation ~ A Transformer-Based Approach ~ With Lots Of Control

I spent the last few months diving deep into Natural Language Understanding, learning from Professor Christopher Potts. My end-of-course project is a haiku generator that I enormously enjoyed putting together.


Haiku generation has attracted the interest of the NLP community for decades. After researching different ways of imbuing language models with a poetic sense, we introduce Haikoo, a transformer-based model that outperforms previous state-of-the-art neural haiku generators.
Haikoo consists of two main pieces: a GPT-2 model fine-tuned on haiku poems, and a Plug and Play Language Model (PPLM), which we employ to control the generated results to a further extent than the classic prompting approach. We found that GPT-2 successfully learns to generalize many of the qualities of haiku poetry while retaining enough flexibility to compose poems about entities never seen in the training data. PPLM, on the other hand, acts as a rudder, steering the generated output toward a specific concept or word. Haikoo helps search the space of haiku poems satisfying poetic constraints, producing results that make sense to most readers and that range from lyrical to hilarious.
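To give a flavor of the "rudder" idea, here is a deliberately simplified, self-contained sketch. Real PPLM takes gradient steps on the transformer's hidden activations using an attribute model; this toy instead adds a fixed bias to the logits of a bag of topic words before sampling, which captures only the steering *effect*, not the actual algorithm. All vocabulary, scores, and the `topic_words` set are made up for illustration and do not come from the paper.

```python
import math
import random

# Toy "language model": base logits over a tiny vocabulary.
# All words and numbers are illustrative, not from Haikoo.
base_logits = {
    "moon": 1.0, "frog": 1.0, "pond": 0.8,
    "rain": 0.9, "autumn": 0.7, "silence": 0.6,
}

# Bag-of-words stand-in for PPLM's attribute model: the target topic.
topic_words = {"frog", "pond"}


def steered_sample(logits, topic, strength=2.0, rng=random):
    """Sample one word, nudging probability mass toward topic words.

    PPLM proper perturbs hidden states via attribute-model gradients;
    this additive logit bias is only a rough illustration of steering.
    """
    biased = {w: l + (strength if w in topic else 0.0)
              for w, l in logits.items()}
    z = sum(math.exp(l) for l in biased.values())
    r, acc = rng.random(), 0.0
    for w, l in biased.items():
        acc += math.exp(l) / z
        if r < acc:
            return w
    return w  # numerical edge case: return last word


if __name__ == "__main__":
    random.seed(0)
    draws = [steered_sample(base_logits, topic_words) for _ in range(1000)]
    share = sum(w in topic_words for w in draws) / len(draws)
    # With steering, most of the sampled words land on the topic.
    print(f"share of topic words: {share:.2f}")
```

With `strength=0.0` the sampler reduces to the unsteered base distribution, so the single knob trades off fluency against topical control, mirroring the step-size knob in gradient-based PPLM.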

Read the rest of the paper

Oh, and check out the weekend project that spun off from this paper; I think of it as an ideal descendant of Cutup!
