© 2020 Strange Loop
Poetry can take many forms, but it could be defined as a delicate balance between constraints and expression. Computers are pretty good at constraints, less so at expression.
Cracking the structure and meaning of a text has long been one of the goals of linguistics and Natural Language Processing. It was first addressed in a reductionist way, by decomposing a sentence, counting, and matching; then, later, by learning from evidence and statistics; and, as has lately proven more and more effective, by training neural networks with a minimal set of assumptions, as shown in the seminal paper “Natural Language Processing (almost) from Scratch”.
In this talk we'll see how to apply these techniques to spot poetry in unlikely textual places, or to generate it (almost) from scratch, using a minimal set of assumptions, hoping for the meter and rhyme rules to emerge. Along the way we'll touch on topics such as the impossible definition of poetry, and the tension between simple, classical rules and deep learning models that can defy interpretation.
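To give a flavor of the "simple, classical rules" side of that tension, here is a minimal sketch, in plain Python, of the kind of crude heuristics one might start from: counting syllables as groups of consecutive vowels, and approximating rhyme by the final vowel cluster of a line's last word. The function names and heuristics are my own illustration, not the talk's actual code, and real systems would use a pronunciation dictionary rather than spelling.

```python
import re

def syllable_count(word):
    # Rough heuristic: one syllable per group of consecutive vowels.
    # (Misses silent e's and diphthong subtleties, by design.)
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def line_syllables(line):
    # Approximate the meter by summing per-word syllable counts.
    return sum(syllable_count(w) for w in re.findall(r"[a-zA-Z']+", line))

def rhyme_key(line):
    # Crude rhyme proxy: the last vowel group (plus any trailing
    # consonants) of the line's final word. Lines with equal keys
    # are treated as rhyming candidates.
    words = re.findall(r"[a-zA-Z']+", line.lower())
    if not words:
        return ""
    m = re.search(r"[aeiouy]+[^aeiouy]*$", words[-1])
    return m.group(0) if m else words[-1]
```

Heuristics like these fail often (English spelling is a poor guide to sound), which is exactly the gap that statistical and neural approaches try to close by learning pronunciation and structure from data instead.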
My day job and interests revolve around distributed systems that stream buckets of bytes, most recently to perform anomaly and fraud detection at a French Ad Retargeting Company. My other interests often revolve around streams of characters, from books to NLP. My greatest achievement - and failure - was taking a sabbatical to write a novel on a remote island, and ending up making a zombie movie instead.