Talk - spaCy PyTorch Transformers

Description

Huge transformer models like BERT, GPT-2 and XLNet have set a new standard for accuracy on almost every Natural Language Processing leaderboard. However, these models are very new, and most of the software ecosystem surrounding them is oriented towards the many opportunities for further research that they provide. In this talk, I’ll describe how you can now use these models in spaCy, a popular library for putting Natural Language Processing to work on real problems. I’ll also discuss what new transfer learning technologies can offer production NLP, regardless of which specific software packages you choose to get the job done.
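To give a flavour of what the talk covers, here is a minimal sketch of loading a transformer-backed pipeline through the usual spaCy API. The model name ("en_pytt_bertbaseuncased_lg") and the "pytt_last_hidden_state" extension attribute are assumed to follow the naming used by the spacy-pytorch-transformers package around the time of this talk; check the package documentation for current names.

import spacy

# Load a BERT-backed pipeline (assumed model name; requires the
# spacy-pytorch-transformers package and the corresponding model download).
nlp = spacy.load("en_pytt_bertbaseuncased_lg")

# Process text exactly as with any other spaCy pipeline.
doc = nlp("Apple shares rose on the news. Apple pie is delicious.")

# Similarity and token vectors are driven by the transformer's output,
# so existing downstream code keeps working unchanged.
print(doc[0].similarity(doc[7]))

# The raw transformer activations are exposed as a Doc extension
# (attribute name assumed from the package's conventions).
print(doc._.pytt_last_hidden_state.shape)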

Speaker

Dr. Matthew Honnibal is the original author of the spaCy NLP library and a co-founder of Explosion AI. He has been working on Natural Language Processing technologies since 2005, completing his PhD in 2009 and continuing to publish in academia until 2014, when he decided the open-source ecosystem was missing a library that helped engineers take models from research into production.

He can be reached on Twitter at @honnibal.