spaCy meets Transformers

Huge transformer models like BERT, GPT-2 and XLNet have set a new standard for accuracy on almost every Natural Language Processing leaderboard. However, these models are very new, and most of the software ecosystem surrounding them is oriented towards the many opportunities for further research that they provide. In this talk, I’ll describe how you can now use these models in spaCy, a popular library for putting Natural Language Processing to work on real problems. I’ll also discuss the many opportunities that new transfer learning technologies can offer production NLP, regardless of which specific software packages you choose to get the job done.

Matthew Honnibal

October 12, 2019

Transcript

  1. Matthew Honnibal, CO-FOUNDER: PhD in Computer Science in 2009. 10 years publishing research on state-of-the-art natural language understanding systems. Left academia in 2014 to develop spaCy. Ines Montani, CO-FOUNDER: Programmer and front-end developer with a degree in media science and linguistics. Has been working on spaCy since its first release. Lead developer of Prodigy.
  2. Modular architecture
     • Functions should be small and self-contained
     • Avoid state and side-effects
     • Lots of systems from fewer parts
     Speed and accuracy
     • Small functions make you repeat work
     • Without state, models lose information
     • ML models aren’t really interchangeable anyway
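
A minimal sketch of what "small, self-contained, stateless" means for a spaCy pipeline: the component below is a plain function that reads and writes only the Doc it is given. The component name, the custom extension attribute, and the en_core_web_sm model are illustrative assumptions, not from the slides.

```python
import spacy
from spacy.tokens import Doc

# Hypothetical custom attribute for this example; not part of spaCy itself.
Doc.set_extension("ent_count", default=0)

def count_entities(doc):
    # Small and self-contained: everything the component needs comes in on
    # the Doc, and everything it produces goes back out on the Doc.
    doc._.ent_count = len(doc.ents)
    return doc

nlp = spacy.load("en_core_web_sm")        # assumes the small English model is installed
nlp.add_pipe(count_entities, last=True)   # spaCy v2-style add_pipe with a plain function
doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")
print(doc._.ent_count)
```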
  3. Transformers: Pros
     • Easy network design
     • Great accuracy
     • Need few annotated examples
     Transformers: Cons
     • Slow / expensive
     • Need large batches
     • Bleeding edge
  4. Conclusion
     • pip install spacy-transformers
     • Supports textcat, aligned tokenization, custom models
     • Coming soon: NER, tagging, dependency parsing
     • Coming soon: RPC for the transformer components
     • Coming soon: Transformers support in Prodigy
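
A minimal sketch of the usage this slide points to, assuming the 2019-era spacy-transformers v0.x API and its en_trf_bertbaseuncased_lg model package; the pipeline component names and Doc extension attributes shown follow that release and may differ in later versions.

```python
# pip install spacy-transformers
# python -m spacy download en_trf_bertbaseuncased_lg   # model package name from the v0.x release
import spacy

nlp = spacy.load("en_trf_bertbaseuncased_lg")
doc = nlp("spaCy meets Transformers.")

# The transformer runs as ordinary pipeline components and exposes its
# output through Doc extension attributes (v0.x naming).
print(nlp.pipe_names)                      # e.g. ['sentencizer', 'trf_wordpiecer', 'trf_tok2vec']
print(doc._.trf_word_pieces_)              # wordpiece strings, aligned back to spaCy's tokenization
print(doc._.trf_last_hidden_state.shape)   # (num_wordpieces, hidden_width)

# Text classification on top of the shared transformer output.
textcat = nlp.create_pipe("trf_textcat", config={"exclusive_classes": True})
textcat.add_label("POSITIVE")
textcat.add_label("NEGATIVE")
nlp.add_pipe(textcat)
```

In the v0.x design the transformer runs once per Doc and downstream components such as trf_textcat reuse its output, so adding further tasks (the NER, tagging and parsing the slide lists as coming soon) does not mean rerunning BERT separately for each one.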