
spaCy and the future of multi-lingual NLP

Slides from our talk at the META Forum 2019 award ceremony where spaCy received the META Seal of Recognition.

Ines Montani

October 09, 2019

Transcript

  1. Matthew Honnibal, CO-FOUNDER: PhD in Computer Science in 2009. 10 years publishing research on state-of-the-art natural language understanding systems. Left academia in 2014 to develop spaCy. Ines Montani, CO-FOUNDER: Programmer and front-end developer with a degree in media science and linguistics. Has been working on spaCy since its first release. Lead developer of Prodigy.
  2. Early 2015: spaCy is first released • open-source library for industrial-strength Natural Language Processing • focused on production use. Current stats: 100k+ users worldwide, 15k stars on GitHub, 400 contributors, 60+ extension packages.
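A minimal sketch of what that production focus looks like in code (assuming spaCy and its en_core_web_sm English pipeline are installed; the example text is illustrative):

```python
import spacy

# Load a pre-trained English pipeline (installed separately, e.g.
# python -m spacy download en_core_web_sm).
nlp = spacy.load("en_core_web_sm")

# Processing a text returns a Doc with tokens, part-of-speech tags,
# a dependency parse and named entities out of the box.
doc = nlp("spaCy was first released in early 2015.")

for token in doc:
    print(token.text, token.pos_, token.dep_)

for ent in doc.ents:
    print(ent.text, ent.label_)
```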
  3. Early 2016: German model. Current stats: 52+ supported languages, 23 pre-trained statistical models for 21 languages.
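To illustrate the multi-lingual support (a sketch, assuming the de_core_news_sm German pipeline is installed; Finnish stands in for any supported language without a pre-trained model):

```python
import spacy

# One of the pre-trained statistical models: a German pipeline
# (install with: python -m spacy download de_core_news_sm).
nlp_de = spacy.load("de_core_news_sm")
doc = nlp_de("Berlin ist die Hauptstadt von Deutschland.")
print([(token.text, token.pos_) for token in doc])

# Supported languages without a pre-trained model still get
# language-specific, rule-based defaults such as tokenization.
nlp_fi = spacy.blank("fi")
print([token.text for token in nlp_fi("Hyvää huomenta!")])
```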
  4. Late 2016: Explosion • new company for AI developer tools • bootstrapped through consulting for the first 6 months • funded through software sales since 2017 • remote team, centered in Berlin. Current stats: 7 team members, 100% independent & profitable.
  5. Late 2017: Prodigy • first commercial product • modern annotation tool • fully scriptable in Python. Current stats: 2500+ users, including 250+ companies; 1200+ forum members.
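“Fully scriptable in Python” refers to Prodigy’s recipe system. The sketch below shows the general shape of a custom recipe; the recipe name, arguments and data file are hypothetical, not a verbatim Prodigy example:

```python
import prodigy
from prodigy.components.loaders import JSONL

# A recipe is a plain Python function; the decorator exposes it on
# the command line, e.g.: prodigy classify-texts my_dataset data.jsonl
@prodigy.recipe("classify-texts")
def classify_texts(dataset, source):
    stream = JSONL(source)  # stream of {"text": ...} examples
    return {
        "dataset": dataset,           # where annotations are saved
        "stream": stream,             # examples to show the annotator
        "view_id": "classification",  # one of the built-in interfaces
    }
```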
  6. The good: Universal Dependencies • 100+ treebanks, 70+ languages, 1 annotation scheme • driving lots of new multi-lingual parsing research. More work in the field • huge increase in the number of papers on all NLP topics, including multi-lingual • lots of crossover from general ML work, too. Transfer learning • we’re much better at learning from unlabeled text (e.g. Wikipedia) • leverage resources so we need less annotation per language.
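The “1 annotation scheme” is the CoNLL-U format shared by all Universal Dependencies treebanks: ten tab-separated columns per token (ID, form, lemma, universal POS tag, language-specific tag, morphological features, head, dependency relation, enhanced dependencies, misc). A minimal English example:

```
# text = The dog sleeps.
1	The	the	DET	_	Definite=Def|PronType=Art	2	det	_	_
2	dog	dog	NOUN	_	Number=Sing	3	nsubj	_	_
3	sleeps	sleep	VERB	_	Mood=Ind|Number=Sing|Person=3|Tense=Pres	0	root	_	SpaceAfter=No
4	.	.	PUNCT	_	_	3	punct	_	_
```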
  7. The bad: More competitive • “winning” a shared task could be worth millions (fame and future salary) • few researchers care about the languages. More expensive • experiments now cost a lot to run (especially on GPU) • results are unpredictable • pressure to run fewer experiments on fewer datasets. Faster moving, less careful • huge pressure to publish • volume of publications makes reviewing more random • dynamics promote incremental work.
  8. What we need: Annotated data • annotated carefully, ideally by small teams of experts. Extensive experiments • designed to answer questions, not optimize benchmarks. Maintained datasets • not just static resources that never improve.