One of the hottest fields in machine learning right now is natural language processing. Whether it's gauging sentiment from tweets, summarizing your documents, detecting sarcasm, or predicting stock trends from the news, NLP is definitely the wave of the future. Special guest Daniel Svoboda talks about transfer learning and the latest developments, such as BERT, that promise to revolutionize NLP even further.
Sponsors
- Machine Learning for Software Engineers by Educative.io
- Audible.com
- CacheFly
Panel
- Charles Max Wood
- Gant Laborde
Guest
- Daniel Svoboda
Links
- towardsdatascience.com/bert-explained-state-of-the-art-language-model-for-nlp
- ai.googleblog.com/2018/11/open-sourcing-bert-state-of-art-pre.html
- ai.googleblog.com/2017/08/transformer-novel-neural-network.html
- www.nltk.org
- spacy.io
- www.kaggle.com
Picks
Charles Max Wood:
- Traffic Secrets: The Underground Playbook for Filling Your Websites and Funnels with Your Dream Customers
- Range: Why Generalists Triumph in a Specialized World
Gant Laborde:
Daniel Svoboda:
Follow Adventures in Machine Learning on Twitter: @podcast_ml
Advertising Inquiries: https://redcircle.com/brands
Privacy & Opt-Out: https://redcircle.com/privacy
Become a supporter of this podcast: https://www.spreaker.com/podcast/adventures-in-machine-learning--6102041/support.