Environmental Impact of Large-Scale NLP Model Training with Emma Strubell - TWIML Talk #286
Today we’re joined by Emma Strubell, currently a visiting scientist at Facebook AI Research. Emma’s focus is on NLP and bringing state-of-the-art NLP systems to practitioners by developing efficient and robust machine learning models. Her paper, Energy and Policy Considerations for Deep Learning in NLP, homes in on one of the biggest topics of our time: environmental impact. In this episode we discuss:
- How training larger neural networks has increased accuracy, while the computational resources required to train these models remain staggering, and carbon footprints are only getting bigger
- Emma’s research methods for estimating carbon emissions
- How companies are reacting to environmental concerns
- What we, as an industry, can be doing better