We conclude season one of Underrated ML by having Stephen Merity on as our guest. Stephen has worked at a range of organisations, including MetaMind and the Salesforce ohana, Google Sydney, Freelancer.com, the Schwa Lab at the University of Sydney, the team at Grok Learning, the non-profit Common Crawl, and IACS @ Harvard. He also holds a Bachelor of Information Technology from the University of Sydney and a Master of Science in Computational Science and Engineering from Harvard University.
In this week's episode we talk about the current influence of hardware on Deep Learning research, baseline models, strongly typed RNNs, and Alan Turing's paper on the chemical basis of morphogenesis.
Underrated ML Twitter: https://twitter.com/underrated_ml
Stephen Merity Twitter: https://twitter.com/Smerity
Please let us know who you thought presented the most underrated paper in the form below: https://forms.gle/97MgHvTkXgdB41TC8
Links to the papers:
"The Chemical Basis of Morphogenesis" - https://www.dna.caltech.edu/courses/cs191/paperscs191/turing.pdf
"Strongly-Typed Recurrent Neural Networks" - https://arxiv.org/abs/1602.02218
"Quasi-Recurrent Neural Networks" - https://arxiv.org/abs/1611.01576
"An Analysis of Neural Language Modelling at Multiple Scales" - https://arxiv.org/abs/1803.08240