Extracting knowledge from datasets with a large number of variables is always tricky. Dimensionality reduction helps in analyzing high-dimensional data while still retaining most of the information hidden behind the complexity. Here are some methods you should try before any further analysis (Part 1).
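The description above does not list the methods themselves, but principal component analysis (PCA) is the usual starting point for dimensionality reduction. A minimal sketch using only NumPy, on synthetic data (the shapes and the choice of 2 components are illustrative assumptions):

```python
import numpy as np

# Synthetic dataset: 100 samples, 5 variables (purely illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# PCA assumes zero-mean data, so center each variable
X_centered = X - X.mean(axis=0)

# SVD of the centered data: rows of Vt are the principal directions,
# ordered by decreasing explained variance
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Project onto the top 2 principal components
X_reduced = X_centered @ Vt[:2].T

print(X_reduced.shape)  # (100, 2)
```

The singular values `S` indicate how much variance each component captures, which is one way to decide how many components to keep before further analysis.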
Episode 41: How can deep neural networks reason
Episode 40: Deep learning and image compression
Episode 39: What is L1-norm and L2-norm?
Episode 38: Collective intelligence (Part 2)
Episode 38: Collective intelligence (Part 1)
Episode 37: Predicting the weather with deep learning
Episode 36: The dangers of machine learning and medicine
Episode 35: Attacking deep learning models
Episode 34: Get ready for AI winter
Episode 33: Decentralized Machine Learning and the proof-of-train
Episode 32: I am back. I have been building fitchain
Founder Interview – Francesco Gadaleta of Fitchain
Episode 31: The End of Privacy
Episode 30: Neural networks and genetic evolution: an unfeasible approach
Episode 29: Fail your AI company in 9 steps
Episode 28: Towards Artificial General Intelligence: preliminary talk
Episode 27: Techstars accelerator and the culture of fireflies
Episode 26: Deep Learning and Alzheimer
Episode 25: How to become data scientist [RB]
Episode 24: How to handle imbalanced datasets