In standard neural networks the amount of computation used grows with the size of the inputs, but not with the complexity of the problem being learnt. To overcome this limitation we introduce PonderNet, a new algorithm that learns to adapt the amount of computation based on the complexity of the problem at hand. On a complex synthetic problem, PonderNet dramatically improves performance over previous adaptive computation methods and additionally succeeds at extrapolation tests where traditional neural networks fail. Finally, PonderNet reaches state-of-the-art results on a complex task designed to test the reasoning capabilities of neural networks.
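The core mechanism behind this adaptive computation is a per-step halting probability: at each pondering step the network emits a probability of stopping, and these induce a distribution over how many steps are spent on a given input. Below is a minimal sketch of that halting distribution and the expected-loss term it weights; it is an illustration of the general idea, not the paper's implementation, and the function names are ours.

```python
def halting_distribution(lambdas):
    """Turn per-step halting probabilities lambda_n into the probability
    p_n of halting exactly at step n:
        p_n = lambda_n * prod_{j < n} (1 - lambda_j)."""
    probs = []
    not_halted = 1.0  # probability mass that has not yet halted
    for lam in lambdas:
        probs.append(not_halted * lam)
        not_halted *= (1.0 - lam)
    return probs


def expected_loss(step_losses, lambdas):
    """Expected per-step loss under the halting distribution -- the kind of
    reconstruction term an adaptive-computation objective averages over."""
    return sum(p * l for p, l in zip(halting_distribution(lambdas), step_losses))


# Toy example: three pondering steps, each with halting probability 0.5.
p = halting_distribution([0.5, 0.5, 0.5])
# p = [0.5, 0.25, 0.125]; the leftover mass (0.125) corresponds to never
# halting within the rollout, which a real implementation must truncate
# or renormalise away.
```

Because harder inputs can be given smaller early halting probabilities, the expected number of steps, and hence the computation spent, grows with problem difficulty rather than input size.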
2021: Andrea Banino, Jan Balaguer, C. Blundell
https://arxiv.org/pdf/2107.05407v1.pdf