In this talk I will discuss models of Binarized Neural Networks (BNNs) in two different settings. First, BNNs in an online learning memory task. For this part, we devised biologically plausible algorithms that enable neural networks to scale linearly in memory capacity. This offers a major advantage over previously devised algorithms, whose capacity scales sub-linearly.
In the second part of the talk, BNNs are applied to a real-world task: classifying images from the ImageNet dataset. The main goal here is to gain a significant advantage over standard deep neural networks in both computational complexity and memory requirements. The resulting network reduced the classification error by 40% relative to previously published results on BNNs.
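To make the core idea concrete, here is a minimal sketch of a binarized layer, assuming the common sign-based binarization scheme (the talk's exact algorithms may differ):

```python
import numpy as np

def binarize(x):
    # Map real values to {-1, +1}; zero is mapped to +1 by convention.
    return np.where(x >= 0, 1.0, -1.0)

def bnn_layer(inputs, real_weights):
    # Forward pass of one binarized layer: both weights and activations
    # are constrained to {-1, +1}, so the dot product can be implemented
    # in hardware as XNOR-and-popcount instead of floating-point multiply.
    w_bin = binarize(real_weights)
    return binarize(inputs @ w_bin)

x = binarize(np.array([0.3, -1.2, 0.7]))  # binarized input activations
W = np.random.randn(3, 2)                 # real-valued "shadow" weights kept for training
print(bnn_layer(x, W))                    # every output element is -1 or +1
```

Keeping real-valued shadow weights during training while binarizing them on the forward pass is what yields the memory and compute savings at inference time.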