This repository contains an implementation of a binary MLP network.
In binary networks, both the weights and the activations are constrained to +1 or -1.
I have implemented the version presented in 'Binarized Neural Networks: Training Neural Networks with Weights and Activations Constrained to +1 or -1' by Matthieu Courbariaux, Itay Hubara, Daniel Soudry, Ran El-Yaniv, and Yoshua Bengio: https://arxiv.org/abs/1602.02830
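The core operations from the paper can be sketched as follows. This is a minimal NumPy sketch, not the repository's code: the function names are mine, and it shows only the deterministic binarization (sign, with 0 mapped to +1) and the straight-through gradient estimator the paper uses to backpropagate through the sign function.

```python
import numpy as np

def binarize(x):
    # Deterministic binarization from the paper: sign(x), with 0 mapped to +1.
    return np.where(x >= 0, 1.0, -1.0)

def ste_grad(x, grad_output):
    # Straight-through estimator: the sign function has zero gradient
    # almost everywhere, so the gradient is passed through unchanged
    # wherever the pre-binarization input lies in [-1, 1], and zeroed
    # outside that range.
    return grad_output * (np.abs(x) <= 1.0)
```

During training, real-valued weights are kept and updated with these estimated gradients; only the binarized copies are used in the forward and backward passes.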
Binary neural networks are computationally very efficient, at the cost of only a slight decline in accuracy. They can also be implemented directly in circuits using in-memory computing.
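The efficiency comes from the fact that a dot product of two ±1 vectors reduces to an XNOR followed by a popcount, which maps directly onto simple hardware. A small illustrative sketch, assuming the common encoding +1 → bit 1 and -1 → bit 0 (the helper name is mine):

```python
def binary_dot(a_bits, b_bits, n):
    # Dot product of two n-element vectors with entries in {+1, -1},
    # each packed into an integer (bit 1 encodes +1, bit 0 encodes -1).
    # Elementwise multiply of ±1 values is XNOR on the bit encoding;
    # the sum is then 2 * popcount(xnor) - n.
    xnor = ~(a_bits ^ b_bits) & ((1 << n) - 1)
    return 2 * bin(xnor).count("1") - n
```

For example, a = [+1, -1, +1] packs to 0b101 and b = [+1, +1, -1] to 0b110, and `binary_dot(0b101, 0b110, 3)` gives -1, matching 1·1 + (-1)·1 + 1·(-1).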
Presently I am working on a Verilog implementation which, if successful, I will upload here. Meanwhile, you can refer to the Matlab implementation.
As of now, my focus is only on the MLP (Multi-Layer Perceptron) network; I will move on to CNNs later.
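For the MLP case, the forward pass can be sketched like this (again a hypothetical NumPy illustration of the general scheme, not the repository's code): each hidden layer multiplies by binarized weights and applies a sign activation, while the output layer keeps real-valued outputs.

```python
import numpy as np

def binarize(x):
    # sign(x) with 0 mapped to +1
    return np.where(x >= 0, 1.0, -1.0)

def binary_mlp_forward(x, weights):
    # Each hidden layer: binarized weights, then sign activation.
    # (The paper also applies batch normalization before the sign;
    # omitted here for brevity.)
    h = x
    for W in weights[:-1]:
        h = binarize(h @ binarize(W))
    # Output layer: binarized weights but real-valued outputs,
    # so a softmax/hinge loss can be applied.
    return h @ binarize(weights[-1])
```

With inputs and weights already in {+1, -1}, every matrix multiplication here is exactly the XNOR/popcount kind of operation that makes these networks hardware-friendly.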
A link to the Python implementation can be found in the original paper.
Parameters as used in the Matlab implementation: https://drive.google.com/open?id=1dPtbrOCxBycn7mOVFQg1H_bN7qkZAcOj
All parameters: https://drive.google.com/file/d/1j2mqnFlUTV8vx-y06wADaGCwDgcy3fPz/view