# Trainable Neural Network

![Overview diagram](docs/overview.jpg)

Implements a simple neural network that supports on-chip training in addition to inference.
The two hidden layers use leaky ReLU as their activation function, while the output layer uses a
rough approximation of softmax.
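As a rough behavioral sketch of these activations: the README does not specify the leaky-ReLU slope or the exact softmax approximation, so the slope of 1/8 (a cheap shift in hardware) and the base-2, max-subtracted normalization below are illustrative assumptions, not the design's actual formulas.

```python
def leaky_relu(x, slope=0.125):
    # Pass positive values through unchanged; scale negatives by a small
    # slope (1/8 here, which hardware can realize as a 3-bit right shift).
    return x if x >= 0 else x * slope

def rough_softmax(values):
    # Crude softmax stand-in: subtract the maximum for range safety, then
    # exponentiate with base 2 (a shift in hardware) and normalize.
    m = max(values)
    powers = [2.0 ** (v - m) for v in values]
    total = sum(powers)
    return [p / total for p in powers]
```

Like true softmax, the approximation keeps the outputs positive, summing to one, and ordered the same way as the inputs, which is what the training rule needs.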

Unlike typical neural network implementations, this design uses fixed-point saturating arithmetic,
and each neuron and synapse is represented as an individual instance.
Inputs, outputs, and weights, as well as forward and backward propagation, can be managed
through the Wishbone bus.
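The fixed-point saturating arithmetic can be sketched as follows; the Q8.8 format (16-bit signed, 8 fraction bits) is an illustrative assumption, since the README does not state the design's actual widths. The key difference from ordinary integer arithmetic is that overflow clamps to the representable range instead of wrapping around.

```python
FRAC_BITS = 8                                  # assumed Q8.8 format
INT_MIN, INT_MAX = -(1 << 15), (1 << 15) - 1   # signed 16-bit range

def saturate(x):
    # Clamp to the representable range instead of wrapping on overflow.
    return max(INT_MIN, min(INT_MAX, x))

def fx_add(a, b):
    # Saturating fixed-point addition.
    return saturate(a + b)

def fx_mul(a, b):
    # Multiply two Q8.8 values: the raw product has 16 fraction bits,
    # so shift back by FRAC_BITS, then saturate.
    return saturate((a * b) >> FRAC_BITS)
```

For example, `fx_mul(2 << 8, 3 << 8)` yields `6 << 8`, while `fx_add(INT_MAX, 1)` stays pinned at `INT_MAX` rather than wrapping negative, which keeps training updates from flipping sign on overflow.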