Article overview
Ternary Neural Networks for Resource-Efficient AI Applications
Authors: Hande Alemdar; Nicholas Caldwell; Vincent Leroy; Adrien Prost-Boucle; Frédéric Pétrot
Date: 1 Sep 2016
Abstract: The computation and storage requirements for Deep Neural Networks (DNNs) are usually high, which limits their deployability on ubiquitous computing devices such as smartphones or wearables. In this paper, we propose ternary neural networks (TNNs) to make deep learning more resource-efficient. We train these TNNs using a teacher-student approach. Using only ternary weights and ternary neurons, with a two-threshold step activation function, the student ternary network learns to mimic the behaviour of its teacher network. We propose a novel, layer-wise greedy methodology for training TNNs. During training, a ternary neural network inherently prunes the smaller weights by setting them to zero, making it even more compact and thus more resource-friendly. We devise a purpose-built hardware design for TNNs and implement it on an FPGA. Benchmark results with our purpose-built hardware running TNNs show that, at only 1.24 microjoules per image, we achieve 97.76% accuracy on MNIST with a latency of 5.37 microseconds and a throughput of 255K images per second.
Source: arXiv, 1609.00222
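The abstract does not spell out how the two-threshold step activation or the weight ternarization work, so here is a minimal NumPy sketch of one plausible reading. The threshold values T_LO, T_HI and delta, and the names ternary_step and ternarize_weights, are hypothetical placeholders for illustration; the paper's actual method is a teacher-student, layer-wise greedy training procedure that this sketch does not reproduce.

```python
import numpy as np

# Hypothetical thresholds; the paper determines its own per-layer
# thresholds, which are not given in the abstract.
T_LO, T_HI = -0.5, 0.5

def ternary_step(x, t_lo=T_LO, t_hi=T_HI):
    """Two-threshold step activation mapping values to {-1, 0, +1}.

    Values below t_lo become -1, values above t_hi become +1, and
    everything in between becomes 0.
    """
    return np.where(x < t_lo, -1, np.where(x > t_hi, 1, 0)).astype(np.int8)

def ternarize_weights(w, delta=0.05):
    """Threshold-based weight ternarization (illustrative only).

    Weights with magnitude below `delta` are set to 0, mirroring the
    implicit pruning the abstract describes; the rest keep their sign.
    """
    return np.where(np.abs(w) < delta, 0, np.sign(w)).astype(np.int8)

# Example: one ternary layer applied to a random input.
rng = np.random.default_rng(0)
w = ternarize_weights(rng.normal(size=(784, 128)))
x = rng.normal(size=(1, 784))
a = ternary_step(x @ w)       # ternary weights -> ternary outputs
print(a.shape, np.unique(a))  # (1, 128) [-1  0  1]
```

Because both weights and activations are restricted to {-1, 0, +1}, every multiply-accumulate reduces to an addition, subtraction, or skip, which is what makes a purpose-built FPGA design like the one in the paper so energy-efficient.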