Article overview
Multi-Complexity-Loss DNAS for Energy-Efficient and Memory-Constrained Deep Neural Networks
Authors: Matteo Risso; Alessio Burrello; Luca Benini; Enrico Macii; Massimo Poncino; Daniele Jahier Pagliari
Date: 1 Jun 2022
Abstract: Neural Architecture Search (NAS) is increasingly popular to automatically
explore the accuracy versus computational complexity trade-off of Deep Learning
(DL) architectures. When targeting tiny edge devices, the main challenge for DL
deployment is matching the tight memory constraints, hence most NAS algorithms
consider model size as the complexity metric. Other methods reduce the energy
or latency of DL models by trading off accuracy and number of inference
operations. Energy and memory are rarely considered simultaneously, in
particular by low-search-cost Differentiable NAS (DNAS) solutions. We overcome
this limitation by proposing the first DNAS that directly addresses the most
realistic scenario from a designer’s perspective: the co-optimization of
accuracy and energy (or latency) under a memory constraint, determined by the
target HW. We do so by combining two complexity-dependent loss functions during
training, with independent strength. Testing on three edge-relevant tasks from
the MLPerf Tiny benchmark suite, we obtain rich Pareto sets of architectures in
the energy vs. accuracy space, with memory footprint constraints spanning from
75% to 6.25% of the baseline networks. When deployed on a commercial edge
device, the STM NUCLEO-H743ZI2, our networks span a range of 2.18x in energy
consumption and 4.04% in accuracy for the same memory constraint, and reduce
energy by up to 2.2x with negligible accuracy drop with respect to the
baseline.
Source: arXiv, 2206.00302
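As a rough illustration of the combined loss described in the abstract, the following Python (PyTorch-style) sketch assumes a linear energy/latency term and a hinge-style memory penalty, each with an independent strength. The function names, cost proxies, and constants are illustrative assumptions and do not reproduce the paper's exact formulation.

import torch


def multi_complexity_loss(task_loss: torch.Tensor,
                          energy_cost: torch.Tensor,
                          model_size: torch.Tensor,
                          memory_budget: float,
                          lambda_energy: float,
                          lambda_mem: float) -> torch.Tensor:
    """Combine the task loss with two complexity terms of independent strength."""
    # Energy (or latency) term: a differentiable cost estimate, e.g. expected MACs.
    energy_term = lambda_energy * energy_cost
    # Memory term: penalize only the part of the (differentiable) model-size
    # estimate that exceeds the target hardware budget (hinge-style constraint).
    memory_term = lambda_mem * torch.relu(model_size - memory_budget)
    return task_loss + energy_term + memory_term


# Toy usage with made-up numbers; in a DNAS, energy_cost and model_size would
# be differentiable functions of the architecture parameters.
task_loss = torch.tensor(0.70)
energy_cost = torch.tensor(1.2e6)   # e.g. expected multiply-accumulate operations
model_size = torch.tensor(2.5e5)    # e.g. expected number of parameters
loss = multi_complexity_loss(task_loss, energy_cost, model_size,
                             memory_budget=2.0e5,
                             lambda_energy=1e-7, lambda_mem=1e-5)
print(loss)  # task loss plus the two weighted complexity penalties

Keeping the two strengths separate is what lets the search trade energy against accuracy while the memory term only activates when the budget is exceeded.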