Article overview
Analysis and Design of Convolutional Networks via Hierarchical Tensor Decompositions

Authors: Nadav Cohen; Or Sharir; Ronen Tamari; David Yakira; Yoav Levine; Amnon Shashua
Date: 5 May 2017
Source: arXiv, 1705.02302

Abstract: The driving force behind convolutional networks, the most successful deep learning architecture to date, is their expressive power. Despite wide acceptance and vast empirical evidence, formal analyses supporting this belief are scarce. The primary notions for formally reasoning about expressiveness are efficiency and inductive bias. Efficiency refers to the ability of a network architecture to realize functions that would require an alternative architecture to be much larger. Inductive bias refers to the prioritization of some functions over others, given prior knowledge about the task at hand. In this paper we provide a high-level overview of a series of works by the authors that, through an equivalence to hierarchical tensor decompositions, analyze the expressive efficiency and inductive bias of various architectural features of convolutional networks (depth, width, convolution strides, and more). The results presented shed light on the demonstrated effectiveness of convolutional networks and, in addition, provide new tools for network design.
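The equivalence the abstract refers to can be illustrated with a minimal sketch. In the surveyed works, a shallow convolutional arithmetic circuit corresponds to a CP (rank-Z) decomposition of the coefficient tensor that defines the network's score function; the tensor names, sizes, and random data below are assumptions for illustration only, not the authors' code.

```python
import numpy as np

# Hypothetical sketch: a shallow network with Z hidden channels realizes the
# coefficient tensor A = sum_z a_z * (v_{z,1} x ... x v_{z,N})  (CP form).
rng = np.random.default_rng(0)
N, M, Z = 3, 4, 5          # N input patches, M representation channels, Z hidden channels

a = rng.standard_normal(Z)             # output weights a_z
v = rng.standard_normal((Z, N, M))     # per-patch weight vectors v_{z,i}

# Build the coefficient tensor explicitly from its CP decomposition.
A = np.zeros((M,) * N)
for z in range(Z):
    outer = v[z, 0]
    for i in range(1, N):
        outer = np.multiply.outer(outer, v[z, i])
    A += a[z] * outer

# Score on input representations f(x_i) (rows of F): the full contraction
# <A, f(x_1) x ... x f(x_N)>, contracting one mode per patch.
F = rng.standard_normal((N, M))
score_tensor = A
for i in range(N):
    score_tensor = np.tensordot(F[i], score_tensor, axes=([0], [0]))

# The same score computed layer-wise, as the network would:
# sum_z a_z * prod_i <v_{z,i}, f(x_i)>.
score_network = sum(a[z] * np.prod([v[z, i] @ F[i] for i in range(N)])
                    for z in range(Z))
assert np.isclose(score_tensor, score_network)
```

Deep networks correspond instead to hierarchical decompositions of A; the efficiency results surveyed in the paper show that tensors realizable by such hierarchical forms can require an exponentially larger Z under the flat CP form above.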