Article overview
eVAE: Evolutionary Variational Autoencoder
Authors: Zhangkai Wu; Longbing Cao; Lei Qi
Date: 2 Jan 2023
Abstract: The surrogate loss of variational autoencoders (VAEs) poses various
challenges to their training, inducing the imbalance between task fitting and
representation inference. To avert this, existing strategies for VAEs focus
on adjusting the tradeoff by introducing hyperparameters, deriving a tighter
bound under some mild assumptions, or decomposing the loss components per
certain neural settings. VAEs still suffer from uncertain tradeoff learning. We
propose a novel evolutionary variational autoencoder (eVAE) building on the
variational information bottleneck (VIB) theory and integrative evolutionary
neural learning. eVAE integrates a variational genetic algorithm into VAE with
variational evolutionary operators including variational mutation, crossover,
and evolution. Its inner-outer-joint training mechanism synergistically and
dynamically generates and updates the uncertain tradeoff learning in the
evidence lower bound (ELBO) without additional constraints. Apart from learning
a lossy compression and representation of data under the VIB assumption, eVAE
presents an evolutionary paradigm to tune critical factors of VAEs and deep
neural networks and addresses the premature convergence and random search
problem by integrating evolutionary optimization into deep learning.
Experiments show that eVAE addresses the KL-vanishing problem for text
generation with low reconstruction loss, generates all disentangled factors
with sharp images, and improves image generation quality, respectively. eVAE
achieves better reconstruction loss, disentanglement, and generation-inference
balance than its competitors.
Source: arXiv, 2301.00011
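The abstract describes the method only at a high level. As a rough illustration of the inner-outer training idea, the hypothetical Python/PyTorch sketch below fits a tiny VAE in an inner loop while a simple mutation-and-selection outer loop searches over the ELBO tradeoff weight beta. The class ToyVAE, the Gaussian mutation scale, and the population settings are illustrative assumptions, not the authors' eVAE (which also uses variational crossover and evolution operators on variational parameters); the sketch only shows the general structure of evolving an ELBO tradeoff.

# Hypothetical sketch, not the authors' released code: an evolutionary outer
# loop tuning the beta weight of a beta-ELBO, with gradient-based inner training.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyVAE(nn.Module):
    def __init__(self, x_dim=32, z_dim=4):
        super().__init__()
        self.enc = nn.Linear(x_dim, 2 * z_dim)   # outputs mean and log-variance
        self.dec = nn.Linear(z_dim, x_dim)

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        return self.dec(z), mu, logvar

def neg_elbo(model, x, beta):
    """beta-weighted negative ELBO: reconstruction + beta * KL(q(z|x) || N(0, I))."""
    recon, mu, logvar = model(x)
    rec = F.mse_loss(recon, x, reduction="mean")
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + beta * kl

def train_inner(beta, data, steps=200):
    """Inner loop: fit a fresh VAE for a fixed beta and return its final loss."""
    model = ToyVAE(x_dim=data.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(steps):
        loss = neg_elbo(model, data, beta)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return loss.item()

def evolve_beta(data, pop_size=6, generations=5):
    """Outer loop: mutation-and-selection search over the tradeoff weight beta."""
    population = [float(b) for b in torch.rand(pop_size) * 2.0]
    best_loss, best_beta = float("inf"), population[0]
    for _ in range(generations):
        scores = sorted((train_inner(b, data), b) for b in population)
        if scores[0][0] < best_loss:
            best_loss, best_beta = scores[0]
        parents = [b for _, b in scores[: pop_size // 2]]                          # selection
        children = [max(1e-3, b + 0.1 * torch.randn(1).item()) for b in parents]   # mutation
        population = parents + children                                            # elitism + offspring
    return best_beta

if __name__ == "__main__":
    x = torch.randn(256, 32)   # synthetic stand-in data
    print("selected beta:", evolve_beta(x))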