Analyzing the interplay between transferable GANs and gradient optimizers

Proceedings of the 2023 Genetic and Evolutionary Computation Conference Companion (GECCO 2023 Companion), 2023

Abstract
Recently, different neuroevolutionary approaches have been proposed to enhance the architecture and hyperparameters of generative neural models. Many of these approaches have to train the candidate architectures before evaluating them. The gradient descent algorithm used to do so is often overlooked in the evolutionary procedure, despite being a critical aspect of the training. In this paper, we investigate the role played by the gradient-based optimizer chosen to train a generative adversarial network whose architecture has been obtained using neuroevolution. We focus on the 2D Gaussian mixture approximation problem and evaluate the effect of a set of representative gradient-based techniques on the performance of the resulting GANs. Our results show that the particular choice of the gradient optimizer can be as relevant as the appropriate selection of the architecture.
Keywords
GANs,generative models,hybrid methods,gradient-based optimization,neuroevolution
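The experimental setup described in the abstract can be illustrated with a minimal sketch: training a small GAN to approximate a 2D Gaussian mixture target while swapping only the gradient-based optimizer. The network sizes, mixture layout, learning rate, and optimizer list below are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (assumed setup, not the paper's code): train a small GAN on a
# 2D Gaussian mixture and vary only the gradient-based optimizer.
import math
import torch
import torch.nn as nn


def sample_mixture(n):
    # Assumed target: 8 Gaussian modes on a circle of radius 2 with std 0.05.
    angles = torch.randint(0, 8, (n,)) * (2 * math.pi / 8)
    centers = torch.stack([torch.cos(angles), torch.sin(angles)], dim=1) * 2.0
    return centers + 0.05 * torch.randn(n, 2)


def make_mlp(in_dim, out_dim, hidden=64):
    # Small fully connected network; in the paper the generator architecture
    # would instead come from the neuroevolutionary search.
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )


def train_gan(opt_name, steps=2000, batch=256, z_dim=8, lr=1e-3):
    G, D = make_mlp(z_dim, 2), make_mlp(2, 1)
    optimizers = {"adam": torch.optim.Adam,
                  "rmsprop": torch.optim.RMSprop,
                  "sgd": torch.optim.SGD}
    opt_g = optimizers[opt_name](G.parameters(), lr=lr)
    opt_d = optimizers[opt_name](D.parameters(), lr=lr)
    bce = nn.BCEWithLogitsLoss()
    for _ in range(steps):
        # Discriminator step: real mixture samples vs. detached generator output.
        real = sample_mixture(batch)
        fake = G(torch.randn(batch, z_dim)).detach()
        loss_d = (bce(D(real), torch.ones(batch, 1))
                  + bce(D(fake), torch.zeros(batch, 1)))
        opt_d.zero_grad()
        loss_d.backward()
        opt_d.step()
        # Generator step: push generated samples toward the "real" label.
        fake = G(torch.randn(batch, z_dim))
        loss_g = bce(D(fake), torch.ones(batch, 1))
        opt_g.zero_grad()
        loss_g.backward()
        opt_g.step()
    return G


# Example: same architecture and training budget, different optimizers.
for name in ("adam", "rmsprop", "sgd"):
    generator = train_gan(name)
```

Comparing the mode coverage of samples drawn from each trained generator gives a rough sense of how much the optimizer choice alone, with the architecture held fixed, can shift the quality of the resulting GAN.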