Optimizing deep learning hyper-parameters through an evolutionary algorithm

by Steven R Young, Derek C Rose, Thomas P Karnowski, Seung-Hwan Lim, Robert M Patton

Abstract 

There has been a recent surge of success in utilizing Deep Learning (DL) in imaging and speech applications for its relatively automatic feature generation and, particularly in the case of convolutional neural networks (CNNs), its high-accuracy classification abilities. While these models learn their parameters through data-driven methods, model selection (including architecture construction) through hyper-parameter choices remains a tedious and highly intuition-driven task. To address this, Multi-node Evolutionary Neural Networks for Deep Learning (MENNDL) is proposed as a method for automating network selection on computational clusters through hyper-parameter optimization performed via genetic algorithms.
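The paper's MENNDL implementation is not reproduced here, but the core idea of genetic-algorithm hyper-parameter search can be sketched briefly. The search space, operators, and fitness function below are illustrative assumptions, not the paper's actual design: in particular, the `fitness` function is a synthetic stand-in for the validation accuracy a real system would obtain by training a CNN with each candidate's hyper-parameters.

```python
import random

# Hypothetical discrete hyper-parameter search space (NOT MENNDL's actual
# space): each entry is (name, list of allowed values).
SEARCH_SPACE = [
    ("num_filters", [16, 32, 64, 128]),
    ("kernel_size", [3, 5, 7]),
    ("learning_rate", [1e-1, 1e-2, 1e-3, 1e-4]),
]

def random_individual(rng):
    # An individual is one value per hyper-parameter, in SEARCH_SPACE order.
    return [rng.choice(values) for _, values in SEARCH_SPACE]

def fitness(ind):
    # Synthetic stand-in for validation accuracy: peaks at (64, 5, 1e-3).
    # A real system would train and evaluate a CNN here instead.
    num_filters, kernel_size, lr = ind
    return (-abs(num_filters - 64) / 64
            - abs(kernel_size - 5)
            - abs(lr - 1e-3) * 100)

def crossover(a, b, rng):
    # Single-point crossover between two parent gene lists.
    point = rng.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(ind, rng, rate=0.2):
    # Resample each gene from its allowed values with probability `rate`.
    return [rng.choice(values) if rng.random() < rate else gene
            for gene, (_, values) in zip(ind, SEARCH_SPACE)]

def evolve(generations=20, pop_size=12, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(rng.choice(parents),
                                     rng.choice(parents), rng), rng)
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In a cluster setting like the one the abstract describes, the expensive `fitness` evaluations (full CNN trainings) are what get distributed across nodes, while the cheap selection, crossover, and mutation steps run centrally.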

Publication Citation

MLHPC '15 Proceedings of the Workshop on Machine Learning in High-Performance Computing Environments 2015
DOI: 10.1145/2834892.2834896
