Deep Learning Hyperparameter Optimization Using Evolutionary Computation Techniques

  • Journal of Internet of Things and Convergence
  • Abbr : JKIOTS
  • 2025, 11(5), 9
  • Publisher : The Korea Internet of Things Society
  • Research Area : Engineering > Computer Science > Internet Information Processing
  • Received : September 9, 2025
  • Accepted : October 10, 2025
  • Published : October 31, 2025

Sangwook Lee 1

1 Mokwon University

ABSTRACT

Deep learning has demonstrated outstanding performance in various AI applications, including computer vision, natural language processing, and speech recognition. However, model performance depends heavily on hyperparameter settings, and inappropriate hyperparameters can slow convergence or lead to overfitting. In this paper, we propose using three evolutionary computation techniques, namely genetic algorithms, particle swarm optimization, and differential evolution, to optimize hyperparameter settings. To demonstrate the performance of these three techniques, we apply them to optimize six hyperparameters of a SimpleCNN for the CIFAR-10 image classification problem. Experimental results show that the evolutionary computation techniques outperform a basic grid search under the same computational budget, with particle swarm optimization achieving the best performance. We also confirm that, among the deep learning hyperparameters, lowering the learning rate is effective for stable training.
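As a rough illustration of the kind of hyperparameter search described above, the sketch below runs particle swarm optimization over a vector of six real-encoded hyperparameters. The hyperparameter names, bounds, swarm settings, and the placeholder evaluate() function are illustrative assumptions rather than the paper's actual setup, where the fitness would be the validation accuracy of a SimpleCNN trained on CIFAR-10.

```python
import random

# Hypothetical search space: six hyperparameters, each encoded as a real value
# in [low, high]. Names and ranges are illustrative assumptions, not the exact
# settings used in the paper.
BOUNDS = {
    "log_learning_rate": (-5.0, -1.0),   # 10^-5 .. 10^-1
    "log_batch_size":    (4.0, 8.0),     # 2^4 .. 2^8
    "dropout_rate":      (0.0, 0.6),
    "num_filters":       (16.0, 128.0),
    "weight_decay":      (0.0, 1e-3),
    "momentum":          (0.5, 0.99),
}
DIM = len(BOUNDS)
LOWS = [lo for lo, _ in BOUNDS.values()]
HIGHS = [hi for _, hi in BOUNDS.values()]


def evaluate(position):
    """Placeholder fitness: in the real setting this would decode the vector,
    train SimpleCNN on CIFAR-10, and return the validation accuracy."""
    # Dummy objective so the sketch runs stand-alone (peak at the midpoint of each range).
    return -sum((x - (lo + hi) / 2.0) ** 2 for x, lo, hi in zip(position, LOWS, HIGHS))


def pso(num_particles=10, num_iters=20, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    # Initialize particle positions uniformly within the bounds, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in zip(LOWS, HIGHS)] for _ in range(num_particles)]
    vel = [[0.0] * DIM for _ in range(num_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [evaluate(p) for p in pos]
    g = max(range(num_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]

    for _ in range(num_iters):
        for i in range(num_particles):
            for d in range(DIM):
                r1, r2 = rng.random(), rng.random()
                # Standard PSO velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move the particle and clamp it to the search-space bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], LOWS[d]), HIGHS[d])
            fit = evaluate(pos[i])
            if fit > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], fit
                if fit > gbest_fit:
                    gbest, gbest_fit = pos[i][:], fit
    return dict(zip(BOUNDS, gbest)), gbest_fit


if __name__ == "__main__":
    best_hparams, best_fitness = pso()
    print(best_hparams, best_fitness)
```

The same loop structure applies when the fitness call is replaced by an actual training run; the main design choice is how to encode discrete hyperparameters (here, batch size and filter count are kept continuous and would be rounded when decoded).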
