Publication type: report, conference abstract, article in conference proceedings
Conference: Hybrid Methods of Modeling and Optimization in Complex Systems (HMMOCS-III 2024); Krasnoyarsk
Year of publication: 2025
DOI: 10.1051/itmconf/20257205004
Abstract: Neural networks require careful selection of activation functions to optimize performance. Traditional methods of choosing activation functions through trial and error are time-consuming and resource-intensive. This paper presents a novel approach to automatically designing activation functions for artificial neural networks using genetic programming combined with gradient descent. The proposed method aims to enhance the efficiency of the search for optimal activation functions. The algorithm employs genetic programming to evolve the general form of activation functions, while gradient descent optimizes their parameters during network training. This hybrid approach allows for the exploration of a wide range of potential activation functions tailored to specific tasks and network architectures. The method was evaluated on three datasets from the KEEL repository: Iris, Titanic, and Phoneme. The results demonstrate the algorithm's ability to generate and optimize custom activation functions, although improvements in network accuracy were not observed in this initial study. This work contributes to ongoing research in neural network optimization and opens avenues for further investigation into the automatic design of activation functions.
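The hybrid scheme described in the abstract can be illustrated with a minimal sketch, assuming a PyTorch setup (this is not the authors' implementation): genetic programming would supply the symbolic form of the activation, for example a*tanh(b*x) + c*x, while the coefficients a, b, c are optimized by gradient descent together with the network weights. The specific expression, parameter names, and the Iris-sized layer dimensions are illustrative assumptions.

```python
# Illustrative sketch only: a parameterized activation whose functional form
# (here a*tanh(b*x) + c*x) is assumed to have been produced by genetic
# programming; its parameters are trained by gradient descent with the rest
# of the network, as the abstract describes.
import torch
import torch.nn as nn

class EvolvedActivation(nn.Module):
    """Hypothetical evolved activation a*tanh(b*x) + c*x with trainable a, b, c."""
    def __init__(self):
        super().__init__()
        self.a = nn.Parameter(torch.tensor(1.0))
        self.b = nn.Parameter(torch.tensor(1.0))
        self.c = nn.Parameter(torch.tensor(0.1))

    def forward(self, x):
        return self.a * torch.tanh(self.b * x) + self.c * x

# The activation drops into an ordinary network (sized here for Iris: 4 inputs,
# 3 classes); the same optimizer updates both weights and activation parameters.
model = nn.Sequential(nn.Linear(4, 16), EvolvedActivation(), nn.Linear(16, 3))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
output = model(torch.randn(8, 4))  # forward pass on a dummy batch
```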
Journal: ITM Web of Conferences
Page numbers: 5004
Place of publication: Krasnoyarsk