Tuning Deep Neural Network’s Hyperparameters Constrained to Deployability on Tiny Systems

Perego R.; Candelieri A.; Archetti F.; Pau D.

2020

Abstract

Deep Neural Networks are increasingly deployed on tiny systems such as microcontrollers or embedded systems. Notwithstanding the recent success of Deep Learning, also enabled by the availability of Automated Machine Learning and Neural Architecture Search solutions, the computational requirements of the optimization of the structure and the hyperparameters of Deep Neural Networks usually far exceed what is available on tiny systems. Therefore, the deployability becomes critical when the learned model must be deployed on a tiny system. To overcome this critical issue, we propose a framework, based on Bayesian Optimization, to optimize the hyperparameters of a Deep Neural Network by dealing with black-box deployability constraints. Encouraging results obtained on a classification benchmark problem on a real microcontroller by STMicroelectronics are presented.
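The constrained search the abstract describes can be illustrated with a minimal schematic loop: a feasibility check on a black-box deployability constraint (e.g., a flash-memory budget) gates which hyperparameter configurations are scored. This is a toy random-search stand-in for the paper's Bayesian Optimization, and all functions, footprint formulas, and budgets below are invented for illustration:

```python
import random

def train_score(hp):
    # Hypothetical black-box objective: stands in for the validation
    # accuracy of a DNN trained with hyperparameters hp (toy analytic form).
    units, lr = hp
    return 1.0 - (units / 256 - 0.5) ** 2 - (lr - 0.01) ** 2 * 100

def deployable(hp):
    # Hypothetical black-box deployability constraint: the compiled model
    # must fit an assumed 512 kB flash budget on the microcontroller.
    units, _lr = hp
    flash_kb = 4 * units * 1.2          # invented footprint estimate
    return flash_kb <= 512

def constrained_search(n_iter=200, seed=0):
    # Feasibility-aware search loop: infeasible configurations are
    # rejected before the (expensive) objective is evaluated.
    rng = random.Random(seed)
    best_hp, best_score = None, float("-inf")
    for _ in range(n_iter):
        hp = (rng.randrange(8, 512), 10 ** rng.uniform(-4, -1))
        if not deployable(hp):
            continue
        score = train_score(hp)
        if score > best_score:
            best_hp, best_score = hp, score
    return best_hp, best_score
```

In the paper's actual framework, both the objective and the constraint are modeled probabilistically within Bayesian Optimization rather than sampled at random as sketched here.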
Type: Conference paper
Keywords: Bayesian optimization; Deep Neural Network; Hyperparameters optimization; Neural Architecture Search
Language: English
Conference: 29th International Conference on Artificial Neural Networks, ICANN 2020 (2020)
Editors: Farkaš I., Masulli P., Wermter S.
Published in: Artificial Neural Networks and Machine Learning – ICANN 2020
ISBN: 978-3-030-61615-1
Publication date: 14 Oct 2020
Year: 2020
Volume: 12397
Pages: 92-103
URL: https://link.springer.com/chapter/10.1007/978-3-030-61616-8_8
Access rights: Reserved
Perego, R., Candelieri, A., Archetti, F., Pau, D. (2020). Tuning Deep Neural Network’s Hyperparameters Constrained to Deployability on Tiny Systems. In Artificial Neural Networks and Machine Learning – ICANN 2020 (pp.92-103). Springer Science and Business Media Deutschland GmbH [10.1007/978-3-030-61616-8_8].
Files in this record:
Perego_ICANN2020.pdf — Conference Paper; Publisher's Version (Version of Record, VoR); Adobe PDF; 840.51 kB; restricted access (archive managers only; a copy can be requested)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/298197
Citations
  • Scopus: 8
  • Web of Science (ISI): 6