Tambaro, M., Radaelli, A., Stevenazzi, L., La Gala, A., Malanchini, M., De Matteis, M. (2025). Influence of Reduced Numerical Precision in Spiking Neural Network Hardware Implementation. In 2025 International Conference on IC Design and Technology (ICICDT) (pp.85-88). IEEE [10.1109/ICICDT65192.2025.11078101].

Influence of Reduced Numerical Precision in Spiking Neural Network Hardware Implementation

Tambaro M.; Radaelli A.; Stevenazzi L.; La Gala A.; Malanchini M.; De Matteis M.
2025

Abstract

This work presents a performance comparison of Spiking Neural Networks (SNNs) under reduced numerical precision. When implementing digital SNNs in hardware for edge inference, such as on FPGAs or ASICs, power and resource usage are crucial design constraints. Networks are typically trained with single-precision floating-point arithmetic to facilitate parameter convergence, so the trained parameters are also stored at high resolution. However, this level of precision may be redundant for classification accuracy. A reduced word width offers significant savings in resource usage, and this study quantifies the resulting trade-off between resolution and accuracy. The accuracy achieved on three datasets of increasing complexity by three discrete-equation neuron models of increasing dynamical richness is compared when post-training quantization (PTQ) is applied at various resolutions, using both fixed-point and minifloat reduced-precision representations. Results indicate that fixed-point representation provides the best outcomes, with negligible accuracy loss when network parameters are quantized to word widths between 4 and 8 bits, depending on the task's complexity and the neuron dynamics.
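
The abstract describes the PTQ scheme only at a high level, and the paper's exact quantizer is not reproduced here. As a minimal illustration of the two number formats being compared, the Python sketch below rounds a trained weight tensor onto a signed fixed-point grid and onto a simple minifloat grid. The function names, the Q1.6 fixed-point split, and the 1-4-3 minifloat layout are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def quantize_fixed_point(w, word_width, frac_bits):
    """Round weights to a signed fixed-point grid: 1 sign bit,
    (word_width - 1 - frac_bits) integer bits, frac_bits fractional bits."""
    scale = 2.0 ** frac_bits
    qmin = -(2 ** (word_width - 1))
    qmax = 2 ** (word_width - 1) - 1
    q = np.clip(np.round(w * scale), qmin, qmax)
    return q / scale

def quantize_minifloat(w, exp_bits, man_bits, exp_bias=None):
    """Round weights to a minifloat grid with the given exponent and
    mantissa widths (sign bit implied). IEEE-like handling is assumed;
    specials are ignored and out-of-range values are clipped."""
    if exp_bias is None:
        exp_bias = 2 ** (exp_bits - 1) - 1
    sign = np.sign(w)
    mag = np.abs(w)
    # Exponent of each value, clipped to the representable range.
    e = np.floor(np.log2(np.maximum(mag, np.finfo(float).tiny)))
    e = np.clip(e, 1 - exp_bias, 2 ** exp_bits - 2 - exp_bias)
    # Quantize the mantissa to man_bits fractional bits.
    step = 2.0 ** (e - man_bits)
    q = np.round(mag / step) * step
    max_val = (2 - 2.0 ** -man_bits) * 2.0 ** (2 ** exp_bits - 2 - exp_bias)
    return sign * np.minimum(q, max_val)

# Example: quantize trained float32 weights to an 8-bit word in both formats.
w = np.random.randn(128).astype(np.float32) * 0.5
w_fxp = quantize_fixed_point(w, word_width=8, frac_bits=6)   # sign + Q1.6
w_mf = quantize_minifloat(w, exp_bits=4, man_bits=3)         # 1-4-3 minifloat
print("fixed-point RMSE:", np.sqrt(np.mean((w - w_fxp) ** 2)))
print("minifloat   RMSE:", np.sqrt(np.mean((w - w_mf) ** 2)))
```

Sweeping word_width (or exp_bits and man_bits) and re-evaluating classification accuracy after each PTQ pass is the kind of resolution-versus-accuracy exploration the study quantifies.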
Type: paper
Keywords: edge computing; neuromorphic circuits; post-training quantization; spiking neural networks
Language: English
Conference: 2025 International Conference on IC Design and Technology (ICICDT 2025), 23-25 June 2025
Year: 2025
Proceedings: 2025 International Conference on IC Design and Technology (ICICDT)
ISBN: 9798331524616
Pages: 85-88
Rights: reserved
Files in this record:
File: Tambaro-2025-ICICDT-VoR.pdf (restricted access: repository administrators only)
Attachment type: Publisher's Version (Version of Record, VoR)
License: All rights reserved
Size: 807.91 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/584941
Citations
  • Scopus 0
  • Web of Science 0