
How to Realize Efficient Spiking Neural Networks?


Abstract

Spiking neural networks (SNNs) have been proposed as an (energy-)efficient alternative to conventional artificial neural networks. However, the anticipated benefits have not yet been realized in practice. To better understand why this gap persists, we theoretically study both discrete-time and continuous-time models of leaky integrate-and-fire neurons. In the discrete-time model, a widely used framework because it is amenable to conventional deep learning software and hardware, we analyze how explicit recurrent connections affect the network size required to approximate continuously differentiable functions. We contrast this view by investigating the computational efficiency of digital systems that simulate spike-based computations in the continuous-time model. It turns out that even in well-behaved settings, the computational complexity of this task may grow super-polynomially in the prescribed accuracy. In this way, we highlight, by example, the intricacies of realizing on digital systems two potential strengths of spike-based computation in the biological context, namely recurrent connections and computational efficiency.
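To make the discrete-time setting concrete, the following is a minimal sketch of a discrete-time leaky integrate-and-fire (LIF) neuron update, the kind of model the abstract refers to. The parameter names and values (leak factor `beta`, `threshold`, the subtractive reset) are illustrative conventions from the SNN literature, not taken from the paper itself:

```python
import numpy as np

def lif_step(v, x, beta=0.9, threshold=1.0):
    """One discrete-time step of a leaky integrate-and-fire neuron.

    v: membrane potential, x: input current at this step,
    beta: leak factor in (0, 1), threshold: firing threshold.
    Parameter choices are illustrative, not from the paper.
    """
    v = beta * v + x                        # leaky integration
    spike = v >= threshold                  # binary spike when threshold is crossed
    v = np.where(spike, v - threshold, v)   # soft reset by subtraction
    return v, spike.astype(float)

# Simulate a single neuron driven by a constant input current.
v, spikes = np.asarray(0.0), []
for _ in range(20):
    v, s = lif_step(v, 0.3)
    spikes.append(float(s))
```

With a constant input of 0.3 and leak factor 0.9, the potential climbs toward 3.0 between resets, so the neuron fires repeatedly; recurrent variants would add a feedback term from previous spikes into `x`.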

inproceedings FKB26


MATH4AI @AAAI 2026

Workshop on Foretell of Future AI from Mathematical Foundation at the 40th Conference on Artificial Intelligence. Singapore, Jan 20-27, 2026. To be published. Preprint available.

Authors

A. Fono • G. Kutyniok • H. Boche


Research Area

 A2 | Mathematical Foundations

BibTeX Key: FKB26
