Towards Neuromorphic Machine Intelligence. Spike-Based Representation, Learning, and Applications

  • Book

  • June 2024
  • Elsevier Science and Technology
  • ID: 5955079

Towards Neuromorphic Machine Intelligence: Spike-Based Representation, Learning, and Applications provides readers with an in-depth understanding of Spiking Neural Networks (SNNs), a burgeoning research branch of Artificial Neural Networks (ANNs), AI, and Machine Learning that sits at the heart of the integration between Computer Science and Neural Engineering. In recent years, neural networks have re-emerged in AI, representing a well-grounded paradigm rooted in disciplines ranging from physics and psychology to information science and engineering.
This book represents one of the established crossover areas where neurophysiology, cognition, and neural engineering meet the development of new Machine Learning and AI paradigms. The field has produced many excellent theoretical results on neuron models, learning algorithms, network architectures, and related topics, but these results are scattered and lack a straightforward systematic integration, which makes them difficult for researchers to assimilate and apply. As the third generation of ANNs, SNNs model the neuron dynamics and information transmission of biological neural systems in greater detail, sitting at the intersection of computer science and neuroscience. The book addresses two primary audiences: artificial intelligence researchers who are new to SNNs, and researchers already working on SNNs. For the former, the challenge is that much of the existing literature covers the fundamentals of SNNs only briefly or superficially; this book explains them systematically from scratch. For the latter, the book introduces the latest research results on different aspects of SNNs and provides detailed simulation procedures to facilitate replication. In addition, the book presents neuromorphic hardware architectures as a further extension of SNN systems.

The book starts with the birth and development of SNNs and then introduces the main research hotspots, including spiking neuron models, learning algorithms, network architectures, and neuromorphic hardware. In this way, it gives readers easy access to both the foundational concepts and recent research findings in SNNs.
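
To give a flavor of the spiking neuron models the early chapters cover, the sketch below simulates a leaky integrate-and-fire (LIF) neuron, one of the simplest spiking models. It is an illustrative example only; the parameter values, the constant input current, and the function name simulate_lif are assumptions made for this sketch and are not taken from the book.

    # Minimal sketch of a leaky integrate-and-fire (LIF) spiking neuron.
    # All parameter values and the input current are illustrative assumptions.
    import numpy as np

    def simulate_lif(i_input, dt=1.0, tau_m=20.0, v_rest=-65.0,
                     v_reset=-70.0, v_threshold=-50.0):
        """Integrate an input current over time; return membrane potentials and spike times."""
        v = v_rest
        potentials, spike_times = [], []
        for t, i_t in enumerate(i_input):
            # Leaky integration: decay toward the resting potential, driven by the input.
            v += (-(v - v_rest) + i_t) * dt / tau_m
            if v >= v_threshold:            # crossing the threshold emits a spike
                spike_times.append(t * dt)
                v = v_reset                 # reset the membrane potential after the spike
            potentials.append(v)
        return np.array(potentials), spike_times

    if __name__ == "__main__":
        current = np.full(200, 20.0)        # constant suprathreshold current for 200 ms
        _, spikes = simulate_lif(current)
        print(f"{len(spikes)} spikes at (ms): {spikes}")

Running the script prints a regular train of spike times produced by the constant input, illustrating the spike-based information transmission that distinguishes SNNs from conventional ANNs.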

Please Note: This is an On Demand product; delivery may take up to 11 working days after payment has been received.

Table of Contents

1. Introduction
2. Fundamentals of Spiking Neural Networks
3. Specialized Spiking Neuron Model
4. Learning Algorithms for Shallow Spiking Neural Networks
5. Learning Algorithms for Deep Spiking Neural Networks
6. Neural Column-Inspired Spiking Neural Networks
7. ANN-SNN Algorithm Suitable for Ultra Energy Efficient Application
8. Spiking Deep Belief Networks for Fault Diagnosis
9. Conclusions

Authors

Hong Qu, Professor, Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science and Technology of China, China.

Dr. Hong Qu received the Ph.D. degree in computer science from the University of Electronic Science and Technology of China, Chengdu, China, in 2006. From 2007 to 2008, he was a Post-Doctoral Fellow with the Advanced Robotics and Intelligent Systems Laboratory, School of Engineering, University of Guelph, Guelph, ON, Canada. From 2014 to 2015, he was a Visiting Scholar with the Potsdam Institute for Climate Impact Research, Potsdam, Germany, and the Humboldt University of Berlin, Berlin, Germany. He is currently a Professor with the Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science and Technology of China. His current research interests include neural networks, machine learning, and big data.