🧠 neuromorphic-brain

An interactive educational simulator that models brain-inspired computing architectures, allowing users to explore spiking neural networks, neuromorphic chip designs, and the principles of biological neural computation. Users can visualize neuron firing patterns, synaptic connections, and experiment with event-driven processing paradigms.


📚 Glossary

Action Potential
A brief electrical impulse (spike) generated by a neuron when its membrane potential exceeds a threshold. This all-or-nothing signal propagates along the axon to communicate with other neurons. In neuromorphic systems, spikes are the fundamental unit of information exchange.
Axon
The long projection of a neuron that transmits electrical impulses (spikes) away from the cell body to other neurons. In neuromorphic hardware, axons are implemented as routing channels that deliver spike messages between artificial neurons.
Dendrite
The branching structures of a neuron that receive input signals from other neurons via synapses. In neuromorphic chips, dendritic processing is modeled by input accumulation circuits that integrate incoming spike signals.
Integrate-and-Fire Model
A simplified mathematical model of a neuron that accumulates (integrates) incoming signals over time and generates an output spike when the accumulated value exceeds a threshold, after which the neuron resets. The Leaky Integrate-and-Fire (LIF) variant adds a decay term for more biological realism.
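The leaky integrate-and-fire dynamics described above can be sketched in a few lines of Python. This is an illustrative model, not the simulator's actual code; the `tau`, `threshold`, and `dt` values are arbitrary choices for the example.

```python
import math

def simulate_lif(inputs, tau=20.0, threshold=1.0, dt=1.0):
    """Leaky integrate-and-fire: the membrane potential decays toward
    zero each step, integrates the input current, and spikes on a
    threshold crossing, after which it resets."""
    decay = math.exp(-dt / tau)  # exponential leak per time step
    v = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        v = v * decay + current  # leak, then integrate the input
        if v >= threshold:       # threshold crossing -> emit a spike
            spikes.append(t)
            v = 0.0              # reset membrane potential after firing
    return spikes

# A constant subthreshold input accumulates until the neuron fires,
# producing a regular spike train
spike_times = simulate_lif([0.3] * 20)
```

Dropping the `decay` factor (setting it to 1.0) recovers the plain integrate-and-fire model; the leak is what gives the LIF variant its forgetting of old inputs.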
Membrane Potential
The electrical voltage difference across a neuron's cell membrane. In biological neurons, it determines when a neuron fires. In neuromorphic models, it represents the internal state variable that integrates input and triggers spikes.
Spike-Timing-Dependent Plasticity (STDP)
A biological learning rule where the change in synaptic strength depends on the relative timing of pre- and post-synaptic spikes. If the pre-synaptic neuron fires shortly before the post-synaptic neuron, the synapse is strengthened; the reverse order leads to weakening.
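A pair-based STDP weight update can be sketched as follows. The function name and the learning-rate and time-constant values are illustrative assumptions, not a standard implementation.

```python
import math

def stdp_update(w, pre_t, post_t, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pair-based STDP: the magnitude of the weight change decays
    exponentially with the gap between pre- and post-synaptic
    spike times."""
    dt = post_t - pre_t
    if dt > 0:            # pre fired before post -> potentiation (LTP)
        w += a_plus * math.exp(-dt / tau)
    elif dt < 0:          # post fired before pre -> depression (LTD)
        w -= a_minus * math.exp(dt / tau)
    return max(0.0, w)    # clip weights to stay non-negative

# Causal pairing (pre at t=10, post at t=15) strengthens the synapse;
# the reversed order weakens it
w_up = stdp_update(0.5, pre_t=10, post_t=15)
w_down = stdp_update(0.5, pre_t=15, post_t=10)
```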
Von Neumann Bottleneck
The fundamental performance limitation in conventional computer architectures caused by the separation of memory and processing units, requiring data to be constantly shuttled between them. Neuromorphic architectures avoid this by collocating computation and memory.
Neurosynaptic Core
A fundamental processing unit in neuromorphic chips that contains a group of artificial neurons, their synaptic connections, and local routing logic. Multiple cores are interconnected to form a complete neuromorphic processor.
Lateral Inhibition
A neural mechanism where an active neuron reduces the activity of its neighbors. In neuromorphic systems, this implements winner-take-all competition and is used for tasks like pattern recognition and feature selection.
Refractory Period
The brief period after a neuron fires during which it cannot fire again (absolute refractory period) or requires a stronger stimulus (relative refractory period). This biological constraint is implemented in neuromorphic models to regulate firing rates.
Asynchronous Circuit
An electronic circuit that operates without a global clock signal, processing data as events arrive. Neuromorphic chips often use asynchronous design principles to achieve event-driven, low-power operation.
Crossbar Array
A circuit topology used in some neuromorphic hardware where synaptic weights are stored at the intersections of horizontal and vertical wires, enabling highly parallel matrix-vector multiplication for neural computation.
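The parallel matrix-vector product a crossbar performs can be modeled in plain Python: each column's output current is the dot product of the input vector with that column's stored weights. A software sketch of the analogy, with illustrative names:

```python
def crossbar_mvm(weights, inputs):
    """Crossbar analogy: each row wire carries one input value; the
    current collected on each column wire is the sum of input x weight
    over every crosspoint in that column (one dot product per column).
    In hardware all columns compute simultaneously."""
    n_rows, n_cols = len(weights), len(weights[0])
    return [
        sum(weights[r][c] * inputs[r] for r in range(n_rows))
        for c in range(n_cols)
    ]

# 2 input lines feeding 3 output columns
W = [[0.2, 0.5, 0.1],
     [0.4, 0.1, 0.3]]
column_currents = crossbar_mvm(W, [1, 1])  # both input lines active
```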
Neural Coding
The way information is represented and transmitted by neurons. Rate coding encodes information in spike frequency, while temporal coding uses precise spike timing. Neuromorphic systems can exploit both coding schemes.
Hebbian Learning
A learning principle summarized as 'neurons that fire together, wire together.' It states that synaptic connections are strengthened when pre- and post-synaptic neurons are simultaneously active. STDP is a temporally precise form of Hebbian learning.
Soma
The cell body of a neuron that contains the nucleus and integrates incoming signals from dendrites. In neuromorphic hardware, the soma function is implemented by circuits that accumulate synaptic inputs and generate spikes.
Memristor
A two-terminal electronic component whose resistance changes based on the history of current flow through it. Memristors are promising for neuromorphic computing because they can naturally implement synaptic weight storage and plasticity in hardware.
Winner-Take-All (WTA) Network
A neural circuit motif where multiple neurons compete through lateral inhibition, and only the most strongly activated neuron produces output. WTA networks are used in neuromorphic systems for classification, pattern recognition, and attention mechanisms.
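The competition described above can be sketched with a simple subtractive-inhibition rule (an abstraction of the recurrent dynamics, with illustrative names and an arbitrary inhibition strength):

```python
def winner_take_all(activations, inhibition=1.0):
    """Lateral-inhibition sketch: every neuron is suppressed in
    proportion to the strongest activation, so with full-strength
    inhibition only the winner stays above zero."""
    peak = max(activations)
    return [
        a if a == peak else max(0.0, a - inhibition * peak)
        for a in activations
    ]

# Only the most active neuron keeps its output
winners = winner_take_all([0.2, 0.9, 0.5])
```

Reducing `inhibition` below 1.0 gives a soft winner-take-all, where strong runners-up retain some activity.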
Neuromorphic Sensor
A sensor designed to operate on neuromorphic principles, producing asynchronous, event-driven output rather than fixed-rate frames. Dynamic Vision Sensors (DVS) or event cameras are examples that output pixel-level brightness changes as individual events with microsecond temporal resolution.
Spike Encoding
The process of converting analog input signals (such as sensor data or pixel values) into spike trains that can be processed by spiking neural networks. Methods include rate coding, temporal coding, delta modulation, and population coding.
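Rate coding, the simplest of these methods, can be sketched by treating a normalized input value as a per-step spike probability (illustrative only; the step count and seeding are arbitrary choices):

```python
import random

def rate_encode(value, n_steps=100, seed=0):
    """Rate-coding sketch: a normalized input in [0, 1] becomes the
    per-step probability of emitting a spike (a Bernoulli spike train)."""
    rng = random.Random(seed)  # seeded for reproducibility
    return [1 if rng.random() < value else 0 for _ in range(n_steps)]

# Stronger inputs yield denser spike trains on average
bright_pixel = rate_encode(0.8)
dim_pixel = rate_encode(0.2)
```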
Homeostatic Plasticity
A form of neural plasticity that maintains stable neural activity by adjusting neuron excitability or synaptic strengths in response to prolonged changes in activity levels. In neuromorphic systems, it prevents runaway excitation or complete silencing of neurons.
Reservoir Computing
A computational framework using a fixed, randomly connected recurrent neural network (the reservoir) where only the output layer is trained. Neuromorphic implementations of reservoir computing leverage the inherent dynamics of spiking networks for temporal pattern recognition.
Excitatory and Inhibitory Neurons
Two fundamental types of neurons in neural circuits. Excitatory neurons increase the firing probability of target neurons, while inhibitory neurons decrease it. The balance between excitation and inhibition is critical for stable neural computation in both biological and neuromorphic systems.
Plasticity Rule
A mathematical rule that defines how synaptic weights change in response to neural activity. In neuromorphic systems, plasticity rules such as STDP, BCM, and Oja's rule can be implemented directly in hardware, enabling on-chip learning without external supervision or backpropagation.
TrueNorth
IBM's neuromorphic chip containing 1 million programmable neurons and 256 million configurable synapses, operating at extremely low power (70 mW) for pattern recognition tasks.
Loihi
Intel's neuromorphic research chip implementing 128 neuromorphic cores with programmable synaptic learning rules, enabling on-chip learning and adaptation without external training.
NorthPole
IBM's neuromorphic chip (announced 2023) achieving very high energy efficiency for neural network inference by avoiding the von Neumann bottleneck through a distributed memory architecture.
Dynamic Vision Sensor (DVS)
A neuromorphic camera that detects pixel-level brightness changes asynchronously, producing events with microsecond temporal resolution and extreme dynamic range, mimicking retinal processing.
Neural ODE
A mathematical framework that models neural network layers as continuous transformations, connecting deep learning to dynamical systems theory and enabling more brain-like continuous-time processing.
Brainomorphic
A more specific term for hardware that closely mimics the physical structure and dynamics of biological neural tissue, going beyond functional neuromorphic approaches to replicate biological detail.
Liquid State Machine
A reservoir computing approach using a recurrent spiking neural network (the 'liquid') that transforms input into a high-dimensional representation, with only a readout layer being trained.
Spike Rate Coding
A neural coding scheme where information is encoded in the frequency of spikes over time. Higher firing rates represent stronger signals, similar to how brightness might increase neuron firing rates in the visual cortex.
Temporal Coding
A neural coding scheme where information is encoded in the precise timing of individual spikes rather than their frequency. Temporal coding can carry more information per spike than rate coding.
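One common temporal scheme, time-to-first-spike (latency) coding, can be sketched as a direct mapping from input strength to spike time (an illustrative toy, with an arbitrary time window):

```python
def latency_encode(value, t_max=100):
    """Time-to-first-spike sketch: stronger inputs spike earlier.
    Maps a normalized value in (0, 1] to a single spike time, so one
    well-timed spike carries the whole value."""
    assert 0.0 < value <= 1.0
    return round((1.0 - value) * t_max)  # value 1.0 -> spike at t=0

early = latency_encode(0.9)  # strong input, early spike
late = latency_encode(0.1)   # weak input, late spike
```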
Dendritic Computing
The emerging understanding that dendrites (input branches of neurons) perform sophisticated local computations, not just passive signal transmission. Neuromorphic designs increasingly model dendritic processing.
BrainScaleS
A neuromorphic computing platform developed at Heidelberg University that operates in accelerated time (1000x faster than biological real-time), enabling rapid exploration of neural network dynamics.
Catastrophic Forgetting
A problem in traditional neural networks where learning new information erases previously learned knowledge. Neuromorphic systems with local learning rules such as STDP can be more resistant to this problem, though mitigating it fully remains an active research area.

Spike Train
A sequence of discrete spikes (action potentials) produced by a neuron over time. The pattern of spikes encodes information and is the fundamental communication signal in neuromorphic systems.
Synaptic Weight
A numerical value representing the strength of a connection between two neurons. In neuromorphic hardware, synaptic weights are stored in memory elements (SRAM, memristors, or analog circuits) and modified during learning.
Leaky Integrate-and-Fire (LIF)
The most common simplified neuron model used in neuromorphic computing. Input currents are integrated (accumulated) with a leak (decay), and when the membrane potential reaches a threshold, a spike is emitted.
Neuromorphic Accelerator
A specialized hardware chip optimized for running spiking neural network computations, achieving orders-of-magnitude improvements in energy efficiency compared to general-purpose processors for neural network inference.
Bio-Inspired Computing
A broad term encompassing all computing approaches inspired by biological systems, including neuromorphic computing (brain), evolutionary algorithms (evolution), and swarm intelligence (collective behavior).
Address-Event Representation (AER)
A communication protocol used in neuromorphic hardware where spikes are encoded as addresses of the sending neuron and transmitted asynchronously, enabling efficient inter-chip communication.
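The idea behind AER can be sketched in software: each spike travels as a small packet carrying only a timestamp and the sender's address, and the shared bus serializes them in time order (names here are illustrative, not a real protocol implementation):

```python
def aer_encode(spike_times_by_neuron):
    """AER sketch: each spike becomes a (timestamp, address) packet on
    a shared bus; the sender's address alone identifies the spike, so
    no payload is needed."""
    packets = [
        (t, addr)
        for addr, times in spike_times_by_neuron.items()
        for t in times
    ]
    packets.sort()  # the bus delivers events in time order
    return packets

# Neuron 3 spikes at t=1 and t=5; neuron 7 spikes at t=2
bus_traffic = aer_encode({3: [1, 5], 7: [2]})
```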
Neural Dust
Ultraminiature wireless sensors designed to be implanted in the body to monitor neural activity, representing the extreme miniaturization of neuromorphic sensing technology.
Astrocyte
A type of brain cell once thought to be passive support, now known to modulate synaptic transmission and neural computation. Some neuromorphic systems model astrocyte-neuron interactions for improved learning.

🏆 Key Figures

Carver Mead (1990)

Pioneered the field of neuromorphic engineering in the late 1980s and early 1990s at Caltech. He coined the term 'neuromorphic' and demonstrated that analog VLSI circuits could emulate the neural computations performed by biological nervous systems, establishing the foundational principles of the field.

Mike Davies (2017)

Director of Intel's Neuromorphic Computing Lab and lead architect of the Loihi neuromorphic research chip. Loihi implements spiking neural networks in a manycore architecture with on-chip learning capabilities, supporting programmable synaptic plasticity rules and hierarchical connectivity.

Steve Furber (2006)

Professor at the University of Manchester who led the development of the SpiNNaker (Spiking Neural Network Architecture) project, a massively parallel computing platform designed to simulate large-scale spiking neural networks in real time using a million ARM processors.

Giacomo Indiveri (2011)

Pioneered mixed-signal analog/digital neuromorphic processor designs at ETH Zurich, creating DYNAP-SE chips that implement biologically realistic neural dynamics in silicon with ultra-low power consumption.

Dharmendra Modha (2014)

Led IBM Research's brain-inspired computing group and created TrueNorth, a neuromorphic chip with 1 million neurons and 256 million synapses that consumes only 70 milliwatts of power.

Chris Eliasmith (2012)

Created Nengo and the Semantic Pointer Architecture, providing theoretical frameworks and practical tools for building large-scale functional brain models on neuromorphic hardware.

💬 Message to Learners

Encouragement: The human brain is the most sophisticated information processing system we know of, performing incredible feats of perception, learning, and decision-making while consuming less energy than a light bulb. Neuromorphic computing seeks to unlock these secrets by building machines that compute the way brains do.

Reminder: Every expert was once a beginner. The most important step is the first one - and you've already taken it by being here.

Action: Explore the simulator! Try different settings, experiment freely, and don't be afraid to make mistakes - that's how the best learning happens.

Dream: Perhaps a neuroscientist in Accra will design the chip that truly mimics brain learning. Perhaps an engineer in Beirut will build neuromorphic sensors that give prosthetic limbs a sense of touch. Brain-inspired computing is for everyone.

WIA Vision: WIA Book believes the science of the brain belongs to all humanity. From Seoul to Kampala, from Damascus to Lima - understanding how neurons compute is your birthright. Free forever, in the spirit of Hongik-ingan.

Get Started

Free, no signup required