Action Potential: A brief electrical impulse (spike) generated by a neuron when its membrane potential exceeds a threshold. This all-or-nothing signal propagates along the axon to communicate with other neurons. In neuromorphic systems, spikes are the fundamental unit of information exchange.
Axon: The long projection of a neuron that transmits electrical impulses (spikes) away from the cell body to other neurons. In neuromorphic hardware, axons are implemented as routing channels that deliver spike messages between artificial neurons.
Dendrite: The branching structures of a neuron that receive input signals from other neurons via synapses. In neuromorphic chips, dendritic processing is modeled by input accumulation circuits that integrate incoming spike signals.
Integrate-and-Fire Model: A simplified mathematical model of a neuron that accumulates (integrates) incoming signals over time and generates an output spike when the accumulated value exceeds a threshold, after which the neuron resets. The Leaky Integrate-and-Fire (LIF) variant adds a decay term for more biological realism.
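The integrate, threshold, reset cycle can be sketched in a few lines of Python. This is a minimal discrete-time LIF sketch with illustrative constants (leak, threshold, reset value), not the parameters of any particular chip:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: decay the membrane
# potential, integrate input, and spike-and-reset at threshold.
# All parameter values here are illustrative.

def simulate_lif(input_current, leak=0.9, threshold=1.0, v_reset=0.0):
    """Return the 0/1 spike train for a list of input currents."""
    v = v_reset
    spikes = []
    for i in input_current:
        v = leak * v + i          # leak (decay) then integrate input
        if v >= threshold:        # threshold crossing: emit a spike
            spikes.append(1)
            v = v_reset           # reset membrane potential
        else:
            spikes.append(0)
    return spikes

spikes = simulate_lif([0.4] * 10)
# constant subthreshold input yields a regular spike train:
# [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

Dropping the `leak` factor (setting it to 1.0) recovers the plain integrate-and-fire model described above.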
Membrane Potential: The electrical voltage difference across a neuron's cell membrane. In biological neurons, it determines when a neuron fires. In neuromorphic models, it represents the internal state variable that integrates input and triggers spikes.
Spike-Timing-Dependent Plasticity (STDP): A biological learning rule where the change in synaptic strength depends on the relative timing of pre- and post-synaptic spikes. If the pre-synaptic neuron fires shortly before the post-synaptic neuron, the synapse is strengthened; the reverse order leads to weakening.
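A common pair-based formulation of this timing rule uses an exponential window. The sketch below assumes illustrative constants (`a_plus`, `a_minus`, `tau`); real implementations tune these per application:

```python
# Pair-based STDP sketch: the weight change depends on dt = t_post - t_pre,
# with an exponential time window. Constants are illustrative.
import math

def stdp_dw(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Return the weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:                    # pre fires before post: potentiation
        return a_plus * math.exp(-dt / tau)
    elif dt < 0:                  # post fires before pre: depression
        return -a_minus * math.exp(dt / tau)
    return 0.0

dw_ltp = stdp_dw(t_pre=10.0, t_post=15.0)   # pre leads -> positive change
dw_ltd = stdp_dw(t_pre=15.0, t_post=10.0)   # post leads -> negative change
```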
Von Neumann Bottleneck: The fundamental performance limitation in conventional computer architectures caused by the separation of memory and processing units, requiring data to be constantly shuttled between them. Neuromorphic architectures avoid this by collocating computation and memory.
Neurosynaptic Core: A fundamental processing unit in neuromorphic chips that contains a group of artificial neurons, their synaptic connections, and local routing logic. Multiple cores are interconnected to form a complete neuromorphic processor.
Lateral Inhibition: A neural mechanism where an active neuron reduces the activity of its neighbors. In neuromorphic systems, this implements winner-take-all competition and is used for tasks like pattern recognition and feature selection.
Refractory Period: The brief period after a neuron fires during which it cannot fire again (absolute refractory period) or requires a stronger stimulus (relative refractory period). This biological constraint is implemented in neuromorphic models to regulate firing rates.
Asynchronous Circuit: An electronic circuit that operates without a global clock signal, processing data as events arrive. Neuromorphic chips often use asynchronous design principles to achieve event-driven, low-power operation.
Crossbar Array: A circuit topology used in some neuromorphic hardware where synaptic weights are stored at the intersections of horizontal and vertical wires, enabling highly parallel matrix-vector multiplication for neural computation.
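The computation a crossbar performs in a single analog step is an ordinary matrix-vector product: each row wire carries an input, and each column wire sums the currents through the weights at its crosspoints. A pure-Python sketch of the equivalent arithmetic (sizes and values are illustrative):

```python
# Crossbar array as matrix-vector multiplication: y = W @ x, where each
# column output is the sum of (weight * input) over its crosspoints.

def crossbar_matvec(weights, inputs):
    """weights: rows x cols matrix (list of lists); inputs: one value per row.
    Returns one summed output per column, like the array's column currents."""
    n_cols = len(weights[0])
    outputs = [0.0] * n_cols
    for row, x in zip(weights, inputs):
        for j, w in enumerate(row):
            outputs[j] += w * x   # current contribution of crosspoint (i, j)
    return outputs

y = crossbar_matvec([[1.0, 0.5],
                     [0.0, 2.0]], [1.0, 3.0])
# y == [1.0, 6.5]
```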
Neural Coding: The way information is represented and transmitted by neurons. Rate coding encodes information in spike frequency, while temporal coding uses precise spike timing. Neuromorphic systems can exploit both coding schemes.
Hebbian Learning: A learning principle summarized as 'neurons that fire together, wire together.' It states that synaptic connections are strengthened when pre- and post-synaptic neurons are simultaneously active. STDP is a temporally precise form of Hebbian learning.
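In its simplest rate-based form, the Hebbian principle is a multiplicative update: a weight grows in proportion to the product of pre- and post-synaptic activity. A sketch with an illustrative learning rate `eta`:

```python
# Basic Hebbian update sketch: dw = eta * pre * post, so a weight is
# strengthened only when pre- and post-synaptic activity coincide.
# The learning rate eta is illustrative.

def hebbian_update(weights, pre, post, eta=0.01):
    """Update a vector of synaptic weights given pre-synaptic activities
    and a single post-synaptic activity."""
    return [w + eta * x * post for w, x in zip(weights, pre)]

w = hebbian_update([0.5, 0.5], pre=[1.0, 0.0], post=1.0)
# only the weight of the active pre-synaptic input changes
```

Note that this plain rule only strengthens weights; practical variants (Oja's rule, BCM) add normalization or thresholds to keep weights bounded.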
Soma: The cell body of a neuron that contains the nucleus and integrates incoming signals from dendrites. In neuromorphic hardware, the soma function is implemented by circuits that accumulate synaptic inputs and generate spikes.
Memristor: A two-terminal electronic component whose resistance changes based on the history of current flow through it. Memristors are promising for neuromorphic computing because they can naturally implement synaptic weight storage and plasticity in hardware.
Winner-Take-All (WTA) Network: A neural circuit motif where multiple neurons compete through lateral inhibition, and only the most strongly activated neuron produces output. WTA networks are used in neuromorphic systems for classification, pattern recognition, and attention mechanisms.
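The idealized (hard) limit of this competition is easy to state in code: the most strongly activated neuron fully suppresses the rest. This sketch abstracts away the inhibitory dynamics and just shows the steady-state outcome:

```python
# Hard winner-take-all sketch: idealized lateral inhibition lets only the
# most strongly activated neuron produce output.

def winner_take_all(activations):
    """Return a 0/1 output vector with a 1 only at the most active neuron."""
    winner = max(range(len(activations)), key=lambda i: activations[i])
    return [1 if i == winner else 0 for i in range(len(activations))]

out = winner_take_all([0.2, 0.9, 0.5])
# out == [0, 1, 0]
```

Soft-WTA variants let a few of the strongest neurons remain active instead of exactly one.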
Neuromorphic Sensor: A sensor designed to operate on neuromorphic principles, producing asynchronous, event-driven output rather than fixed-rate frames. Dynamic Vision Sensors (DVS), also called event cameras, are examples that output pixel-level brightness changes as individual events with microsecond temporal resolution.
Spike Encoding: The process of converting analog input signals (such as sensor data or pixel values) into spike trains that can be processed by spiking neural networks. Methods include rate coding, temporal coding, delta modulation, and population coding.
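Rate coding is the simplest of these methods to sketch: an analog value in [0, 1] becomes the per-step firing probability of a Poisson-like spike train. The function below is an illustrative sketch, with a fixed seed so the result is reproducible:

```python
# Rate-coding sketch: encode an analog value in [0, 1] as a stochastic
# spike train whose firing probability per time step equals the value.
import random

def rate_encode(value, n_steps=100, seed=0):
    """Encode one analog value as a binary spike train of length n_steps."""
    rng = random.Random(seed)   # seeded for reproducibility
    return [1 if rng.random() < value else 0 for _ in range(n_steps)]

train = rate_encode(0.3, n_steps=1000)
mean_rate = sum(train) / len(train)   # close to the encoded value 0.3
```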
Homeostatic Plasticity: A form of neural plasticity that maintains stable neural activity by adjusting neuron excitability or synaptic strengths in response to prolonged changes in activity levels. In neuromorphic systems, it prevents runaway excitation or complete silencing of neurons.
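One common way to realize this is threshold adaptation: nudge a neuron's firing threshold up when it fires too often and down when it falls silent. A sketch with illustrative constants (`target_rate`, `gain`):

```python
# Homeostatic threshold adaptation sketch: move the firing threshold toward
# whatever value yields the target firing rate. Constants are illustrative.

def adapt_threshold(threshold, observed_rate, target_rate=0.1, gain=0.5):
    """Return an updated threshold nudged toward the target firing rate."""
    return threshold + gain * (observed_rate - target_rate)

th_hot = adapt_threshold(1.0, observed_rate=0.4)   # too active: raise threshold
th_cold = adapt_threshold(1.0, observed_rate=0.0)  # silent: lower threshold
```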
Reservoir Computing: A computational framework using a fixed, randomly connected recurrent neural network (the reservoir) where only the output layer is trained. Neuromorphic implementations of reservoir computing leverage the inherent dynamics of spiking networks for temporal pattern recognition.
Excitatory and Inhibitory Neurons: Two fundamental types of neurons in neural circuits. Excitatory neurons increase the firing probability of target neurons, while inhibitory neurons decrease it. The balance between excitation and inhibition is critical for stable neural computation in both biological and neuromorphic systems.
Plasticity Rule: A mathematical rule that defines how synaptic weights change in response to neural activity. In neuromorphic systems, plasticity rules such as STDP, BCM, and Oja's rule can be implemented directly in hardware, enabling on-chip learning without external supervision or backpropagation.
TrueNorth: IBM's neuromorphic chip containing 1 million programmable neurons and 256 million configurable synapses, operating at extremely low power (70 mW) for pattern recognition tasks.
Loihi: Intel's neuromorphic research chip implementing 128 neuromorphic cores with programmable synaptic learning rules, enabling on-chip learning and adaptation without external training.
NorthPole: IBM's latest neuromorphic chip (2023), achieving high energy efficiency for neural network inference by eliminating the von Neumann bottleneck through a distributed memory architecture.
Dynamic Vision Sensor (DVS): A neuromorphic camera that detects pixel-level brightness changes asynchronously, producing events with microsecond temporal resolution and extreme dynamic range, mimicking retinal processing.
Neural ODE: A mathematical framework that models neural network layers as continuous transformations, connecting deep learning to dynamical systems theory and enabling more brain-like continuous-time processing.
Brainomorphic: A more specific term for hardware that closely mimics the physical structure and dynamics of biological neural tissue, going beyond functional neuromorphic approaches to replicate biological detail.
Liquid State Machine: A reservoir computing approach using a recurrent spiking neural network (the 'liquid') that transforms input into a high-dimensional representation, with only a readout layer being trained.
Spike Rate Coding: A neural coding scheme where information is encoded in the frequency of spikes over time. Higher firing rates represent stronger signals, similar to how brightness might increase neuron firing rates in the visual cortex.
Temporal Coding: A neural coding scheme where information is encoded in the precise timing of individual spikes rather than their frequency. Temporal coding can carry more information per spike than rate coding.
Dendritic Computing: The emerging understanding that dendrites (input branches of neurons) perform sophisticated local computations, not just passive signal transmission. Neuromorphic designs increasingly model dendritic processing.
BrainScaleS: A neuromorphic computing platform developed at Heidelberg University that operates in accelerated time (1000x faster than biological real-time), enabling rapid exploration of neural network dynamics.
Catastrophic Forgetting: A problem in traditional neural networks where learning new information erases previously learned knowledge. Neuromorphic systems with local learning rules like STDP can be more resistant to this problem.
Spike Train: A sequence of discrete spikes (action potentials) produced by a neuron over time. The pattern of spikes encodes information and is the fundamental communication signal in neuromorphic systems.
Synaptic Weight: A numerical value representing the strength of a connection between two neurons. In neuromorphic hardware, synaptic weights are stored in memory elements (SRAM, memristors, or analog circuits) and modified during learning.
Leaky Integrate-and-Fire (LIF): The most common simplified neuron model used in neuromorphic computing. Input currents are integrated (accumulated) with a leak (decay), and when the membrane potential reaches a threshold, a spike is emitted.
Neuromorphic Accelerator: A specialized hardware chip optimized for running spiking neural network computations, achieving orders-of-magnitude improvements in energy efficiency compared to general-purpose processors for neural network inference.
Bio-Inspired Computing: A broad term encompassing all computing approaches inspired by biological systems, including neuromorphic computing (brain), evolutionary algorithms (evolution), and swarm intelligence (collective behavior).
Address-Event Representation (AER): A communication protocol used in neuromorphic hardware where spikes are encoded as addresses of the sending neuron and transmitted asynchronously, enabling efficient inter-chip communication.
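The essence of AER is that a spike on the wire is just the firing neuron's address (often with a timestamp), and the receiver rebuilds per-neuron spike trains from the shared event stream. A simplified software sketch (the tuple layout and function names are illustrative, not a hardware protocol specification):

```python
# AER sketch: spikes travel as (timestamp, neuron_address) events on a
# shared bus; the receiver demultiplexes them back into per-neuron trains.

def aer_encode(spike_events):
    """spike_events: list of (timestamp, neuron_address) tuples from a core.
    Returns the events in time order, as they would appear on the AER bus."""
    return sorted(spike_events)

def aer_decode(events, n_neurons):
    """Rebuild per-neuron spike-time lists from an AER event stream."""
    trains = [[] for _ in range(n_neurons)]
    for t, addr in events:
        trains[addr].append(t)
    return trains

bus = aer_encode([(3, 1), (1, 0), (2, 1)])
trains = aer_decode(bus, n_neurons=2)
# trains == [[1], [2, 3]]: neuron 0 fired at t=1, neuron 1 at t=2 and t=3
```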
Neural Dust: Ultraminiature wireless sensors designed to be implanted in the body to monitor neural activity, representing the extreme miniaturization of neuromorphic sensing technology.
Astrocyte: A type of brain cell once thought to be passive support, now known to modulate synaptic transmission and neural computation. Some neuromorphic systems model astrocyte-neuron interactions for improved learning.