The use of very-large-scale integration (VLSI) systems with electronic analogue circuits to replicate the neuro-biological architectures seen in the nervous system is known as neuromorphic engineering, also referred to as neuromorphic computing.
Any device that performs computation using silicon-based physical artificial neurons is referred to as a neuromorphic computer or chip. In recent years, analogue, digital, mixed-mode analogue/digital VLSI, and software systems that implement models of neural systems have all been described as neuromorphic.
At the hardware level, neuromorphic computing can be implemented with oxide-based memristors, spintronic memories, threshold switches, and transistors.
Software-based neuromorphic systems of spiking neural networks can be trained with error backpropagation, for example using Python-based frameworks such as snnTorch, or with canonical learning rules drawn from the biological learning literature.
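To make the spiking-neuron idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic unit simulated in spiking neural network frameworks such as snnTorch. The function name and parameter values below are illustrative assumptions, not the API of any particular library.

```python
def lif_step(mem, input_current, beta=0.9, threshold=1.0):
    """One discrete time step of a leaky integrate-and-fire neuron.

    mem           -- membrane potential carried over from the previous step
    input_current -- weighted input arriving at this step
    beta          -- leak factor (fraction of potential retained per step)
    threshold     -- potential at which the neuron emits a spike
    """
    mem = beta * mem + input_current       # leaky integration of input
    spike = 1 if mem >= threshold else 0   # fire when threshold is crossed
    mem = mem - spike * threshold          # reset by subtraction after a spike
    return spike, mem

# Drive the neuron with a constant current and record its spike train.
mem, spikes = 0.0, []
for _ in range(10):
    spk, mem = lif_step(mem, input_current=0.4)
    spikes.append(spk)

print(spikes)  # the neuron settles into a regular firing pattern
```

Because the spike is a hard threshold (non-differentiable), frameworks like snnTorch substitute a smooth surrogate gradient for it during backpropagation, which is what makes gradient-based training of such networks possible.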
A key component of neuromorphic engineering is understanding how the morphology of individual neurons, circuits, software, and overall architectures shapes computation and the representation of information, confers robustness to damage, incorporates learning and development, adapts to local change (plasticity), and facilitates evolutionary change.
The field was pioneered by Carver Mead in the late 1980s.
Neuromorphic engineering draws inspiration from biology, physics, mathematics, computer science, and electronic engineering to design artificial neural systems, such as vision systems, head-eye systems, auditory processors, and autonomous robots, whose physical architecture and design principles are based on those of biological nervous systems.