Current Research Projects at the NMI lab


Lifelong Learning Machines

Brain-inspired approaches can have a profound impact on making AI systems robust to the dynamic nature of real-world problems. Within DARPA's Lifelong Learning Machines (L2M) program, we strive to bridge machine learning and neuroscience to reproduce the underpinnings of dynamical learning in AI. Furthermore, we expect findings from L2M to guide us in building neuromorphic learning machines that learn continuously, extending existing knowledge with new experiences. Our work pursues two approaches: a hierarchical reinforcement learning framework that combines model-free and model-based methods, and automatic learning of network structure to prevent catastrophic forgetting in sequential multi-task learning scenarios.
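One classic way to combine model-free and model-based reinforcement learning is a Dyna-style loop, sketched below: every real transition drives a model-free Q-update and also trains a one-step model that is replayed for extra planning updates. The toy chain environment and all constants here are illustrative assumptions, not the lab's actual L2M code.

```python
import random
from collections import defaultdict

N_STATES, ACTIONS = 5, (0, 1)            # chain states 0..4; action 0 = left, 1 = right
ALPHA, GAMMA, PLAN_STEPS = 0.5, 0.9, 10

Q = defaultdict(float)                   # Q[(state, action)]
model = {}                               # learned model: (s, a) -> (reward, next state)

def env_step(s, a):
    """Deterministic toy chain: reward 1.0 for reaching the rightmost state."""
    s2 = max(0, min(N_STATES - 1, s + (1 if a == 1 else -1)))
    return (1.0 if s2 == N_STATES - 1 else 0.0), s2

def q_update(s, a, r, s2):
    best_next = max(Q[(s2, b)] for b in ACTIONS)
    Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])

random.seed(0)
s = 0
for _ in range(500):
    a = random.choice(ACTIONS)           # explore randomly; evaluate greedily below
    r, s2 = env_step(s, a)
    q_update(s, a, r, s2)                # model-free update from real experience
    model[(s, a)] = (r, s2)              # model-based part: remember the transition
    for _ in range(PLAN_STEPS):          # planning: replay transitions from the model
        ps, pa = random.choice(list(model))
        pr, ps2 = model[(ps, pa)]
        q_update(ps, pa, pr, ps2)
    s = 0 if s2 == N_STATES - 1 else s2  # restart the episode at the goal

greedy = [max(ACTIONS, key=lambda b: Q[(st, b)]) for st in range(N_STATES - 1)]
```

Because each real step is amplified by ten replayed planning steps, the greedy policy (move right in every state) emerges with far less real experience than model-free learning alone would need.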

EXtremely Energy Efficient Collective ELectronics (EXCEL)

The primary focus of this multi-disciplinary research effort is to develop a new paradigm of computing that enables hardware-accelerated data analytics capable of extracting information from unlabeled and unstructured data. This research is expected to uncover fundamentally new ways of harnessing coupled dynamical systems for solving computationally hard problems in an energy-efficient way. With innovations in novel materials and devices, chip-scale dynamical system implementation, architectural changes, and critical benchmarking, EXCEL will lay the foundation for a new non-von Neumann computing paradigm that achieves orders-of-magnitude improvements in computational energy efficiency. This project is a collaboration with the University of Notre Dame, Georgia Tech, UCSD, UCI, and Penn State. At UCI, the NMI lab will contribute inference and learning algorithms that exploit the intrinsic stochasticity of these novel materials and devices. The properties of the devices developed in this project share some similarities with biological neurons, a similarity that is expected to enable a novel computing paradigm and hardware operating with energy efficiencies and proficiencies approaching those of the human brain.
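As a software-only illustration of how a coupled dynamical system can attack a hard combinatorial problem (the general idea behind oscillator-based Ising machines, not the actual EXCEL hardware), the sketch below couples phase oscillators negatively along graph edges so neighbors relax toward anti-phase, while a slowly ramped second-harmonic term snaps each phase to 0 or pi; the binarized phases then define a graph cut. Graph, constants, and function names are all assumptions for the demo.

```python
import math
import random

EDGES = [(0, 1), (1, 2), (2, 3), (3, 0)]     # 4-cycle; maximum cut = 4
N, K, KS_MAX, DT, STEPS = 4, 1.0, 2.0, 0.05, 2000

def relax(seed):
    """Integrate the coupled oscillators from one random start; return the cut size."""
    rng = random.Random(seed)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(N)]
    for step in range(STEPS):
        ks = KS_MAX * step / STEPS           # anneal the binarizing term from 0 up
        d = [0.0] * N
        for i, j in EDGES:
            # negative coupling drives connected oscillators toward opposite phases
            d[i] -= K * math.sin(theta[j] - theta[i])
            d[j] -= K * math.sin(theta[i] - theta[j])
        for i in range(N):
            d[i] -= ks * math.sin(2.0 * theta[i])  # snap phases toward {0, pi}
            theta[i] += DT * d[i]                  # forward-Euler step
    spins = [1 if math.cos(t) > 0 else -1 for t in theta]
    return sum(1 for i, j in EDGES if spins[i] != spins[j])

best_cut = max(relax(seed) for seed in range(5))   # a few random restarts
```

The "computation" here is nothing but the physics of the coupled system relaxing to a low-energy state, which is why device-level implementations of such dynamics promise large energy savings over iterating the same search on a von Neumann machine.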


Digital Neuromorphic Computing Architectures

This project aims to build a digital neuromorphic system that provides flexible, dynamic (on-line) learning. The Neural and Synaptic Array Transceiver (NSAT) is a digital neuromorphic processor with highly programmable spiking neural dynamics and embedded, spike-driven synaptic plasticity. NSAT rests on the assumption that extreme efficiency in scalable neuromorphic learning machines for data-driven autonomy hinges on a combination of two factors: first, neural algorithms whose basic operations have computational power on the same order as that of a Multiply-Accumulate (MAC) unit; second, a neuromorphic design that emphasizes locally dense and globally sparse, hierarchical event-based communication to perform these operations using orders of magnitude less power than a MAC operation (e.g., the 13 pJ/MAC of future GPUs targeted by the DARPA PERFECT program). We envision that the combination of these two factors will lead to at least 1,000-fold better efficiency in solving complex cognitive tasks at human-level proficiency. This project is a collaboration between UC Irvine, UC San Diego, and Intel Corporation.
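To make the two design principles concrete, the sketch below shows a multiplier-free, event-driven integrate-and-fire update in the spirit of those goals: integer state, a shift-based leak instead of a multiply, and synaptic input applied only when a presynaptic event arrives. The constants and structure are illustrative assumptions, not the actual NSAT specification.

```python
LEAK_SHIFT = 4        # leak: v -= v >> 4, i.e. roughly v *= (1 - 1/16), no multiplier
THRESHOLD = 100

def lif_step(v, input_events, weights):
    """One integer LIF timestep; returns (new membrane state, spiked flag)."""
    v -= v >> LEAK_SHIFT                 # shift-based leak in place of a MAC
    for pre in input_events:             # event-driven: touch only active synapses
        v += weights[pre]
    if v >= THRESHOLD:
        return 0, True                   # reset and emit an output spike event
    return v, False

weights = {0: 30, 1: 45, 2: 25}          # integer synaptic weights per presynaptic id
v, spikes = 0, []
# a short input spike train: list of presynaptic neuron ids arriving per timestep
for t, events in enumerate([[0, 1], [2], [], [0, 1, 2], [1]]):
    v, fired = lif_step(v, events, weights)
    if fired:
        spikes.append(t)
```

When no events arrive (timestep 2 above), the only work is a single shift-and-subtract, which is the efficiency argument in miniature: cost scales with event traffic, not with the number of synapses.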

On-line, spike-based deep learning and Stochastic Spiking Neural Networks

One ongoing challenge in brain-inspired (neuromorphic) computing is to devise general and computationally efficient models of inference and learning that are compatible with the spatial and temporal constraints of the brain. This research topic investigates learning rules that use modulated, membrane-based synaptic plasticity to learn deep representations in brain-inspired, stochastic computing hardware. Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. We introduced the Synaptic Sampling Machine (SSM), a stochastic neural network model that uses synaptic unreliability as a source of stochasticity for sampling.
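The core mechanism can be sketched in a few lines: each synapse transmits independently with probability p, so repeated passes over the same input draw different network samples, and averaging those samples yields Monte Carlo firing-rate estimates. This is only a hedged toy illustration of unreliable synapses (the weights, sizes, and threshold nonlinearity are assumptions), not the SSM model itself.

```python
import random

random.seed(0)
P_TRANSMIT = 0.5                                  # per-synapse transmission probability
weights = [[0.8, -0.4, 0.3], [0.2, 0.9, -0.7]]    # 2 inputs -> 3 binary units

def stochastic_forward(x):
    """One network sample: every synapse independently fails with prob 1 - p."""
    h = [0.0, 0.0, 0.0]
    for i, xi in enumerate(x):
        for j, w in enumerate(weights[i]):
            if random.random() < P_TRANSMIT:      # unreliable synapse
                h[j] += xi * w
    return [1.0 if hj > 0 else 0.0 for hj in h]   # binary (spiking-like) units

x = [1.0, 1.0]
samples = [stochastic_forward(x) for _ in range(1000)]
# Monte Carlo estimate of each unit's firing probability:
rates = [sum(s[j] for s in samples) / len(samples) for j in range(3)]
```

The same deterministic input produces a distribution over outputs purely through synaptic failures, which is the sense in which unreliable transmission serves as the network's sampling mechanism.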
