Update JULAIN Talk Emre, 15 November 2021, authored by Susanne Wenzel
# JULAIN Talks

We welcome **Emre Neftci**, new head of PGI-15, to FZJ and to the JULAIN community, and are happy to announce that he will give an inaugural JULAIN talk on
**Meta-Training Neuromorphic Hardware with Differentiable Programming**
as a virtual meeting using [BigBlueButton](https://webconf.f
Online “life-long” learning at the edge is an aspirational goal of AI technologies. Neuromorphic hardware is particularly attractive in this regard, thanks to its emphasis on local learning algorithms and its potential compatibility with future and emerging devices. With recent advances in training neuromorphic hardware using differentiable programming, it is now possible to achieve competitive accuracy and performance compared to Deep Neural Networks (DNNs). In practice, however, the data-intensive and iterative training procedure that powers DNNs is incompatible with the device non-idealities and real-time operation that characterize neuromorphic hardware. In this talk, I will argue that gradient-based meta-learning can play a critical role in closing this gap, enabling accurate few-shot learning even in the presence of certain severe device non-idealities. As a result, meta-learning has the potential to redefine the target metrics used in the design of emerging nanodevices for neuromorphic computing.
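The abstract's core idea, gradient-based meta-learning, can be illustrated with a minimal sketch. This is not code from the talk; it is a toy first-order MAML loop (the first-order approximation that drops second derivatives) on made-up scalar linear-regression tasks, with all function names and the task distribution chosen here for illustration. The inner loop adapts to one task with a single gradient step; the outer loop nudges the shared initialization so that this one step works well across tasks, which is the "few-shot learning from a meta-learned starting point" setting the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task(slope_range=2.0):
    """A toy task: a linear map y = a*x with a task-specific slope a."""
    a = rng.uniform(-slope_range, slope_range)
    def draw(n=10):
        x = rng.uniform(-1.0, 1.0, size=n)
        return x, a * x
    return draw

def mse_grad(theta, x, y):
    """Gradient of mean squared error for the scalar model y_hat = theta * x."""
    return np.mean(2.0 * (theta * x - y) * x)

def meta_train(meta_steps=2000, inner_lr=0.1, outer_lr=0.01):
    """First-order MAML: adapt on a support set with one gradient step,
    then update the meta-parameter with the post-adaptation gradient
    evaluated on a query set drawn from the same task."""
    theta = 0.0
    for _ in range(meta_steps):
        draw = sample_task()
        xs, ys = draw()                                   # support set
        phi = theta - inner_lr * mse_grad(theta, xs, ys)  # inner adaptation
        xq, yq = draw()                                   # query set, same task
        theta -= outer_lr * mse_grad(phi, xq, yq)         # outer (meta) update
    return theta

theta = meta_train()

# Few-shot adaptation on an unseen task: one gradient step from the
# meta-learned initialization, using only a small support set.
draw = sample_task()
x_support, y_support = draw()
phi = theta - 0.1 * mse_grad(theta, x_support, y_support)
```

Because the toy loss is a convex quadratic and the inner step size is small, the single adaptation step can only reduce the task error; the meta-learned `theta` is the initialization from which that one step is most effective on average over tasks.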
**Emre Neftci** received an M.Sc. degree in physics from Ecole Polytechnique Federale de Lausanne, Switzerland, and a Ph.D. degree from the Institute of Neuroinformatics, University of Zurich and ETH Zurich, in 2010. Before joining FZJ he was Assistant Professor with the Department of Cognitive Sciences and Computer Science, University of California at Irvine. He was a post-doctoral fellow at the Institute of Neural Computation (INC), UCSD, investigating models for probabilistic state-dependent sensorimotor processing in large-scale multi-neuron systems. In his current research he explores the bridges between neuroscience and machine learning, with a focus on the theoretical and computational modeling of learning algorithms that are best suited to neuromorphic hardware and non-von Neumann computing architectures.