Forrest Sheldon

Postdoc
T-4/CNLS

Statistical Physics, Neuromorphic Computing

Office: TA-3, Bldg 1690, Room 136
Mail Stop: B258
Phone: (505) 667-9331
Fax: (505) 665-2659

fsheldon@lanl.gov

Educational Background/Employment:
  • B.S. (2009) Physics, Duke University
  • Ph.D. (2019) Physics, UCSD

Research Interests:

    I am interested in a variety of problems across statistical physics, neuromorphic computing, and inference. At LANL, I currently divide my time among three projects:
  • Statistical Physics: We are interested in understanding the network dynamics of nanoscale electrical devices known as memristors, or resistors with memory. These two-terminal devices act like switches whose resistance can be driven between a conducting and an insulating state by the flow of current. When arranged in networks, the devices interact through the distribution of currents, shifting current flows as their resistances vary. We are interested in the behavior of these networks in the large-N limit, where we can identify phases of behavior analogous to those of magnetic systems. Recently, by means of a replica calculation on the network's Lyapunov function, we have shown that a glassy phase is expected in these networks. (A minimal simulation sketch of memristor dynamics follows this list.)
  • Neuromorphic and Analog Computing: Computing with distributed analog systems is an approach that aims to improve performance by embedding algorithms directly in hardware. Within this field, neuromorphic computing seeks to mimic the massively parallel architecture of neural systems to achieve fast, low-power, and fault-tolerant computation. Memristors are often described as synapses in silico, since their conductance is directly controlled by their use. We are interested in the asymptotic dynamics of memristive circuits and how they might be applied to computational problems. We have demonstrated that networks of memristors possess a Lyapunov function and are working to connect this result with applications in optimization. Additionally, high-performance spiking neural hardware has received industry support in the form of Intel's Loihi and IBM's TrueNorth neuromorphic chips. We have been working on an approach that embeds backpropagation directly into neuromorphic hardware, opening the possibility of rapid, low-power training of spiking neural networks. (A sketch of the type of spiking neuron such hardware implements follows this list.)
  • Inference: The analysis of spin glasses is complicated by the fact that the magnetic/pure states are specific to an individual system and are generally not easily extracted from the Hamiltonian. We are applying machine learning techniques to infer an effective Hamiltonian from trajectories of the system, and from these effective descriptions we extract information about the structure of the pure states and the dynamics of the system. (A toy example of this kind of inference follows this list.)
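
As a concrete illustration of the first project above, the following is a minimal Python sketch of a series chain of memristors driven by a constant voltage. It assumes the common linear device model R(x) = R_on*x + R_off*(1 - x) with a simple current-driven state variable; the model, parameter values, and update rule are illustrative choices, not the specific equations used in our work.

    import numpy as np

    # Illustrative chain of N memristors in series with a DC voltage source.
    # Linear model: R(x) = R_on * x + R_off * (1 - x), with the internal
    # state x in [0, 1] driven by the branch current.
    rng = np.random.default_rng(0)
    N = 10                       # number of memristors in the chain
    R_on, R_off = 1e2, 1e4       # conducting / insulating resistances (ohms)
    beta = 1e5                   # state-update rate (1/(A*s)), illustrative
    V_drive = 5.0                # applied voltage across the chain (volts)
    dt, steps = 1e-4, 5000       # time step (s) and number of Euler steps

    x = rng.uniform(0.1, 0.9, size=N)     # random initial internal states

    for _ in range(steps):
        R = R_on * x + R_off * (1.0 - x)  # per-device resistance
        I = V_drive / R.sum()             # series current from Kirchhoff's laws
        # The shared current couples the devices: as any device becomes more
        # conducting, the current grows and drives the others as well.
        x = np.clip(x + dt * beta * I, 0.0, 1.0)

    print("final resistances (ohms):", R_on * x + R_off * (1.0 - x))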
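
The spiking hardware mentioned in the second project implements neuron models of the leaky integrate-and-fire type. The sketch below simulates a single such neuron with an Euler integration step; all parameter values are illustrative, and this is not a model of Loihi or TrueNorth specifically.

    import numpy as np

    # Single leaky integrate-and-fire (LIF) neuron driven by a noisy current.
    tau_m = 20e-3                # membrane time constant (s)
    v_th, v_reset = 1.0, 0.0     # spike threshold and reset potential
    dt, T = 1e-4, 0.5            # time step and total simulated time (s)

    steps = int(T / dt)
    rng = np.random.default_rng(0)
    input_current = 1.2 + 0.5 * rng.standard_normal(steps)  # noisy drive

    v = 0.0
    spike_times = []
    for t in range(steps):
        # Euler step of dv/dt = (-v + I(t)) / tau_m
        v += dt * (-v + input_current[t]) / tau_m
        if v >= v_th:                    # threshold crossing emits a spike
            spike_times.append(t * dt)
            v = v_reset                  # membrane potential resets

    print(f"{len(spike_times)} spikes, mean rate = {len(spike_times) / T:.1f} Hz")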
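
The flavor of the third project can be seen in a toy inverse-Ising calculation: draw configurations of a small random-coupling Ising model with Metropolis dynamics, then reconstruct the couplings with a naive mean-field inversion of the correlation matrix. This is a generic textbook technique shown only for illustration, not the machine learning approach used in the project.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 8                                   # number of spins
    J_true = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
    J_true = (J_true + J_true.T) / 2.0      # symmetric couplings
    np.fill_diagonal(J_true, 0.0)

    def metropolis_samples(J, n_samples, n_sweeps=20, beta=1.0):
        """Draw spin configurations from H = -sum_{i<j} J_ij s_i s_j."""
        s = rng.choice([-1, 1], size=J.shape[0])
        samples = []
        for _ in range(n_samples):
            for _ in range(n_sweeps * len(s)):
                i = rng.integers(len(s))
                dE = 2.0 * s[i] * (J[i] @ s)   # energy change of flipping spin i
                if dE < 0 or rng.random() < np.exp(-beta * dE):
                    s[i] *= -1
            samples.append(s.copy())
        return np.array(samples)

    S = metropolis_samples(J_true, n_samples=2000)
    C = np.cov(S, rowvar=False) + 1e-6 * np.eye(N)  # regularized correlations
    J_inf = -np.linalg.inv(C)                       # naive mean-field inversion
    np.fill_diagonal(J_inf, 0.0)

    off = ~np.eye(N, dtype=bool)
    corr = np.corrcoef(J_true[off], J_inf[off])[0, 1]
    print(f"correlation between true and inferred couplings: {corr:.2f}")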

Selected Recent Publications:

  1. A. Renner et al., Implementing Backpropagation for Learning on Neuromorphic Spiking Hardware, Proceedings of the NICE Workshop (NICE '20), 1-3 (2020).
  2. F. Sheldon and F. Caravelli, The Computational Capacity of Mem-LRC Reservoirs, Proceedings of the NICE Workshop (NICE '20), 1-4 (2020).
  3. F. Sheldon, F. Traversa, and M. Di Ventra, Taming a non-convex landscape with dynamical long-range order: memcomputing Ising benchmarks, Phys. Rev. E 100(5), 053311 (2019).