Research

[Out of date. Updates coming soon.]

Chaos and dynamical phase transitions in recurrent neural networks

The firing patterns of neurons in the central nervous system often exhibit strong temporal irregularity and spatial heterogeneity. These properties are the outcome of the intrinsic chaotic dynamics of neural circuits. Here, we prove the universality of the transition to chaos within a mathematical framework that covers a broad class of networks with realistic connectivity architectures, and we reveal how this transition depends on the form of the nonlinear input-output transfer function and on the synaptic time constants.

We theoretically study the onset of chaos using rate-based dynamics of sparsely connected networks composed of multiple neuronal subpopulations. We find that the sharpness of the local nonlinearity has a crucial impact on the onset of chaos. In spiking networks, a sharp onset of chaos exists only in the limit of a slow synaptic time constant. We show that in this case the onset of chaos is a second-order phase transition, and a proper scaling analysis provides a quantitative description of the crossover from stationary to fluctuating rates for biologically realistic parameters.
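
As a minimal illustration of the phenomenon (a sketch of the classic dense Gaussian rate network, not the sparse multi-population model analyzed in this work), one can simulate the rate dynamics directly and observe the transition at the critical gain:

```python
# Toy rate network: dx/dt = -x + J*phi(x), with J_ij ~ N(0, g^2/N) and
# phi = tanh. The classic result (Sompolinsky, Crisanti & Sommers, 1988)
# is a stable fixed point for g < 1 and chaotic rate fluctuations for g > 1.
import numpy as np

def simulate(g, N=1000, T=200.0, dt=0.1, seed=0):
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # random couplings
    x = rng.normal(0.0, 1.0, size=N)                  # random initial state
    trace = np.empty(int(T / dt))
    for t in range(trace.size):
        x = x + dt * (-x + J @ np.tanh(x))            # Euler step
        trace[t] = x[0]                               # track one unit
    return trace

for g in (0.5, 1.5):
    late = simulate(g)[-500:]                         # discard the transient
    print(f"g = {g}: late-time std of one unit = {late.std():.3f}")
# Expect ~0 for g = 0.5 (stationary rates) and O(1) for g = 1.5 (chaos).
```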

Computation and dynamics in hierarchical neural networks

Many neuronal circuits in the brain (e.g., different sensory pathways, the hippocampus, and the cerebellum) are organized in a hierarchical fashion. However, the computational benefits of the layered structure are still not well defined, particularly when one considers that, given enough neurons, a single-layer architecture can act as a universal function approximator [Hornik, 1991].

We introduce a simple multilayer perceptron network, generally considered a plausible biological model. The input layer consists of high-dimensional manifolds, each representing fluctuations of a prototypical stimulus. The difficulty of decoding depends on the number and size of the manifolds. Neurons in the network are trained locally to perform binary classification with random labeling; namely, they fire in response to a unique subset of the prototypical stimuli and remain silent for all others. Using dynamic mean-field theory, we show that emergent correlations between different representations limit the decoding performance of shallow architectures. These correlations do not vanish as the width of the layer is increased.

In contrast, the layer-by-layer dynamics of a hierarchical structure can gradually remove correlations, allowing errorless decoding of the activities in the penultimate layer. For a given number of available neurons, we find the optimal depth that maximizes the capacity, defined by the size and number of manifolds that can be decoded. Importantly, the framework is learning-rule agnostic, and training can be implemented using a broad range of local plasticity schemes.
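
As a toy illustration of this layer-by-layer decorrelation (an assumed random-weight caricature, not the locally trained network studied here), one can propagate two correlated patterns through a deep random network and watch their overlap shrink with depth:

```python
# Two patterns with initial overlap q0 are passed through successive random
# sign-unit layers; for Gaussian weights the overlap follows the mean-field
# map q -> (2/pi)*arcsin(q), which decays toward 0 with depth.
import numpy as np

rng = np.random.default_rng(1)
N, depth, q0 = 2000, 8, 0.8

x = rng.standard_normal(N)
y = q0 * x + np.sqrt(1 - q0**2) * rng.standard_normal(N)  # correlated copy
a, b = np.sign(x), np.sign(y)

for layer in range(1, depth + 1):
    W = rng.standard_normal((N, N)) / np.sqrt(N)  # fresh random weights per layer
    a, b = np.sign(W @ a), np.sign(W @ b)
    print(f"layer {layer}: overlap = {np.mean(a * b):.3f}")
```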

  • Kadmon J, Sompolinsky H., Optimal Architectures in a Solvable Model of Deep Networks, NIPS, 2016.
  • Kadmon J, Sompolinsky H., Computation and dynamics in hierarchical neural networks, in prep.
  • Kadmon J, Sompolinsky H., Dynamical mean-field theory for feedforward networks: a path-integral approach, in prep.

Statistical mechanics of high-dimensional inference: low-rank tensor decomposition

Often, large, high-dimensional datasets collected across multiple modalities can be organized as a higher-order tensor. Low-rank tensor decomposition then arises as a powerful and widely used tool for discovering simple low-dimensional structures underlying such data. However, we currently lack a theoretical understanding of the algorithmic behavior of low-rank tensor decomposition. We derive Bayesian approximate message passing (AMP) algorithms for recovering arbitrarily shaped low-rank tensors buried within noise, and we employ dynamic mean-field theory to precisely characterize their performance. Our theory reveals the existence of phase transitions between easy, hard, and impossible inference regimes, and displays an excellent match with simulations. Moreover, it reveals several qualitative surprises compared to the behavior of symmetric, cubic tensor decomposition. Finally, we compare our AMP algorithm to the most commonly used algorithm, alternating least squares (ALS), and demonstrate that AMP significantly outperforms ALS in the presence of noise.
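
For concreteness, here is a minimal sketch of the ALS baseline on a planted rank-1 spike (the AMP algorithms and their mean-field analysis are considerably more involved; the signal strength and warm start below are illustrative choices, not the paper's setup):

```python
# Recover a planted rank-1 signal u (x) v (x) w from a noisy 3-way tensor
# T = lam * u (x) v (x) w + Z by alternating least squares: cycle through the
# factors, contracting T against the current estimates of the other two.
import numpy as np

rng = np.random.default_rng(2)
n, lam, iters = 50, 50.0, 30

def unit(x):
    return x / np.linalg.norm(x)

u, v, w = (unit(rng.standard_normal(n)) for _ in range(3))
T = lam * np.einsum('i,j,k->ijk', u, v, w) + rng.standard_normal((n, n, n))

# Warm start near the truth; from a purely random start, ALS can remain
# stuck in the noise at moderate signal strengths.
uh, vh, wh = (unit(t + rng.standard_normal(n) / np.sqrt(n)) for t in (u, v, w))

for _ in range(iters):
    uh = unit(np.einsum('ijk,j,k->i', T, vh, wh))
    vh = unit(np.einsum('ijk,i,k->j', T, uh, wh))
    wh = unit(np.einsum('ijk,i,j->k', T, uh, vh))

print(f"overlap |<u, u_hat>| = {abs(u @ uh):.3f}")  # ~1 means recovery
```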

Cortico-cerebellar dynamics in the learning and execution of a motor task

Classical theories posit that cerebellar granule cells massively outnumber their inputs in order to produce decorrelated, diverse, and generic expansions that facilitate arbitrary pattern separation by downstream Purkinje cells. The largest input to the cerebellum comes from the neocortex via the pons. Here we use simultaneous two-photon Ca2+ imaging of premotor layer 5 pyramidal cells and cerebellar granule cells during a motor planning task to examine information transfer from the neocortex to the cerebellum. Surprisingly, in expert mice, granule cell responses were highly similar to cortical output, with causal pontine contributions to the high correlations between layer 5 and granule cells. Ensemble activities of granule cells were both dense and redundant, with little evidence of expansion relative to layer 5. By contrast, early in learning, granule cell representations were more diverse and less correlated with cortical outputs. Response redundancy increased with task performance and produced much higher-fidelity encoding of cortical activity and of behavior. These data suggest a major extension of prevailing theories: rather than performing a generic expansion, cortico-cerebellar dynamics can adapt to learned tasks, providing more extensive and reliable encoding at the cost of reduced response diversity and input transformation.

  • M.J. Wagner, T.H. Kim, J. Kadmon, N.D. Nguyen, S. Ganguli, M.J. Schnitzer, L. Luo, Under review

Some older biophysics projects

[Figure: wave functions of the transverse modes.]

The physics behind honeycomb construction

How do hornets and honeybees build their hives? These structures exhibit remarkable symmetry and regularity that require great precision; moreover, some species construct their hives in total darkness! It was conjectured a century ago that the insects simply pour the wax, which then hardens into its most stable structure: a hexagonal lattice. But the most stable structure of poured wax would be a puddle on the floor! We show that by exploiting the acoustic modes of the structure, these insects can achieve near-perfect symmetry by tuning each cell to the echoes of a perturbing ultrasonic wave, much as a piano tuner uses a tuning fork.

Molecular dynamics of protein-substrate binding

The entry of a substrate into the active site is the first event in any enzymatic reaction. However, because of the short time interval between the encounter and the formation of the stable complex, the detailed steps are not experimentally observable. Molecular dynamics techniques allow an in silico simulation of the biophysical process, enabling 'observations' and 'measurements' that are impossible in vitro. In this project we studied the encounter between a palmitate molecule and the toad liver fatty acid-binding protein, ending with the formation of a stable complex whose structure resembles those of other proteins in this family. Solving the Poisson-Boltzmann equation, which couples the electric field to the distribution of molecules in the system, gives insight into the forces operating on the system and leading to the formation of the tight complex.
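
For reference, this is the standard textbook form of the Poisson-Boltzmann equation (not a project-specific derivation), with $\varepsilon(\mathbf{r})$ the dielectric profile, $\rho_f$ the fixed (protein) charge density, and mobile ionic species $i$ of charge $q_i$ and bulk concentration $c_i^0$:

$$\nabla \cdot \left[ \varepsilon(\mathbf{r}) \, \nabla \phi(\mathbf{r}) \right] = -\rho_f(\mathbf{r}) - \sum_i q_i \, c_i^0 \exp\!\left( -\frac{q_i \phi(\mathbf{r})}{k_B T} \right)$$

Linearizing the exponential for small potentials yields the Debye-Hückel equation $\nabla^2 \phi = \kappa^2 \phi$ away from the fixed charges, with $\kappa^{-1}$ the Debye screening length.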