Special Issue: Statistical Physics and Neuroscience

Many important technological developments in neuroscience are creating new challenges and opportunities for statistical physics. Examples include multiple recordings of spiking neurons, gene expression measurements and connectomic data. It therefore seems timely and appropriate to revitalize the interactions between the statistical physics and neuroscience communities. JSTAT would like to play a catalytic role in this bridge-building. The starting point is this special issue of the journal, which brings together new conceptual and experimental advances in a unified form of interest to both communities.

Articles on Statistical Physics and Neuroscience

Self-organized criticality in a network of interacting neurons

J D Cowan et al J. Stat. Mech. (2013) P04030

This paper contains an analysis of a simple neural network that exhibits self-organized criticality. Such criticality follows from combining a simple neural network that has an excitatory feedback loop generating bistability with an anti-Hebbian synapse in its input pathway. Using the methods of statistical field theory, we show how one can formulate the stochastic dynamics of such a network as a path integral, which we then investigate using renormalization group methods. The results indicate that the network exhibits hysteresis in switching back and forth between its two stable states, each of which loses its stability at a saddle–node bifurcation. The renormalization group analysis shows that the fluctuations in the neighborhood of such bifurcations have the signature of directed percolation. Thus, the network states undergo the neural analog of a phase transition in the universality class of directed percolation. The network replicates the behavior of the original sand-pile model of Bak, Tang and Wiesenfeld in that the fluctuations about the two states show power-law statistics.
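
For readers who want a concrete handle on the directed-percolation universality class invoked here, the following sketch simulates a critical branching process, whose avalanche-size distribution follows the mean-field power law P(s) ~ s^(-3/2). It illustrates the universality class, not the network model or the renormalization group calculation of the paper; the Poisson offspring rule and all parameter values are assumptions.

```python
# Minimal sketch: avalanche sizes of a critical branching process
# (branching ratio 1), whose size distribution follows the
# mean-field directed-percolation power law P(s) ~ s**(-3/2).
# This illustrates the universality class discussed in the paper,
# not the network model analyzed there.
import numpy as np

rng = np.random.default_rng(0)

def avalanche_size(branching_ratio=1.0, max_size=10**6):
    """Total number of activations triggered by one seed event."""
    active, size = 1, 1
    while active > 0 and size < max_size:
        # Each active unit triggers Poisson(branching_ratio) offspring.
        offspring = int(rng.poisson(branching_ratio, size=active).sum())
        size += offspring
        active = offspring
    return size

sizes = np.array([avalanche_size() for _ in range(20000)])

# Log-binned size distribution: at criticality the slope is near -3/2.
bins = np.logspace(0, 5, 25)
hist, edges = np.histogram(sizes, bins=bins, density=True)
centers = np.sqrt(edges[1:] * edges[:-1])
for c, dens in zip(centers, hist):
    if dens > 0:
        print(f"s ~ {c:9.1f}   P(s) = {dens:.3e}")
```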

A simple method for estimating the entropy of neural activity

Michael J Berry II et al J. Stat. Mech. (2013) P03015

The number of possible activity patterns in a population of neurons grows exponentially with the size of the population. For populations of more than 10 or 20 neurons, typical experiments explore only a tiny fraction of this large space of possible activity patterns. It is thus impossible, in this undersampled regime, to estimate the probabilities with which most of the activity patterns occur. As a result, the corresponding entropy, a measure of the computational power of the neural population, cannot be estimated directly. We propose a simple scheme for estimating the entropy in the undersampled regime, which bounds its value from below and above. The lower bound is the usual 'naive' entropy of the experimental frequencies. The upper bound results from a hybrid approximation of the entropy which makes use of the naive estimate, a maximum entropy fit, and a coverage adjustment. We apply the scheme to artificial data in order to check its accuracy, and compare its performance to that of several previously defined entropy estimators. We then apply it to actual measurements of neural activity in populations with up to 100 cells. Finally, we discuss the similarities and differences between the proposed estimation scheme and various earlier methods.
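
The lower bound is simple enough to sketch directly. In the snippet below, naive_entropy is the plug-in estimate from the observed pattern frequencies, and estimated_coverage is a Good–Turing-style estimate of the sampled probability mass, shown only to illustrate the idea of a coverage adjustment; the paper's hybrid upper bound also involves a maximum entropy fit, which is not reproduced. The toy raster and its parameters are illustrative assumptions.

```python
# Sketch of the 'naive' plug-in entropy (the lower bound) and a
# Good-Turing-style coverage estimate, illustrating the idea of a
# coverage adjustment.  The paper's hybrid upper bound also uses a
# maximum entropy fit, which is not reproduced here.
import numpy as np
from collections import Counter

def naive_entropy(patterns):
    """Plug-in entropy (bits) of the observed pattern frequencies."""
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def estimated_coverage(patterns):
    """Good-Turing estimate of the probability mass actually seen:
    1 - (# patterns observed exactly once) / (# samples)."""
    counts = Counter(patterns)
    singletons = sum(1 for c in counts.values() if c == 1)
    return 1.0 - singletons / len(patterns)

# Toy raster: 15 binary neurons firing sparsely and independently.
rng = np.random.default_rng(1)
spikes = rng.random((5000, 15)) < 0.05
patterns = [tuple(row) for row in spikes]

print(f"naive entropy (lower bound): {naive_entropy(patterns):.2f} bits")
print(f"estimated coverage:          {estimated_coverage(patterns):.3f}")
```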

Statistical mechanics of complex neural systems and high dimensional data

Madhu Advani et al J. Stat. Mech. (2013) P03014

Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
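
As a minimal, self-contained example of the message passing idea reviewed here, the snippet below runs belief propagation on a small Ising chain, where the algorithm is exact because the graph is a tree. The couplings and fields are arbitrary illustrative values, not drawn from the paper.

```python
# Toy illustration of message passing: belief propagation on an
# Ising chain, where the algorithm is exact because the graph is a
# tree.  Couplings and fields are arbitrary example values.
import numpy as np

N = 6
J = 0.8 * np.ones(N - 1)          # nearest-neighbour couplings
h = np.linspace(-0.5, 0.5, N)     # local fields
s = np.array([-1.0, 1.0])         # spin states

def pairwise(Jij):
    """Boltzmann factor exp(J * s_a * s_b) as a 2 x 2 matrix."""
    return np.exp(Jij * np.outer(s, s))

# Forward (left-to-right) and backward (right-to-left) messages.
fwd = [np.ones(2) for _ in range(N)]
bwd = [np.ones(2) for _ in range(N)]
for i in range(1, N):
    local = np.exp(h[i - 1] * s) * fwd[i - 1]
    fwd[i] = pairwise(J[i - 1]) @ local
for i in range(N - 2, -1, -1):
    local = np.exp(h[i + 1] * s) * bwd[i + 1]
    bwd[i] = pairwise(J[i]) @ local

# Exact single-site marginals from the product of incoming messages.
for i in range(N):
    belief = np.exp(h[i] * s) * fwd[i] * bwd[i]
    belief /= belief.sum()
    print(f"site {i}: P(s = +1) = {belief[1]:.3f}")
```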

Grid cells on the ball

Federico Stella et al J. Stat. Mech. (2013) P03013

What sort of grid cells do we expect to see in rodents that have spent their developmental period inside a large spherical cage? Or, in a different experimental paradigm, toddling on a revolving ball, with virtual reality simulating a coherently revolving surround? We consider a simple model of grid firing map formation, based on firing rate adaptation, which we have previously analyzed in a flat environment. The model predicts that, whether experienced from the outside or the inside, a spherical environment induces one of a succession of grid maps, realized as combinations of spherical harmonics, depending on the relation of the radius to the preferred grid spacing, itself set by the parameters of firing rate adaptation. Numerical simulations concur with the analytical predictions.
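
A rough numerical illustration of the predicted patterns (not the adaptation model itself) is to superpose a single degree-l multiplet of spherical harmonics into a rectified firing map, as below. The degree l = 6 and the mixing coefficients are arbitrary assumptions, and scipy.special.sph_harm supplies the harmonics.

```python
# Sketch: a firing-rate map on the sphere built from a single
# degree-l multiplet of spherical harmonics, the kind of pattern
# the adaptation model predicts.  The degree l = 6 and the mixing
# coefficients are arbitrary illustrative choices.
import numpy as np
from scipy.special import sph_harm

l = 6
rng = np.random.default_rng(2)
coeffs = rng.normal(size=2 * l + 1)            # arbitrary mixture

theta = np.linspace(0, 2 * np.pi, 120)         # azimuth
phi = np.linspace(0, np.pi, 60)                # colatitude
T, P = np.meshgrid(theta, phi)

rate = np.zeros_like(T)
for m, c in zip(range(-l, l + 1), coeffs):
    rate += c * np.real(sph_harm(m, l, T, P))

rate = np.maximum(rate, 0.0)                   # rectify to a firing rate
print(f"peak rate {rate.max():.2f}, mean rate {rate.mean():.2f}")
```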

Motif statistics and spike correlations in neuronal networks

Yu Hu et al J. Stat. Mech. (2013) P03012

Motifs are patterns of subgraphs within complex networks. We study the impact of such patterns of connectivity on the level of correlated, or synchronized, spiking activity among pairs of cells in a recurrent network of integrate-and-fire neurons. For a range of network architectures, we find that the pairwise correlation coefficients, averaged across the network, can be closely approximated using only three statistics of network connectivity. These are the overall network connection probability and the frequencies of two second-order motifs: diverging motifs, in which one cell provides input to two others, and chain motifs, in which two cells are connected via a third, intermediary cell. Specifically, a greater prevalence of diverging and chain motifs tends to increase correlation. Our method is based on linear response theory, which enables us to express spiking statistics using linear algebra, and on a resumming technique, which extrapolates from second-order motifs to predict the overall effect of coupling on network correlation. Our motif-based results thus isolate the effect of network architecture, treated perturbatively around a known network state.
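
The three connectivity statistics entering the approximation are easy to measure from an adjacency matrix. The sketch below uses one common convention, motif frequency in excess of the Erdős–Rényi expectation p^2; the paper's precise normalization may differ, and the random network is purely illustrative.

```python
# Sketch: the three connectivity statistics used by the
# approximation, measured from a binary adjacency matrix W with
# W[i, j] = 1 meaning j -> i.  The random network and the
# excess-over-Erdos-Renyi normalization are illustrative choices.
import numpy as np

rng = np.random.default_rng(3)
N, p_true = 200, 0.1
W = (rng.random((N, N)) < p_true).astype(float)
np.fill_diagonal(W, 0.0)

p = W.sum() / (N * (N - 1))                   # connection probability
n_triples = N * (N - 1) * (N - 2)             # ordered distinct triples

# Diverging motif: one source feeding two targets (i <- j -> k).
out_deg = W.sum(axis=0)
q_div = np.sum(out_deg * (out_deg - 1)) / n_triples - p**2

# Chain motif: two-step paths (i <- k <- j), counted via W @ W.
WW = W @ W
q_chain = (WW.sum() - np.trace(WW)) / n_triples - p**2

print(f"connection probability p = {p:.4f}")
print(f"diverging motif excess   = {q_div:+.2e}")
print(f"chain motif excess       = {q_chain:+.2e}")
```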

The simplest maximum entropy model for collective behavior in a neural network

Gašper Tkačik et al J. Stat. Mech. (2013) P03011

Recent work emphasizes that the maximum entropy principle provides a bridge between statistical mechanics models for collective behavior in neural networks and experiments on networks of real neurons. Most of this work has focused on capturing the measured correlations among pairs of neurons. Here we suggest an alternative, constructing models that are consistent with the distribution of global network activity, i.e. the probability that K out of N cells in the network generate action potentials in the same small time bin. The inverse problem that we need to solve in constructing the model is analytically tractable, and provides a natural 'thermodynamics' for the network in the limit of large N. We analyze the responses of neurons in a small patch of the retina to naturalistic stimuli, and find that the implied thermodynamics is very close to an unusual critical point, in which the entropy (in proper units) is exactly equal to the energy.
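
The tractability of the inverse problem can be made concrete: if the only constraint is the global activity distribution P(K), the effective energy of a pattern with K spikes follows directly as E(K) = -ln P(K) + ln C(N, K) up to an additive constant, while the entropy of words at fixed K is ln C(N, K). The sketch below computes both from a synthetic raster, an assumption standing in for the retinal data; entropy-versus-energy plots of this kind underlie the criticality statement.

```python
# Sketch of the inverse problem's tractability: if the only
# constraint is the global activity distribution P(K), the
# effective energy of a pattern with K spikes is
# E(K) = -ln P(K) + ln C(N, K) up to an additive constant, while
# the entropy of words at fixed K is ln C(N, K).  The raster is
# synthetic toy data standing in for the retinal recordings.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(4)
N, T = 40, 50000
spikes = rng.random((T, N)) < 0.03            # toy raster
K = spikes.sum(axis=1)

P_K = np.bincount(K, minlength=N + 1) / T
k = np.arange(N + 1)
log_binom = gammaln(N + 1) - gammaln(k + 1) - gammaln(N - k + 1)

with np.errstate(divide="ignore"):
    E = -np.log(P_K) + log_binom              # effective energy E(K)
S = log_binom                                 # entropy at fixed K

for i in range(6):
    print(f"K = {i}:  P(K) = {P_K[i]:.4f}   E(K) = {E[i]:+7.2f}   S(K) = {S[i]:6.2f}")
```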

Towards a self-consistent description of irregular and asynchronous cortical activity

Néstor Parga J. Stat. Mech. (2013) P03010

Experimental evidence shows that cortical activity exhibits correlated variability, often referred to as noise correlations. Reported correlation coefficients cover a wide range of values, from moderate to very small. There is an evident need for models and mathematical techniques to guide the interpretation of these results. However, the very existence of correlated variability is responsible for the technical difficulties that have prevented theory from making sufficient progress in determining how noise correlations are related to neuron and network properties. Here we review recent work in which we have developed a program to study these issues. Given that noise correlations depend on the behavioral state, understanding how they are generated is a critical problem that must be solved before biophysical models can be used to study behavioral tasks.

Dynamic state estimation based on Poisson spike trains—towards a theory of optimal encoding

Alex Susemihl et al J. Stat. Mech. (2013) P03009

Neurons in the nervous system convey information to higher brain regions by the generation of spike trains. An important question in the field of computational neuroscience is how these sensory neurons encode environmental information in a way that can be readily analyzed by subsequent systems. Many aspects of the form and function of the nervous system have been understood using the concepts of optimal population coding. Most studies, however, have neglected the aspect of temporal coding. Here we address this shortcoming through a filtering theory of inhomogeneous Poisson processes. We derive exact relations for the minimal mean squared error of the optimal Bayesian filter and, by optimizing the encoder, obtain optimal codes for populations of neurons. We also show that a class of non-Markovian, smooth stimuli are amenable to the same treatment, and provide results for the filtering and prediction error that hold for a general class of stochastic processes. This establishes a sound mathematical framework for a population coding theory that takes temporal aspects into account. It also formalizes a number of studies that discussed temporal aspects of coding using time-window paradigms, by restating them in terms of correlation times and firing rates. We propose that this kind of analysis allows for a systematic study of temporal coding and will bring further insights into the nature of the neural code.
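
A numerical companion to the filtering theory (a discretized sketch, not the paper's exact MMSE relations): an Ornstein–Uhlenbeck stimulus is observed through Poisson spiking with Gaussian tuning curves, and a grid-based Bayesian filter tracks the posterior. All tuning and process parameters are illustrative assumptions.

```python
# Discretized sketch of Bayesian filtering from Poisson spikes: an
# Ornstein-Uhlenbeck stimulus is observed through a population with
# Gaussian tuning curves, and a grid-based filter tracks the
# posterior.  All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
dt, T, tau, sigma = 0.01, 2000, 1.0, 1.0     # time step, steps, OU parameters
centers = np.linspace(-3, 3, 12)             # tuning-curve centres
width, r_max = 0.8, 20.0                     # tuning width, peak rate

def rates(x):
    """Population firing rates for stimulus value(s) x."""
    return r_max * np.exp(-(x - centers[:, None])**2 / (2 * width**2))

grid = np.linspace(-4, 4, 200)
post = np.exp(-grid**2 / 2)
post /= post.sum()

# Exact Gaussian OU transition kernel, discretized on the grid.
decay = np.exp(-dt / tau)
var = sigma**2 * tau / 2 * (1 - decay**2)
trans = np.exp(-(grid[:, None] - decay * grid[None, :])**2 / (2 * var))
trans /= trans.sum(axis=0)

x, sq_err = 0.0, []
for t in range(T):
    x = decay * x + np.sqrt(var) * rng.normal()          # true stimulus
    counts = rng.poisson(rates(np.array([x]))[:, 0] * dt)
    post = trans @ post                                  # predict
    loglik = counts @ np.log(rates(grid) * dt + 1e-12) \
             - rates(grid).sum(axis=0) * dt              # Poisson likelihood
    post *= np.exp(loglik - loglik.max())                # update
    post /= post.sum()
    sq_err.append((grid @ post - x)**2)                  # error of posterior mean

print(f"empirical mean squared error: {np.mean(sq_err):.4f}")
```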

Reconstruction of sparse connectivity in neural networks from spike train covariances

Volker Pernice and Stefan Rotter J. Stat. Mech. (2013) P03008

The inference of causation from correlation is in general highly problematic. Correspondingly, it is difficult to infer the existence of physical synaptic connections between neurons from correlations in their activity. Covariances in neural spike trains and their relation to network structure have been the subject of intense research, both experimentally and theoretically. The influence of recurrent connections on covariances can be characterized directly in linear models, where connectivity in the network is described by a matrix of linear coupling kernels. However, as indirect connections also give rise to covariances, the inverse problem of inferring network structure from covariances can generally not be solved unambiguously.

Here we study to what degree this ambiguity can be resolved if the sparseness of neural networks is taken into account. To reconstruct a sparse network, we determine the minimal set of linear couplings consistent with the measured covariances by minimizing the L1 norm of the coupling matrix under appropriate constraints. Contrary to intuition, after stochastic optimization of the coupling matrix, the resulting estimate of the underlying network is directed, despite the fact that a symmetric matrix of count covariances is used for inference.

The performance of the new method is best if connections are neither exceedingly sparse nor too dense, and it is easily applicable to networks of a few hundred nodes. Full coupling kernels can be obtained from the matrix of full covariance functions. We apply our method to networks of leaky integrate-and-fire neurons in an asynchronous–irregular state, where spike train covariances are well described by a linear model.
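
The generic L1 ingredient of the reconstruction can be sketched independently of the spiking model. Below, iterative soft-thresholding (ISTA) recovers a sparse coupling matrix from noisy linear measurements; the random linear operator is a stand-in assumption for the actual constraint linking couplings to spike train covariances, and the dimensions and penalty are illustrative.

```python
# Generic sketch of the L1 ingredient: iterative soft-thresholding
# (ISTA) recovers a sparse coupling matrix from noisy linear
# measurements.  The random operator A is a stand-in for the actual
# constraint linking couplings to spike train covariances; sizes,
# sparsity and penalty are illustrative.
import numpy as np

rng = np.random.default_rng(6)
N, density, lam = 30, 0.1, 0.05

W_true = (rng.random((N, N)) < density) * rng.normal(0.5, 0.1, (N, N))
np.fill_diagonal(W_true, 0.0)

A = rng.normal(size=(N, N)) / np.sqrt(N)      # stand-in linear operator
Y = A @ W_true + 0.01 * rng.normal(size=(N, N))

L = np.linalg.norm(A, 2)**2                   # Lipschitz constant of the gradient
W = np.zeros((N, N))
for _ in range(500):
    Z = W - A.T @ (A @ W - Y) / L             # gradient step
    W = np.sign(Z) * np.maximum(np.abs(Z) - lam / L, 0.0)  # soft threshold
np.fill_diagonal(W, 0.0)

agree = (np.abs(W) > 1e-3) == (np.abs(W_true) > 1e-3)
print(f"support recovered for {agree.mean() * 100:.1f}% of entries")
```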

Computational modelling of memory retention from synapse to behaviour

Mark C W van Rossum and Maria Shippi J. Stat. Mech. (2013) P03007

One of our most intriguing mental abilities is the capacity to store information and recall it from memory. Computational neuroscience has been influential in developing models and concepts of learning and memory. In this tutorial review we focus on the interplay between learning and forgetting. We discuss recent advances in the computational description of learning and forgetting processes at the synaptic, neuronal, and systems levels, as well as recent data that open up new challenges for statistical physicists.

Spatio-temporal spike train analysis for large scale networks using the maximum entropy principle and Monte Carlo method

Hassan Nasser et al J. Stat. Mech. (2013) P03006

Understanding the dynamics of neural networks is a major challenge in experimental neuroscience. For that purpose, a model of the recorded activity that reproduces the main statistics of the data is required. In the first part, we review recent results on spike train statistics analysis using maximum entropy models (MaxEnt). Most of these studies have focused on modelling synchronous spike patterns, leaving aside the temporal dynamics of the neural activity. However, the maximum entropy principle can be generalized to the temporal case, leading to Markovian models in which memory effects and time correlations in the dynamics are properly taken into account. In the second part, we present a new method based on Monte Carlo sampling that is suited to fitting large-scale spatio-temporal MaxEnt models. The formalism and tools presented here will be essential for fitting MaxEnt spatio-temporal models to large neural ensembles.
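
As a minimal example of the Monte Carlo ingredient, the snippet below Metropolis-samples a pairwise maximum entropy model over synchronous binary patterns; the spatio-temporal (Markovian) case treated in the paper is not reproduced, and the fields and couplings are arbitrary assumptions.

```python
# Minimal Monte Carlo ingredient: Metropolis sampling of a pairwise
# maximum entropy model over synchronous binary spike patterns.
# The spatio-temporal (Markovian) models of the paper are not
# reproduced; fields and couplings are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(7)
N = 20
h = rng.normal(-1.0, 0.3, N)                  # local fields
J = rng.normal(0.0, 0.1, (N, N))
J = (J + J.T) / 2                             # symmetric couplings
np.fill_diagonal(J, 0.0)

def delta_energy(s, i):
    """Energy change when flipping bit i of pattern s (0/1 units)."""
    ds = 1 - 2 * s[i]
    return -ds * (h[i] + J[i] @ s)

s = (rng.random(N) < 0.5).astype(float)
samples = []
for step in range(20000):
    i = rng.integers(N)
    dE = delta_energy(s, i)
    if dE < 0 or rng.random() < np.exp(-dE):  # Metropolis acceptance
        s[i] = 1 - s[i]
    if step % 20 == 0:
        samples.append(s.copy())

print("sampled firing probabilities:", np.round(np.mean(samples, axis=0), 2))
```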

The effect of nonstationarity on models inferred from neural data

Joanna Tyrcha et al J. Stat. Mech. (2013) P03005

Neurons subject to a common nonstationary input may exhibit correlated firing behavior. Correlations in the statistics of neural spike trains also arise as the effect of interaction between neurons. Here we show that these two situations can be distinguished with machine learning techniques, provided that the data are rich enough. In order to do this, we study the problem of inferring a kinetic Ising model, stationary or nonstationary, from the available data. We apply the inference procedure to two data sets: one from salamander retinal ganglion cells and the other from a realistic computational cortical network model. We show that many aspects of the concerted activity of the salamander retinal neurons can be traced simply to the external input. A model of non-interacting neurons subject to a nonstationary external field outperforms a model with stationary input with couplings between neurons, even accounting for the differences in the number of model parameters. When couplings are added to the nonstationary model, for the retinal data, little is gained: the inferred couplings are generally not significant. Likewise, the distribution of the sizes of sets of neurons that spike simultaneously and the frequency of spike patterns as a function of their rank (Zipf plots) are well explained by an independent-neuron model with time-dependent external input, and adding connections to such a model does not offer significant improvement. For the cortical model data, robust couplings, well correlated with the real connections, can be inferred using the nonstationary model. Adding connections to this model slightly improves the agreement with the data for the probability of synchronous spikes but hardly affects the Zipf plot.
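
The stationary half of the comparison is compact enough to sketch: for a kinetic Ising model the log-likelihood is that of a logistic regression per neuron, and gradient ascent recovers the couplings from simulated Glauber dynamics. Network size, coupling statistics and learning rate below are illustrative assumptions; the nonstationary variants with time-dependent fields are not shown.

```python
# Sketch of the stationary case: inferring a kinetic Ising model by
# gradient ascent on the exact log-likelihood (a logistic regression
# per neuron) from simulated Glauber dynamics.  Sizes, coupling
# statistics and learning rate are illustrative.
import numpy as np

rng = np.random.default_rng(8)
N, T, lr = 15, 20000, 0.1

J_true = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
h_true = rng.normal(0.0, 0.1, N)

# Parallel Glauber dynamics with spins in {-1, +1}.
S = np.empty((T, N))
S[0] = np.sign(rng.normal(size=N))
for t in range(T - 1):
    field = h_true + J_true @ S[t]
    S[t + 1] = np.where(rng.random(N) < 1 / (1 + np.exp(-2 * field)), 1.0, -1.0)

# Likelihood gradient: <(s_i(t+1) - tanh(field_i(t))) * s_j(t)>.
J, h = np.zeros((N, N)), np.zeros(N)
for _ in range(300):
    resid = S[1:] - np.tanh(h + S[:-1] @ J.T)
    J += lr * resid.T @ S[:-1] / (T - 1)
    h += lr * resid.mean(axis=0)

corr = np.corrcoef(J.ravel(), J_true.ravel())[0, 1]
print(f"correlation between inferred and true couplings: {corr:.3f}")
```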

Passage-time coding with a timing kernel inferred from irregular cortical spike sequences

Yasuhiro Tsubo et al J. Stat. Mech. (2013) P03004

In vivo cortical neurons exhibit highly irregular spike sequences. However, the computational implications of this irregular firing for neural coding are not yet fully understood. Recently, we formulated neuronal firing as a stochastic process that translates the firing rate determined by synaptic input into an irregular sequence of inter-spike intervals (ISIs). We previously determined that steady-state distributions of ISIs obey a power law in the majority of neurons recorded from the sensorimotor cortex of a rat performing a forelimb-movement task. This observation led us to a hypothesis for a neural code with irregular spiking, which we termed 'constrained maximization of firing-rate entropy' (CMFE). This hypothesis asserts that noisy neuronal activity maximizes steady-state firing-rate entropy under the joint constraints of energy consumption and uncertainty over output spike trains. CMFE previously assumed that the rate parameters of two successive ISIs are random and uncorrelated with each other. However, this is an oversimplification and is unrealistic. In addition, the CMFE hypothesis seems to contradict our perception that the firing-rate distribution depends on external stimuli. It is therefore necessary to explore a rate-coding scheme that resolves these issues. In this study, we review the concept of CMFE and extend it to incorporate the correlated nature of firing rate sequences in biological nervous systems. We introduce passage-time coding of stimuli, with the timing kernel inferred from irregular cortical spike sequences as in our previous study, and show how the timing kernel determines the amount of information in spike sequences.

Beyond mean field theory: statistical field theory for neural networks

Michael A Buice and Carson C Chow J. Stat. Mech. (2013) P03003

Mean field theories have been a stalwart for studying the dynamics of networks of coupled neurons. They are convenient because they are relatively simple and amenable to analysis. However, classical mean field theory neglects fluctuations and correlations arising from single-neuron effects. Here, we consider various possible approaches for going beyond mean field theory and incorporating correlation effects. Statistical field theory methods, in particular the Doi–Peliti–Janssen formalism, are particularly useful in this regard.

Ising models for neural activity inferred via selective cluster expansion: structural and coding properties

John Barton and Simona Cocco J. Stat. Mech. (2013) P03002

We describe the selective cluster expansion (SCE) of the entropy, a method for inferring an Ising model that describes the correlated activity of populations of neurons. We re-analyze data obtained from multielectrode recordings performed in vitro on the retina and in vivo on the prefrontal cortex. Recorded population sizes N range from N = 37 to 117 neurons. We compare the SCE method with the simplest mean field methods (corresponding to a Gaussian model) and with regularizations which favor sparse networks (L1 norm) or penalize large couplings (L2 norm). The networks of strongest interactions inferred via mean field methods generally agree with those obtained from the SCE. Reconstructing the sampled moments of the distributions, corresponding to neuron spiking frequencies and pairwise correlations, and predicting higher moments, including three-cell correlations and multi-neuron firing frequencies, are more difficult than determining the large-scale structure of the interaction network; apart from a cortical recording in which the measured correlation indices are small, these goals are achieved with the SCE but not with mean field approaches. We also find differences in the inferred structure of retinal and cortical networks: inferred interactions tend to be more irregular and sparse for cortical data than for retinal data. This result may reflect the structure of the recording. As a consequence, the SCE is more effective for retinal data when expanding the entropy with respect to a mean field reference, S − S_MF, while expansions of the entropy S alone perform better for cortical data.
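
The simplest mean field baseline mentioned here can be stated in two lines: in the Gaussian approximation the couplings are read off the inverse covariance matrix, J_ij = -(C^-1)_ij for i != j. The snippet below applies this to a synthetic raster, an assumption in place of the recordings, with a small ridge term added for numerical stability.

```python
# Sketch of the simplest mean field baseline: in the Gaussian
# approximation the couplings are read off the inverse covariance,
# J_ij = -(C^-1)_ij for i != j.  The raster is synthetic data
# standing in for the recordings; the ridge term is for stability.
import numpy as np

rng = np.random.default_rng(9)
N, T = 40, 100000
spikes = (rng.random((T, N)) < 0.1).astype(float)   # toy raster

C = np.cov(spikes.T)                                # sampled covariances
J_mf = -np.linalg.inv(C + 1e-6 * np.eye(N))         # mean field couplings
np.fill_diagonal(J_mf, 0.0)

print(f"largest inferred |J_ij|: {np.abs(J_mf).max():.3f}")
```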

The effects of noise on binocular rivalry waves: a stochastic neural field model

Matthew A Webber and Paul C Bressloff J. Stat. Mech. (2013) P03001

We analyze the effects of extrinsic noise on traveling waves of visual perception in a competitive neural field model of binocular rivalry. The model consists of two one-dimensional excitatory neural fields, whose activity variables represent the responses to left-eye and right-eye stimuli, respectively. The two networks mutually inhibit each other, and slow adaptation is incorporated into the model by taking the network connections to exhibit synaptic depression. We first show how, in the absence of any noise, the system supports a propagating composite wave consisting of an invading activity front in one network co-moving with a retreating front in the other network. Using a separation of time scales and perturbation methods previously developed for stochastic reaction–diffusion equations, we then show how extrinsic noise in the activity variables leads to a diffusion-like displacement (wandering) of the composite wave from its uniformly translating position at long time scales, and fluctuations in the wave profile around its instantaneous position at short time scales. We use our analysis to calculate the first-passage-time distribution for a stochastic rivalry wave to travel a fixed distance, which we find to be given by an inverse Gaussian. Finally, we investigate the effects of noise in the depression variables, which under an adiabatic approximation lead to quenched disorder in the neural fields during propagation of a wave.
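
The long-time wandering result admits a compact numerical check (a sketch under assumed parameters, not the paper's perturbation calculation): model the wave position as drift plus diffusion, X(t) = vt + sigma W(t), and compare the empirical first-passage statistics to the inverse Gaussian values, mean L/v and variance L sigma^2 / v^3.

```python
# Numerical check of the long-time wandering picture: the wave
# position is modelled as drift plus diffusion, X(t) = v t + sigma W(t),
# and its first-passage time to distance L is inverse Gaussian with
# mean L / v and variance L * sigma**2 / v**3.  Parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(10)
v, sigma, L, dt = 1.0, 0.5, 5.0, 1e-2

def passage_time():
    """Time for the drifting, diffusing position to first reach L."""
    x, t = 0.0, 0.0
    while x < L:
        x += v * dt + sigma * np.sqrt(dt) * rng.normal()
        t += dt
    return t

T_fp = np.array([passage_time() for _ in range(1000)])
print(f"mean     {T_fp.mean():.3f}  (theory {L / v:.3f})")
print(f"variance {T_fp.var():.3f}  (theory {L * sigma**2 / v**3:.3f})")
```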