Quantitative Biology - Neurons and Cognition Publications (50)


The brain is a paradigmatic example of a complex system, as its functionality emerges as a global property of local mesoscopic and microscopic interactions. Complex network theory allows one to elicit the functional architecture of the brain in terms of links (correlations) between nodes (grey matter regions) and to extract information from the noise. Here we present an analysis of functional magnetic resonance imaging data from forty healthy humans during the resting condition, investigating the basal scaffold of functional brain network organization. Read More
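
A minimal sketch of the network construction this entry describes, assuming synthetic data in place of the resting-state recordings: regional time series are correlated, the strongest correlations become links, and the result is analyzed as a graph. The atlas size, scan length, and threshold below are illustrative, not taken from the study.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_regions, n_timepoints = 90, 240                        # hypothetical atlas size / scan length
bold = rng.standard_normal((n_regions, n_timepoints))    # stand-in for real BOLD data

corr = np.corrcoef(bold)                                 # nodes = regions, links = correlations
np.fill_diagonal(corr, 0.0)
threshold = np.quantile(np.abs(corr), 0.95)              # keep only the strongest 5% of links
graph = nx.from_numpy_array((np.abs(corr) >= threshold).astype(int))
print(graph.number_of_edges(), nx.density(graph))
```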


Psychological traumas are the main cause of post-traumatic stress disorder, which can be either simple or complex. Psychological traumas of various kinds are also present in a wide range of psychological conditions, including disorganised attachment, personality disorders, eating disorders, bipolar disorder and schizophrenia. For such conditions, traumatic experiences are often regarded as an exacerbating factor of symptoms induced by another major causative agent, often of a genetic-biological nature. Read More


Computer vision has made remarkable progress in recent years. Deep neural network (DNN) models optimized to identify objects in images exhibit unprecedented task-trained accuracy and, remarkably, some generalization ability: new visual problems can now be solved more easily based on previous learning. Biological vision (learned in life and through evolution) is also accurate and general-purpose. Read More


The study of synchronization in populations of coupled biological oscillators is fundamental to many areas of biology, including neuroscience, cardiac dynamics, and circadian rhythms. Studying these systems may involve tracking the concentration of hundreds of variables in thousands of individual cells, resulting in an extremely high-dimensional description of the system. However, for many of these systems the behaviors of interest occur on a collective or macroscopic scale. Read More
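
One common way to illustrate the macroscopic view (not necessarily the reduction used in this work) is the Kuramoto order parameter: hundreds of phase oscillators summarized by a single coherence value r(t). The coupling strength and frequency spread below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, coupling, dt, steps = 500, 1.5, 0.01, 2000
omega = rng.normal(0.0, 0.5, n)              # natural frequencies
theta = rng.uniform(0, 2 * np.pi, n)         # initial phases

for _ in range(steps):
    z = np.mean(np.exp(1j * theta))          # complex order parameter
    r, psi = np.abs(z), np.angle(z)
    theta += dt * (omega + coupling * r * np.sin(psi - theta))

print(f"final collective coherence r = {np.abs(np.mean(np.exp(1j * theta))):.2f}")
```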


In real-world applications, observations are often constrained to a small fraction of a system. Such spatial subsampling can be caused by the inaccessibility or the sheer size of the system, and cannot be overcome by longer sampling. Spatial subsampling can strongly bias inferences about a system's aggregated properties. Read More


A recent Editorial by Slotnick (2017) reconsiders the findings of our paper on the accuracy of false positive rate control with cluster inference in fMRI (Eklund et al., 2016). In this commentary we respond to a number of misrepresentations of our work and discuss potential problems with Slotnick's own method. Read More


The retina is a complex neural system that encodes visual stimuli before higher-order processing occurs in the visual cortex. In this study we evaluated whether information about the stimuli received by the retina can be retrieved from the firing rate distribution of Retinal Ganglion Cells (RGCs), exploiting High-Density 64x64 MEA technology. To this end, we modeled the RGC population activity using mean-covariance Restricted Boltzmann Machines, latent variable models capable of learning the joint distribution of a set of continuous observed random variables and a set of binary unobserved random units. Read More


At the point of a second-order phase transition, also termed a critical point, systems display long-range order and their macroscopic behaviors are independent of the microscopic details making up the system. Due to these properties, it has long been speculated that biological systems showing similar behavior despite very different microscopic details may be operating near a critical point. Recent methods in neuroscience are making it possible to explore whether criticality exists in neural networks. Read More


Deep neural networks have been developed drawing inspiration from the brain's visual pathway, implementing an end-to-end approach: from image data to video object classes. However, building an fMRI decoder with the typical structure of a Convolutional Neural Network (CNN), i.e. Read More


Despite the popularity of the noninvasive, economical, comfortable, and easy-to-install photoplethysmography (PPG) sensor, a mathematically rigorous and stable algorithm to simultaneously extract fundamental physiological information, including the instantaneous heart rate (IHR) and the instantaneous respiratory rate (IRR), from the single-channel PPG signal is lacking. A novel signal processing technique, called the de-shape synchrosqueezing transform, is provided to tackle this challenging task. The algorithm is applied to two publicly available batch databases, one of which was collected during intense physical activity; for reproducibility, state-of-the-art results are obtained compared with existing reported outcomes. Read More


We show that macro-molecular self-assembly can recognize and classify high-dimensional patterns in the concentrations of $N$ distinct molecular species. Similar to associative neural networks, the recognition here leverages dynamical attractors to recognize and reconstruct partially corrupted patterns. Traditional parameters of pattern recognition theory, such as sparsity, fidelity, and capacity are related to physical parameters, such as nucleation barriers, interaction range, and non-equilibrium assembly forces. Read More
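
The associative-memory analogy invoked here can be made concrete with a textbook Hopfield network (a sketch of the analogy only, not the paper's self-assembly model): Hebbian weights store patterns as attractors, and iterating the dynamics reconstructs a partially corrupted pattern.

```python
import numpy as np

rng = np.random.default_rng(2)
n_units, n_patterns = 200, 5
patterns = rng.choice([-1, 1], size=(n_patterns, n_units))

weights = (patterns.T @ patterns) / n_units        # Hebbian learning rule
np.fill_diagonal(weights, 0.0)

state = patterns[0].copy()
flip = rng.choice(n_units, size=40, replace=False)
state[flip] *= -1                                  # corrupt 20% of the stored pattern

for _ in range(10):                                # attractor dynamics (synchronous updates)
    state = np.sign(weights @ state)
    state[state == 0] = 1

print("fraction of bits recovered:", np.mean(state == patterns[0]))
```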


We study the stochastic dynamics of strongly-coupled excitable elements on a tree network. The peripheral nodes receive independent random inputs which may induce large spiking events propagating through the branches of the tree and leading to global coherent oscillations in the network. This scenario may be relevant to action potential generation in certain sensory neurons, which possess myelinated distal dendritic tree-like arbors with excitable nodes of Ranvier at peripheral and branching nodes and exhibit noisy periodic sequences of action potentials. Read More


The control of brain dynamics provides great promise for the enhancement of cognitive function in humans, and by extension the betterment of their quality of life. Yet, successfully controlling dynamics in neural systems is particularly challenging, not least due to the immense complexity of the brain and the large set of interactions that can affect any single change. While we have gained some understanding of the control of single neurons, the control of large-scale neural systems (networks of multiply interacting components) remains poorly understood. Read More


Artificial intelligence research focuses to a great degree on the brain and the behaviors that the brain generates. But the brain, an extremely complex structure resulting from millions of years of evolution, can be viewed as a solution to problems posed by an environment existing in space and time. The environment generates signals that produce sensory events within an organism. Read More


Current theories hold that brain function is highly related to long-range physical connections through axonal bundles, namely extrinsic connectivity. However, obtaining a groupwise cortical parcellation based on extrinsic connectivity remains challenging. Current parcellation methods are computationally expensive, need tuning of several parameters, or rely on ad hoc constraints. Read More


Finding the common structural brain connectivity network for a given population is an open problem, crucial for current neuroscience. Recent evidence suggests there is a tightly connected network shared between humans. Obtaining this network will, among many advantages, allow us to focus cognitive and clinical analyses on common connections, thus increasing their statistical power. Read More


It has been demonstrated that the statistical power of many neuroscience studies is very low, so that the results are unlikely to be robustly reproducible. How are neuroscientists and the journals in which they publish responding to this problem? Here I review the sample size justifications provided for all 15 papers published in one recent issue of the leading journal Nature Neuroscience. Of these, only one claimed it was adequately powered. Read More
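
For context, a quick sketch of the kind of sample-size justification at issue, using the statsmodels power calculator: the number of subjects per group needed to detect a medium effect (Cohen's d = 0.5) in a two-sample t-test with 80% power. The effect size and error rates are illustrative defaults, not values from the reviewed papers.

```python
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"required sample size per group: {n_per_group:.0f}")   # roughly 64
```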


A complex system can be represented and analyzed as a network, where nodes represent the units of the network and edges represent connections between those units. For example, a brain network represents neurons as nodes and axons between neurons as edges. In many networks, some nodes have a disproportionately high number of edges. Read More
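
A tiny illustration of the point about disproportionately connected nodes, using a generic scale-free graph from networkx (chosen only because it produces hubs; it is not a model of any particular brain network):

```python
import networkx as nx

graph = nx.barabasi_albert_graph(n=1000, m=2, seed=0)      # heavy-tailed degree distribution
degrees = sorted((d for _, d in graph.degree()), reverse=True)
print("top 5 node degrees (hubs):", degrees[:5])
print("median degree:", degrees[len(degrees) // 2])
```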


Network analysis in nervous system disorders involves constructing and analyzing anatomical and functional brain networks from neuroimaging data to describe and predict the clinical syndromes that result from neuropathology. A network view of neurological disease and clinical syndromes facilitates accurate quantitative characterizations and mathematical models of complex nervous system disorders with relatively simple tools drawn from the field of graph theory. Networks are predominantly constructed from in vivo data acquired using physiological and neuroimaging techniques at the macroscale of nervous system organization. Read More


Deep neural networks (DNN) trained in a supervised way suffer from two known problems. First, the minima of the objective function used in learning correspond to data points (also known as rubbish examples or fooling images) that lack semantic similarity with the training data. Second, a clean input can be changed by a small, and often imperceptible for human vision, perturbation, so that the resulting deformed input is misclassified by the network. Read More


Much of the information the brain processes and stores is temporal in nature - a spoken word or a handwritten signature is defined as much by how it unfolds in time as by its spatial structure at any given moment in time. It remains unclear how neural circuits encode such patterns. We show that the same recurrent neural network model can simultaneously encode time-varying sensory and motor patterns as continuous neural trajectories. Read More


Recurrent networks of dynamic elements frequently exhibit emergent collective oscillations, which can display substantial regularity even when the individual elements are considerably noisy. How noise-induced dynamics at the local level coexists with regular oscillations at the global level is still unclear. Here we show that a combination of stochastic recurrence-based initiation with deterministic refractoriness in an excitable network can reconcile these two features, leading to maximum collective coherence for an intermediate noise level. Read More


First-passage time problems are ubiquitous across many fields of study including transport processes in semiconductors and biological synapses, evolutionary game theory and percolation. Despite their prominence, first-passage time calculations have proven to be particularly challenging. Analytical results to date have often been obtained under strong conditions, leaving most of the exploration of first-passage time problems to direct numerical computations. Read More
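
A direct numerical computation of the kind this entry contrasts with analytical results, sketched for a drifted Brownian motion hitting a single threshold (illustrative parameters; the estimated mean first-passage time should approach threshold/drift):

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials, dt, drift, sigma, threshold = 10000, 0.001, 0.5, 1.0, 1.0

x = np.zeros(n_trials)              # particle positions
t = np.zeros(n_trials)              # accumulated times
alive = np.ones(n_trials, dtype=bool)
while alive.any():
    x[alive] += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(alive.sum())
    t[alive] += dt
    alive &= x < threshold          # stop trials that have crossed the threshold

print(f"mean first-passage time ~ {t.mean():.2f} (theory: {threshold / drift:.2f})")
```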


Simultaneous recordings from N electrodes generate N-dimensional time series that call for efficient representations to expose relevant aspects of the underlying dynamics. Binning the time series defines neural activity vectors that populate the N-dimensional space as a density distribution, which is especially informative when the neural dynamics performs a noisy path through metastable states (often a case of interest in neuroscience); this makes clustering in the N-dimensional space a natural choice. We apply a variant of the 'mean-shift' algorithm to perform such clustering, and validate it on a Hopfield network in the glassy phase, in which metastable states are uncorrelated with memory attractors. Read More
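
The clustering step can be sketched with scikit-learn's standard mean-shift (the paper uses its own variant): binned activity vectors are treated as points in N-dimensional space and density peaks stand in for metastable states. The surrogate data below, built around three hypothetical firing patterns, replaces real recordings.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

rng = np.random.default_rng(4)
states = rng.uniform(0, 10, size=(3, 20))                        # 3 "metastable" patterns, 20 electrodes
labels = rng.integers(0, 3, size=600)
activity = states[labels] + rng.normal(0, 0.5, size=(600, 20))   # noisy binned activity vectors

bandwidth = estimate_bandwidth(activity, quantile=0.2)
ms = MeanShift(bandwidth=bandwidth).fit(activity)
print("clusters found:", len(np.unique(ms.labels_)))
```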


Alzheimer's disease (AD) causes alterations of brain network structure and function. The latter consists of connectivity changes between oscillatory processes at different frequency channels. We proposed a multi-layer network approach to analyze multiple-frequency brain networks inferred from magnetoencephalographic recordings during resting-states in AD subjects and age-matched controls. Read More


Functions of brain areas in complex animals are believed to rely on the dynamics of networks of neurons rather than on single neurons. On the other hand, the network dynamics reflect and arise from the integration and coordination of the activity of populations of single neurons. Understanding how single-neuron and neural-circuit dynamics complement each other to produce brain functions is thus of paramount importance. Read More


Williams and Beer (2010) proposed a nonnegative mutual information decomposition, based on the construction of information gain lattices, which allows separating the information that a set of variables contains about another into components interpretable as the unique information of one variable, or as redundancy and synergy components. In this work we extend the framework of Williams and Beer (2010), focusing on the lattices that underpin the decomposition. We generalize the type of constructible lattices and examine the relations between the terms in different lattices, for example relating bivariate and trivariate decompositions. Read More
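
A hedged sketch of the redundancy measure I_min from Williams and Beer (2010), evaluated on a toy XOR distribution where the two sources carry no redundant information about the target (the interaction is purely synergistic); the helper names below are ours, not from either paper.

```python
from collections import defaultdict
from math import log2

# joint distribution p(a1, a2, s) for s = a1 XOR a2, source bits equiprobable
p = defaultdict(float)
for a1 in (0, 1):
    for a2 in (0, 1):
        p[(a1, a2, a1 ^ a2)] = 0.25

def marginal(dist, keep):
    out = defaultdict(float)
    for key, prob in dist.items():
        out[tuple(key[i] for i in keep)] += prob
    return out

p_s = marginal(p, [2])

def specific_info(source_idx, s):
    """I_spec(S=s; A) = sum_a p(a|s) * log2( p(s|a) / p(s) )."""
    p_as = marginal(p, [source_idx, 2])
    p_a = marginal(p, [source_idx])
    total = 0.0
    for (a, s2), pas in p_as.items():
        if s2 != s or pas == 0.0:
            continue
        total += (pas / p_s[(s,)]) * log2((pas / p_a[(a,)]) / p_s[(s,)])
    return total

i_min = sum(p_s[(s,)] * min(specific_info(0, s), specific_info(1, s)) for (s,) in p_s)
print(f"redundant information I_min = {i_min:.3f} bits")   # 0 bits for XOR
```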


Discussions of the hippocampus often focus on place cells, but many neurons are not place cells in any given environment. Here we describe the collective activity in such mixed populations, treating place and non-place cells on the same footing. We start with optical imaging experiments on CA1 in mice as they run along a virtual linear track, and use maximum entropy methods to approximate the distribution of patterns of activity in the population, matching the correlations between pairs of cells but otherwise assuming as little structure as possible. Read More
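
A minimal version of the maximum entropy step, on a small surrogate population where exact enumeration of all activity patterns is feasible: fields and pairwise couplings are adjusted by gradient ascent until the model matches the data's means and pairwise correlations. This is a generic pairwise-model sketch, not the paper's CA1 analysis pipeline.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(5)
n_cells, n_samples = 8, 5000
data = (rng.random((n_samples, n_cells)) < 0.2).astype(float)   # surrogate binary activity

target_mean = data.mean(axis=0)
target_corr = (data.T @ data) / n_samples

states = np.array(list(product([0.0, 1.0], repeat=n_cells)))    # all 2^8 activity patterns
h = np.zeros(n_cells)                                           # per-cell fields
J = np.zeros((n_cells, n_cells))                                # pairwise couplings

for _ in range(2000):
    energies = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
    prob = np.exp(energies - energies.max())
    prob /= prob.sum()
    model_mean = prob @ states
    model_corr = np.einsum('s,si,sj->ij', prob, states, states)
    h += 0.1 * (target_mean - model_mean)                       # match firing rates
    J += 0.1 * (target_corr - model_corr)                       # match pairwise correlations
    np.fill_diagonal(J, 0.0)

print("largest remaining moment mismatch:", np.abs(model_corr - target_corr).max())
```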


Classification of human behavior is key to developing closed-loop Deep Brain Stimulation (DBS) systems, which may be able to decrease the power consumption and side effects of the existing systems. Recent studies have shown that the Local Field Potential (LFP) signals from both Subthalamic Nuclei (STN) of the brain can be used to recognize human behavior. Since the DBS leads implanted in each STN can collect three bipolar signals, the selection of a suitable pair of LFPs that achieves optimal recognition performance is still an open problem to address. Read More


Discourse varies with age, education, psychiatric state and historical epoch, but the ontogenetic and cultural dynamics of discourse structure remain to be quantitatively characterized. To this end we investigated word graphs obtained from verbal reports of 200 subjects ages 2-58, and 676 literary texts spanning ~5,000 years. In healthy subjects, lexical diversity, graph size, and long-range recurrence departed from initial near-random levels through a monotonic asymptotic increase across ages, while short-range recurrence showed a corresponding decrease. Read More
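
A small sketch of the word-graph representation described here, using networkx: word types become nodes, consecutive words become directed edges, and simple measures (graph size, lexical diversity, short loops as a crude recurrence proxy) summarize the structure. The sentence below is a made-up stand-in for a verbal report.

```python
import networkx as nx

report = ("the dog ran to the park and the dog saw a bird "
          "and the bird flew away and the dog ran home").split()

graph = nx.DiGraph()
graph.add_edges_from(zip(report[:-1], report[1:]))      # edge for each consecutive word pair

lexical_diversity = len(set(report)) / len(report)
short_loops = sum(1 for cycle in nx.simple_cycles(graph) if len(cycle) <= 3)
print("tokens:", len(report), "| nodes (graph size):", graph.number_of_nodes())
print("edges:", graph.number_of_edges(), "| lexical diversity: %.2f" % lexical_diversity)
print("short-range recurrence (loops of length <= 3):", short_loops)
```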


A brain-computer interface (BCI) is a system that aims to establish a non-muscular communication path for subjects suffering from a neurodegenerative disease. Many BCI systems make use of the phenomena of event-related synchronization and de-synchronization of brain waves as a main feature for classification of different cognitive tasks. However, the temporal dynamics of the electroencephalographic (EEG) signals contain additional information that can be incorporated into the inference engine in order to improve the performance of the BCIs. Read More


The theoretical basis for the neuronal coding associated with short-term degradation in synaptic transmission is a matter of debate in the literature. In fact, electrophysiological signals are characterized as inversely proportional to stimulus intensity. Among theoretical descriptions for this phenomenon, models based on $1/f$-dependency are employed to investigate the biophysical properties of short-term synaptic depression. Read More
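
The inverse dependence on stimulus intensity can be illustrated with a standard short-term depression resource model (in the spirit of Tsodyks-Markram, used here only as a generic reference model with illustrative parameters): at high presynaptic rates the per-spike response falls off roughly as 1/rate.

```python
import numpy as np

def steady_state_amplitude(rate_hz, u=0.5, tau_rec=0.8, n_spikes=200):
    """Per-spike synaptic amplitude after many spikes at a fixed presynaptic rate."""
    x, dt = 1.0, 1.0 / rate_hz                    # x = available synaptic resources
    for _ in range(n_spikes):
        amplitude = u * x                         # response to the current spike
        x *= 1 - u                                # resources consumed by the spike
        x = 1 + (x - 1) * np.exp(-dt / tau_rec)   # recovery until the next spike
    return amplitude

for rate in (1, 2, 5, 10, 20, 50):
    amp = steady_state_amplitude(rate)
    print(f"{rate:3d} Hz -> amplitude {amp:.3f}, amplitude * rate {amp * rate:.2f}")
# At high rates the amplitude scales roughly as 1/rate, so amplitude * rate saturates.
```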


Numerous measurements in the brain of the impedance between two extracellular electrodes have shown that it is approximately resistive in the range of biological interest, $<10\,$kHz, and has a value close to that expected from the conductivity of physiological saline and the extracellular volume fraction in brain tissue. Recent work from the group of Claude Bédard and Alain Destexhe has claimed that the impedance of the extracellular space is some three orders of magnitude greater than these values and also displays a $1/\sqrt{f}$ frequency dependence (above a low-frequency corner frequency). Their measurements were performed between an intracellular electrode and an extracellular electrode. Read More


Multivariate Pattern (MVP) classification holds enormous potential for decoding visual stimuli in the human brain by employing task-based fMRI data sets. There is a wide range of challenges in the MVP techniques, i.e. Read More


Neuroengineering is faced with unique challenges in repairing or replacing complex neural systems that are composed of many interacting parts. These interactions form intricate patterns over large spatiotemporal scales, and produce emergent behaviors that are difficult to predict from individual elements. Network science provides a particularly appropriate framework in which to study and intervene in such systems, by treating neural elements (cells, volumes) as nodes in a graph and neural interactions (synapses, white matter tracts) as edges in that graph. Read More


We investigate the behavior of a model neuron that receives a biophysically-realistic noisy post-synaptic current based on uncorrelated spiking activity from a large number of afferents. We show that, with static synapses, such noise can give rise to inverse stochastic resonance (ISR) as a function of the presynaptic firing rate. We compare this to the case with dynamic synapses that feature short-term synaptic plasticity, and show that the interval of presynaptic firing rate over which ISR exists can be extended or diminished. Read More


The computational properties of neural systems are often thought to be implemented in terms of their network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit (MSU) recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a state space representation of the dynamics, but would wish to have access to its governing equations for in-depth analysis. Read More


Brain development during adolescence is marked by substantial changes in brain structure and function, leading to a stable network topology in adulthood. However, most prior work has examined the data through the lens of brain areas connected to one another in large-scale functional networks. Here, we apply a recently-developed hypergraph approach that treats network connections (edges) rather than brain regions as the unit of interest, allowing us to describe functional network topology from a fundamentally different perspective. Read More


According to the theory of efficient coding, sensory systems are adapted to represent natural scenes with high fidelity and at minimal metabolic cost. Testing this hypothesis for sensory structures performing non-linear computations on high dimensional stimuli is still an open challenge. Here we develop a method to characterize the sensitivity of the retinal network to perturbations of a stimulus. Read More


We revisit the problem of deriving the mean-field values of avalanche critical exponents in systems with absorbing states. These are well known to coincide with those of an unbiased branching process. Here, we show that for at least 4 different universality classes (directed percolation, dynamical percolation, the voter model or compact directed percolation class, and the Manna class of stochastic sandpiles) this common result can be obtained by mapping the corresponding Langevin equations describing each of these classes into a random walker confined close to the origin by a logarithmic potential. Read More
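
The mean-field result referred to here can be checked numerically with an unbiased (critical) branching process, whose avalanche sizes follow P(S) ~ S^{-3/2}, i.e. P(S >= s) decays roughly as s^{-1/2}. The Poisson offspring distribution and sample counts below are an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(6)

def avalanche_size(branching_ratio=1.0, max_size=10**5):
    size, active = 1, 1
    while active and size < max_size:
        active = rng.poisson(branching_ratio * active)   # offspring of all currently active units
        size += active
    return size

sizes = np.array([avalanche_size() for _ in range(20000)])
for s in (1, 4, 16, 64, 256):
    print(f"P(S >= {s:3d}) = {np.mean(sizes >= s):.4f}   (~ s^-1/2 scaling: {s ** -0.5:.4f})")
```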


How much information do large brain networks integrate as a whole over the sum of their parts? Can the dynamical complexity of such networks be globally quantified in an information-theoretic way and be meaningfully coupled to brain function? Recently, measures of dynamical complexity such as integrated information have been proposed. However, problems related to the normalization and the Bell number of partitions associated with these measures make these approaches computationally infeasible for large-scale brain networks. Our goal in this work is to address this problem. Read More


Deep convolutional neural networks (CNNs) trained for object classification have a number of striking similarities with the primate ventral visual stream. In particular, activity in early, intermediate, and late layers is closely related to activity in V1, V4, and the inferotemporal cortex (IT). This study further compares activity in late layers of object-classification CNNs to activity patterns reported in the IT electrophysiology literature. Read More


Oscillators coupled in a network can synchronize with each other to yield a coherent population rhythm. If multiple such networks are coupled together, the question arises whether these rhythms will synchronize. We investigate the impact of noise on this synchronization for strong inhibitory pulse-coupling and find that increasing the noise can synchronize the population rhythms, even if the noisy inputs to different oscillators are completely uncorrelated. Read More


Previous research has shown positive correlations between EEG alpha activity and performing creative tasks. In this study, expert classical musicians (n=4) were asked to play their instrument while being monitored with a wireless EEG headset. Data were collected during two rehearsal types: (a) in their regular, fixed ensemble; (b) in an improvised, mixed ensemble with unfamiliar musicians and less rehearsal time. Read More


A major area in neuroscience is the study of how the brain processes spatial information. Neurons in the brain represent external stimuli via neural codes. These codes often arise from regions of space called receptive fields: each neuron fires at a high rate precisely when the animal is in the corresponding receptive field. Read More
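
A toy example of the receptive field codes described here, with hypothetical place fields as intervals on a linear track: each position yields the binary codeword of the neurons whose fields cover it, and the set of codewords is the combinatorial neural code.

```python
import numpy as np

fields = {                       # hypothetical receptive fields on a track [0, 1]
    "cell_1": (0.0, 0.4),
    "cell_2": (0.3, 0.7),
    "cell_3": (0.6, 1.0),
}

def codeword(position):
    """Which cells fire (1) or stay silent (0) at this position."""
    return tuple(int(lo <= position <= hi) for lo, hi in fields.values())

code = {codeword(x) for x in np.linspace(0, 1, 1001)}
print(sorted(code))              # five codewords appear: 100, 110, 010, 011, 001
```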


The primate visual system has an exquisite ability to discriminate partially occluded shapes. Recent electrophysiological recordings suggest that response dynamics in intermediate visual cortical area V4, shaped by feedback from prefrontal cortex (PFC), may play a key role. To probe the algorithms that may underlie these findings, we build and test a model of V4 and PFC interactions based on a hierarchical predictive coding framework. Read More


We consider the model of interacting neurons proposed in [8] and use the statistical selection procedure proposed in [4] to infer the interaction graph from observed neural activity (spike trains) of the first olfactory relay of an insect. We propose a pruning method to deal with small sample sizes. We use simulations to test the efficiency of the estimator and to fix the parameters involved in its construction. Read More


The accurate diagnosis and assessment of neurodegenerative disease and traumatic brain injuries (TBI) remain open challenges. Both cause cognitive and functional deficits due to focal axonal swellings (FAS), but it is difficult to deliver a prognosis due to our limited ability to assess damaged neurons at a cellular level in vivo. We simulate the effects of neurodegenerative disease and TBI using convolutional neural networks (CNNs) as our model of cognition. Read More


Voxel-based lesion-symptom mapping (VLSM) is an important method for basic and translational human neuroscience research. VLSM leverages modern neuroimaging analysis techniques to build on the classic approach of examining the relationship between location of brain damage and cognitive deficits. Testing an association between deficit severity and lesion status in each voxel involves very many individual tests and requires statistical correction for multiple comparisons. Read More
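
The statistical core of the voxelwise testing problem can be sketched as follows, on synthetic data: one two-sample test per voxel comparing deficit severity in lesioned versus spared patients, followed by a multiple-comparison correction (Benjamini-Hochberg FDR is used here as one common choice; it is not necessarily the correction evaluated in the paper).

```python
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(7)
n_patients, n_voxels = 80, 2000
lesion = rng.random((n_patients, n_voxels)) < 0.3        # synthetic binary lesion maps
deficit = rng.normal(size=n_patients)
deficit += 1.5 * lesion[:, 0]                            # only voxel 0 truly relates to the deficit

p_values = np.array([
    ttest_ind(deficit[lesion[:, v]], deficit[~lesion[:, v]]).pvalue
    for v in range(n_voxels)
])
reject, _, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
print("voxels significant after correction:", np.flatnonzero(reject))
```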