Quantitative Biology - Neurons and Cognition Publications (50)


In voxel-based neuroimage analysis, lesion features have been the main focus in disease prediction due to their interpretability with respect to the related diseases. However, we observe that another type of feature is introduced during the preprocessing steps, which we call "Procedural Bias". Moreover, such bias can be leveraged to improve classification accuracy. Read More


Recurrently coupled networks of inhibitory neurons robustly generate oscillations in the gamma band. Nonetheless, the corresponding Wilson-Cowan type firing rate equation for such an inhibitory population does not generate such oscillations without an explicit time delay. We show that this discrepancy is due to a voltage-dependent spike-synchronization mechanism inherent in networks of spiking neurons which is not captured by standard firing rate equations. Read More
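
As a rough illustration of the point about delays, the sketch below simulates a generic Wilson-Cowan style rate equation for a self-inhibiting population, tau dr/dt = -r + f(I - w r(t - d)). It is not the authors' spiking-network model, and the gain function and parameter values are illustrative assumptions; it simply shows that the rate description settles to a fixed point for a negligible delay but oscillates once an explicit delay d is large enough.

```python
# Minimal sketch (not the authors' model): a Wilson-Cowan style rate equation
# for a self-inhibiting population, tau*dr/dt = -r + f(I_ext - w*r(t - d)).
# With a negligible delay the rate settles to a fixed point; a sufficiently
# long delay destabilizes the fixed point and produces oscillations.
import numpy as np

def simulate(delay_ms, w=20.0, I_ext=10.0, tau=5.0, dt=0.1, T=500.0):
    f = lambda x: 1.0 / (1.0 + np.exp(-x))           # sigmoidal gain (assumed)
    n_steps = int(T / dt)
    n_delay = max(int(delay_ms / dt), 1)
    r = np.zeros(n_steps)
    r[:n_delay] = 0.1                                # constant initial history
    for t in range(n_delay, n_steps):
        drive = I_ext - w * r[t - n_delay]           # delayed self-inhibition
        r[t] = r[t - 1] + dt / tau * (-r[t - 1] + f(drive))
    return r

for d in (0.1, 3.0):                                 # short vs. long delay (ms)
    rate = simulate(d)
    print(f"delay {d} ms: rate std over last 100 ms = {np.std(rate[-1000:]):.4f}")
```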


Reinforcement learning (RL) has recently regained popularity, with major achievements such as beating the European champion at the game of Go. Here, for the first time, we show that RL can be used efficiently to train a spiking neural network (SNN) to perform object recognition in natural images without using an external classifier. We used a feedforward convolutional SNN and a temporal coding scheme where the most strongly activated neurons fire first, while less activated ones fire later, or not at all. Read More
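
The temporal coding scheme described here (more strongly activated neurons fire earlier, weakly activated ones not at all) can be illustrated with a toy intensity-to-latency mapping. The threshold and the linear latency map below are assumptions made for illustration, not the paper's exact encoding.

```python
# Illustrative sketch of intensity-to-latency coding: stronger activations
# spike earlier, activations below threshold never spike. This mirrors the
# coding idea in the abstract but is not the paper's exact scheme.
import numpy as np

def activations_to_spike_times(a, t_max=100.0, theta=0.05):
    """Map activations in [0, 1] to spike times; below-threshold units stay silent."""
    a = np.asarray(a, dtype=float)
    times = np.full(a.shape, np.inf)              # inf = never fires
    fires = a > theta
    times[fires] = t_max * (1.0 - a[fires])       # linear intensity-to-latency map
    return times

acts = np.array([0.9, 0.4, 0.02, 0.7])
print(activations_to_spike_times(acts))           # strongest unit fires first
```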


The 'free energy principle' (FEP) has been suggested to provide a unified theory of the brain, integrating data and theory relating to action, perception, and learning. The theory and implementation of the FEP combines insights from Helmholtzian 'perception as inference', machine learning theory, and statistical thermodynamics. Here, we provide a detailed mathematical evaluation of a suggested biologically plausible implementation of the FEP that has been widely used to develop the theory. Read More


To understand how neurons and nervous systems first evolved, we need an account of the origins of neural elongations: Why did neural elongations (axons and dendrites) first originate, such that they could become the central component of both neurons and nervous systems? Two contrasting conceptual accounts provide different answers to this question. Braitenberg's vehicles provide the iconic illustration of the dominant input-output (IO) view. Here the basic role of neural elongations is to connect sensors to effectors, both situated at different positions within the body. Read More


Connectivity patterns of relevance in neuroscience and systems biology can be encoded in hierarchical modular networks (HMNs). Moreover, recent studies highlight the role of hierarchical modular organization in shaping brain activity patterns, providing an excellent substrate to promote both the segregation and integration of neural information. Here we propose an extensive numerical analysis of the critical spreading rate (or "epidemic" threshold) --separating a phase with endemic persistent activity from one in which activity ceases-- on diverse HMNs. Read More


Cromwell's rule (also known as the zero priors paradox) refers to the constraint of classical probability theory that if one assigns a prior probability of 0 or 1 to a hypothesis, then the posterior has to be 0 or 1 as well (this is a straightforward implication of how Bayes's rule works). Relatedly, hypotheses with a very low prior cannot be updated to have a very high posterior without a tremendous amount of new evidence to support them (or to make other possibilities highly improbable). Cromwell's rule appears at odds with our intuition of how humans update probabilities. Read More
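
A one-line Bayes update makes the zero-priors point concrete: with a prior of exactly 0 (or 1) the posterior cannot move, whereas even a tiny nonzero prior can still be updated. The likelihood values below are arbitrary illustrative numbers.

```python
# Worked example of Cromwell's rule: with Bayes' rule,
# P(H|D) = P(D|H) P(H) / (P(D|H) P(H) + P(D|~H) P(~H)),
# a prior of exactly 0 (or 1) is unmovable no matter how strong the evidence.
def posterior(prior, like_h, like_not_h):
    num = like_h * prior
    den = num + like_not_h * (1.0 - prior)
    return num / den if den > 0 else prior

print(posterior(0.0,  0.99, 0.01))   # prior 0 -> posterior stays 0
print(posterior(1e-6, 0.99, 0.01))   # tiny but nonzero prior can still be updated
```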


Understanding how the brain learns to compute functions reliably, efficiently and robustly with noisy spiking activity is a fundamental challenge in neuroscience. Most sensory and motor tasks can be described as dynamical systems and could presumably be learned by adjusting connection weights in a recurrent biological neural network. However, this is greatly complicated by the credit assignment problem for learning in recurrent networks, e. Read More


Understanding how recurrent neural circuits can learn to implement dynamical systems is a fundamental challenge in neuroscience. The credit assignment problem, i.e. Read More


Neural time-series data contain a wide variety of prototypical signal waveforms (atoms) that are of significant importance in clinical and cognitive research. One of the goals for analyzing such data is hence to extract such 'shift-invariant' atoms. Even though some success has been reported with existing algorithms, they are limited in applicability due to their heuristic nature. Read More


Avalanches with power-law distributed size parameters have been observed in neuronal networks. This observation might be a manifestation of self-organized criticality (SOC). Yet, the physiological mechanism of this behavior is currently unknown. Read More
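
For readers unfamiliar with how such power-law size distributions are typically quantified, the sketch below draws synthetic "avalanche sizes" from a power law and recovers the exponent with the standard continuous maximum-likelihood estimator; it is a generic illustration, not the analysis used in the paper.

```python
# Sketch: generate synthetic power-law "avalanche sizes" by inverse-transform
# sampling and recover the exponent with the standard maximum-likelihood
# estimator alpha_hat = 1 + n / sum(log(s / s_min)). Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
alpha_true, s_min, n = 1.5, 1.0, 50_000
sizes = s_min * (1.0 - rng.random(n)) ** (-1.0 / (alpha_true - 1.0))

alpha_hat = 1.0 + n / np.sum(np.log(sizes / s_min))
print(f"true exponent {alpha_true}, estimated {alpha_hat:.3f}")
```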


Cortical microcircuits are very complex networks, but they are composed of a relatively small number of stereotypical motifs. Hence one strategy for throwing light on the computational function of cortical microcircuits is to analyze emergent computational properties of these stereotypical microcircuit motifs. We address here the question of how spike-timing-dependent plasticity (STDP) shapes the computational properties of one motif that has frequently been studied experimentally: interconnected populations of pyramidal cells and parvalbumin-positive inhibitory cells in layer 2/3. Read More


A fundamental aspect of the limitations on learning computations in neural architectures is the characterization of their optimal capacities. An important, widely used neural architecture is the autoencoder, in which the network reconstructs the input at the output layer via a representation at a hidden layer. Even though the capacities of several neural architectures have been addressed using statistical physics methods, the capacity of autoencoder neural networks is not well explored. Read More


We consider the problem of finding the spectrum of an operator taking the form of a low-rank (rank one or two) non-normal perturbation of a self-adjoint operator, motivated by a number of problems of applied interest which take this form. We use the fact that the system is a low rank perturbation of a symmetric problem, together with a simple idea of classical differential geometry (the envelope of a family of curves) to completely analyze the spectrum of this type of operator. We use these techniques to analyze three problems of this form: a model of the oculomotor integrator due to Anastasio and Gad (2007), a continuum integrator model, and a nonlocal model of phase separation due to Rubinstein and Sternberg (1992). Read More
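
A quick numerical illustration of the setting (not the envelope construction used in the paper): perturbing a symmetric matrix by a rank-one non-normal term can push eigenvalues off the real axis, which is exactly the structure the abstract refers to. The matrix size and perturbation strength below are arbitrary.

```python
# Minimal numerical illustration (not the paper's envelope construction):
# spectrum of a symmetric matrix A versus A + u v^T, a rank-one non-normal
# perturbation. The symmetric spectrum is real; the perturbed one need not be.
import numpy as np

rng = np.random.default_rng(1)
n = 200
M = rng.standard_normal((n, n))
A = (M + M.T) / 2.0                       # self-adjoint part
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))
B = A + 2.0 * u @ v.T                     # rank-one non-normal perturbation

print("A      : max |Im(lambda)| =", np.abs(np.linalg.eigvals(A).imag).max())
print("A+uv^T : max |Im(lambda)| =", np.abs(np.linalg.eigvals(B).imag).max())
```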


A key problem with neuroprostheses and brain monitoring interfaces is that they need extreme energy efficiency. One way of lowering energy is to use the low power modes available on the processors embedded in these devices. We present a technique to predict when neuronal activity of interest is likely to occur, so that the processor can run at nominal operating frequency at those times, and be placed in low power modes otherwise. Read More


Here, we present a novel approach to solve the problem of reconstructing perceived stimuli from brain responses by combining probabilistic inference with deep learning. Our approach first inverts the linear transformation from latent features to brain responses with maximum a posteriori estimation and then inverts the nonlinear transformation from perceived stimuli to latent features with adversarial training of convolutional neural networks. We test our approach with a functional magnetic resonance imaging experiment and show that it can generate state-of-the-art reconstructions of perceived faces from brain activations. Read More
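
The first stage described above, inverting a linear latent-to-response mapping with MAP estimation, reduces to a ridge-style closed form under a Gaussian prior and Gaussian noise. The sketch below shows that step on synthetic data; the dimensions, the mapping B, and the regularization value are illustrative assumptions, and the adversarially trained decoder stage is not shown.

```python
# Minimal sketch of the linear-inversion step described in the abstract:
# with responses y = B z + noise and a Gaussian prior on the latent features z,
# the MAP estimate takes the ridge-style closed form
#   z_hat = (B^T B + lambda I)^{-1} B^T y.
# B, lambda, and the dimensions below are illustrative, not the paper's values.
import numpy as np

rng = np.random.default_rng(2)
n_voxels, n_latent = 500, 64
B = rng.standard_normal((n_voxels, n_latent))     # latent -> response mapping
z_true = rng.standard_normal(n_latent)
y = B @ z_true + 0.5 * rng.standard_normal(n_voxels)

lam = 1.0                                          # noise variance / prior variance
z_map = np.linalg.solve(B.T @ B + lam * np.eye(n_latent), B.T @ y)
print("correlation(z_true, z_map) =", np.corrcoef(z_true, z_map)[0, 1].round(3))
```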


Collective behavior among coupled dynamical units can emerge in various forms as a result of different coupling topologies as well as different types of coupling functions. Chimera states have recently received ample attention as a fascinating manifestation of collective behavior, in particular describing a symmetry breaking spatiotemporal pattern where synchronized and desynchronized states coexist in a network of coupled oscillators. In this perspective, we review the emergence of different chimera states, focusing on the effects of different coupling topologies that describe the interaction network connecting the oscillators. Read More


Theoretical arguments and empirical evidence in neuroscience suggest that organisms represent or model their environment by minimizing a variational free-energy bound on the surprise associated with sensory signals from the environment. In this paper, we study phase transitions in coupled dissipative dynamical systems (complex Ginzburg-Landau equations) under a variety of coupling conditions to model the exchange of a system (agent) with its environment. We show that arbitrary coupling between sensory signals and the internal state of a system -- or those between its action and external (environmental) states -- do not guarantee synchronous dynamics between external and internal states: the spatial structure and the temporal dynamics of sensory signals and action (that comprise the system's Markov blanket) have to be pruned to produce synchrony. Read More


In this paper, we develop an approach to modeling high-dimensional networks with a large number of nodes arranged in a hierarchical and modular structure. We propose a novel multi-scale factor analysis (MSFA) model which partitions the massive spatio-temporal data defined over the complex networks into a finite set of regional clusters. To achieve further dimension reduction, we represent the signals in each cluster by a small number of latent factors. Read More


Neuroimaging data can be represented as networks of nodes and edges that capture the topological organization of the brain connectivity. Graph theory provides a general and powerful framework to study these networks and their structure at various scales. By way of example, community detection methods have been widely applied to investigate the modular structure of many natural networks, including brain functional connectivity networks. Read More
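
As a minimal example of the community-detection step mentioned here, the sketch below builds a toy "functional connectivity" matrix with three planted modules, thresholds it into a graph, and recovers modules by greedy modularity maximization with networkx. The data, threshold, and algorithm choice are assumptions made for illustration, not the methods of any particular study above.

```python
# Sketch of community detection on a stand-in functional network: correlate
# synthetic regional time series with three planted modules, threshold the
# absolute correlations into a binary graph, and extract modules.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(3)
n_rois, n_t = 90, 200
block = np.repeat(np.arange(3), 30)                        # 3 planted modules of 30 regions
shared = rng.standard_normal((3, n_t))
ts = shared[block] + rng.standard_normal((n_rois, n_t))    # regional time series
C = np.corrcoef(ts)                                        # functional connectivity matrix
np.fill_diagonal(C, 0.0)

G = nx.from_numpy_array((np.abs(C) > 0.3).astype(int))     # threshold into a binary graph
modules = greedy_modularity_communities(G)
print("number of modules found:", len(modules))            # should recover ~3
```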


Neurotransmitter receptor molecules, concentrated in synaptic membrane domains along with scaffolds and other kinds of proteins, are crucial for signal transmission across chemical synapses. In common with other membrane protein domains, synaptic domains are characterized by low protein copy numbers and protein crowding, with rapid stochastic turnover of individual molecules. We study here in detail a stochastic lattice model of the receptor-scaffold reaction-diffusion dynamics at synaptic domains that was found previously to capture, at the mean-field level, the self-assembly, stability, and characteristic size of synaptic domains observed in experiments. Read More


In this work we challenge the main conclusions of Gu et al. (Controllability of structural brain networks. Nature Communications 6, 8414, doi:10.1038/ncomms9414, 2015) on brain controllability. Read More


In a published paper (Sengupta 2016), we have proposed that the brain (and other self-organized biological and artificial systems) can be characterized via the mathematical apparatus of a gauge theory. The picture that emerges from this approach suggests that any biological system (from a neuron to an organism) can be cast as resolving uncertainty about its external milieu, either by changing its internal states or its relationship to the environment. Using formal arguments, we have shown that a gauge theory for neuronal dynamics -- based on approximate Bayesian inference -- has the potential to shed new light on phenomena that have thus far eluded a formal description, such as attention and the link between action and perception. Read More


The accelerated path of technological development, particularly at the interface between hardware and biology, has been suggested as evidence for future major technological breakthroughs associated with our potential to overcome biological constraints. This includes the potential of becoming immortal, having expanded cognitive capacities thanks to hardware implants, or the creation of intelligent machines. Here I argue that several relevant evolutionary and structural constraints might prevent achieving most (if not all) of these innovations. Read More


Bifurcation theory is a powerful tool for studying how the dynamics of a neural network model depends on its underlying neurophysiological parameters. However, bifurcation theory has been developed mostly for smooth dynamical systems and for continuous-time non-smooth models, which prevents us from understanding the changes of dynamics in some widely used classes of artificial neural network models. This article is an attempt to fill this gap, through the introduction of algorithms that perform a semi-analytical bifurcation analysis of a spin-glass-like neural network model with binary firing rates and discrete-time evolution. Read More


A fundamental goal in network neuroscience is to understand how activity in one region drives activity elsewhere, a process referred to as effective connectivity. Here we propose to model this causal interaction using integro-differential equations and causal kernels that allow for a rich analysis of effective connectivity. The approach combines the tractability and flexibility of autoregressive modeling with the biophysical interpretability of dynamic causal modeling. Read More


In a spiking neural network (SNN), individual neurons operate autonomously and only communicate with other neurons sparingly and asynchronously via spike signals. These characteristics render a massively parallel hardware implementation of an SNN a potentially powerful computer, albeit a non von Neumann one. But can one guarantee that an SNN computer solves some important problems reliably? In this paper, we formulate a mathematical model of one SNN that can be configured for a sparse coding problem for feature extraction. Read More


Dynamic balance in human locomotion can be assessed through the local dynamic stability (LDS) method. Whereas gait LDS has been used successfully in many settings and applications, little is known about its sensitivity to individual characteristics of healthy adults. Therefore, we reanalyzed a large dataset of accelerometric data measured for 100 healthy adults from 20 to 70 years of age performing 10 min. Read More


In this study we describe a methodology to realize visual image cognition, in a broader sense, by cross-modal stimulation through the auditory channel. An original algorithm of conversion from bi-dimensional images to sounds has been established and tested on several subjects. Our results show that subjects were able to discriminate, with a precision of 95%, different sounds corresponding to different test geometric shapes. Read More


The human brain processes a wide variety of inputs and does so either consciously or subconsciously. According to the Global Workspace theory, conscious processing involves broadcasting of information to several regions of the brain, whereas subconscious processing involves more localized processing. This theoretical paper aims to expand on some aspects of the Global Workspace theory: how the properties of incoming information result in it being processed subconsciously or consciously; why processing can be either sustained or short-lived; and how the Global Workspace theory may apply both to real-time sensory input and to internally retained information. Read More


The purpose of this paper is to provide a brief overview of nine assessments of face processing skills. These tests have been used commonly in recent years to gauge the skills of prospective 'super-recognisers' with respect to the general population. In the literature, a person has been considered to be a 'super-recogniser' based on superior scores on one or more of these tests (cf. Read More


Stochastic resonance (SR) is a phenomenon in which noise enhances the response of a system to an input signal. The brain is an example of a system that has to detect and transmit signals in a noisy environment, suggesting that it is a good candidate to take advantage of SR. In this work, we aim to identify the optimal levels of noise that promote signal transmission through a simple network model of the human brain. Read More
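
A minimal sketch of stochastic resonance itself, independent of the brain-network model studied here: a subthreshold sinusoid plus noise drives a simple threshold detector, and an intermediate noise level maximizes how well the output tracks the signal. All values below are illustrative.

```python
# Minimal stochastic-resonance sketch (not the paper's network model):
# a subthreshold sinusoid plus noise drives a threshold detector; an
# intermediate noise level best lets the output follow the signal.
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(0, 200, 0.01)
signal = 0.8 * np.sin(2 * np.pi * 0.1 * t)        # subthreshold: threshold is 1.0

for sigma in (0.05, 0.4, 3.0):                    # weak, intermediate, strong noise
    out = (signal + sigma * rng.standard_normal(t.size) > 1.0).astype(float)
    corr = np.corrcoef(signal, out)[0, 1] if out.std() > 0 else 0.0
    print(f"noise sigma {sigma:4.2f}: signal-output correlation = {corr:.3f}")
```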


Computational models in fields such as computational neuroscience are often evaluated via stochastic simulation or numerical approximation. Fitting these models implies a difficult optimization problem over complex, possibly noisy parameter landscapes. Bayesian optimization (BO) has been successfully applied to solving expensive black-box problems in engineering and machine learning. Read More
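
For readers new to BO, the sketch below runs a bare-bones loop (Gaussian-process surrogate plus expected improvement) on a noisy one-dimensional toy objective. It illustrates the general idea only; the surrogate, acquisition function, and settings are assumptions and not the authors' specific method.

```python
# Bare-bones Bayesian-optimization loop (GP surrogate + expected improvement)
# on a noisy 1-D toy objective to be minimized. Illustrative only.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(5)
objective = lambda x: np.sin(3 * x) + 0.1 * rng.standard_normal(np.shape(x))

grid = np.linspace(0.0, 2.0 * np.pi, 400).reshape(-1, 1)   # candidate points
X = rng.uniform(0.0, 2.0 * np.pi, size=(4, 1))             # initial design
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-2, normalize_y=True)
for _ in range(15):
    gp.fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sd, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)       # expected improvement
    x_next = grid[np.argmax(ei)].reshape(1, 1)              # next point to evaluate
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best parameter found:", round(X[np.argmin(y), 0], 3),
      "objective value:", round(y.min(), 3))
```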


The human brain displays a complex network topology, whose structural organization is widely studied using diffusion tensor imaging. The original geometry from which the network topology emerges is known, as is the localization of the network nodes with respect to the brain morphology and anatomy. One of the most challenging problems of current network science is to infer the latent geometry from the mere topology of a complex network. Read More


Fixed point networks are dynamic networks that encode stimuli via distinct output patterns. Although such networks are omnipresent in neural systems, their structures are typically unknown or poorly characterized. It is therefore valuable to use a supervised approach for resolving how a network encodes distinct inputs of interest, and the superposition of those inputs from sampled multiple node time series. Read More


Despite the relative simplicity of C. elegans, its locomotion machinery is not yet well understood. We focus on the generation of dorsoventral body bends. Read More


The Network of Noisy Leaky Integrate and Fire (NNLIF) model describes the behavior of a neural network at mesoscopic level. It is one of the simplest self-contained mean-field models considered for that purpose. Even so, a lot of questions remain open today. Read More


We propose a novel discrete model of central pattern generators (CPG), neuronal ensembles generating rhythmic activity. The model emphasizes the role of nonsynaptic interactions and the diversity of electrical properties in nervous systems. Neurons in the model release different neurotransmitters into the shared extracellular space (ECS) so each neuron with the appropriate set of receptors can receive signals from other neurons. Read More


May 2017
Affiliations: Department of Computer Science and Department of Neuroscience and Biomedical Engineering, School of Science, Aalto University, Espoo, Finland

Graph-theoretical methods have rapidly become a standard tool in studies of the structure and function of the human brain. Whereas the structural connectome can be fairly straightforwardly mapped onto a complex network, there are more degrees of freedom in constructing networks that represent functional connections between brain areas. For fMRI data, such networks are typically built by aggregating the BOLD signal time series of voxels into larger entities (such as Regions of Interest in some brain atlas), and determining the connection strengths between these from some measure of time-series correlations. Read More
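
The pipeline sketched in this abstract (aggregate voxel time series into regions, then correlate) can be written in a few lines. The random "BOLD" data and atlas labels below are stand-ins, and simple averaging plus Pearson correlation is only one of the many choices the authors discuss.

```python
# Sketch of the pipeline the abstract describes: aggregate voxel BOLD time
# series into ROIs (here by simple averaging over an atlas labeling) and take
# pairwise Pearson correlations as functional connection strengths.
import numpy as np

rng = np.random.default_rng(6)
n_voxels, n_timepoints, n_rois = 5000, 240, 90
bold = rng.standard_normal((n_voxels, n_timepoints))      # voxel x time (stand-in)
atlas = rng.integers(0, n_rois, size=n_voxels)            # ROI label per voxel (stand-in)

roi_ts = np.stack([bold[atlas == r].mean(axis=0) for r in range(n_rois)])
fc = np.corrcoef(roi_ts)                                   # ROI x ROI functional network
print("functional connectivity matrix:", fc.shape)
```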


Although the brain has long been considered a potential inspiration for future computing, Moore's Law - the scaling property that has seen revolutions in technologies ranging from supercomputers to smart phones - has largely been driven by advances in materials science. As the ability to miniaturize transistors is coming to an end, there is increasing attention on new approaches to computation, including renewed enthusiasm around the potential of neural computation. Recent advances in neurotechnologies, many of which have been aided by computing's rapid progression over recent decades, are now reigniting this opportunity to bring neural computation insights into broader computing applications. Read More


Nature is abundant in oscillatory activity, with oscillators that have the remarkable ability of synchronizing to external events. Using electrocorticographic (ECoG) recordings from a subject rhythmically producing consonant-vowel syllables (CVSs), we show that neural oscillators recorded at individual ECoG electrodes become precisely synchronized to initiations of the production of CVSs (i.e. Read More


Neurons and networks in the cerebral cortex must operate reliably despite multiple sources of noise. To evaluate the impact of both input and output noise, we determine the robustness of single-neuron stimulus selective responses, as well as the robustness of attractor states of networks of neurons performing memory tasks. We find that robustness to output noise requires synaptic connections to be in a balanced regime in which excitation and inhibition are strong and largely cancel each other. Read More


The study of action selection in humans can present challenges of task design since our actions are usually defined by many degrees of freedom and therefore occupy a large action-space. While saccadic eye-movement offers a more constrained paradigm for investigating action selection, the study of reach-and-grasp in upper limbs has often been defined by more complex scenarios, not easily interpretable in terms of such selection. Here we present a novel motor behaviour task which addresses this by limiting the action space to a single degree of freedom in which subjects have to track (using a stylus) a vertical coloured target line displayed on a tablet computer, whilst ignoring a similarly oriented distractor line in a different colour. Read More


Cognitive arithmetic studies the mental processes used in solving math problems. This area of research explores the retrieval mechanisms and strategies used by people during a common cognitive task. Past research has shown that human performance in arithmetic operations is correlated with the numerical size of the problem. Read More


Objective: The coupling between neuronal populations and its magnitude have been shown to be informative for various clinical applications. One method to estimate brain connectivity is with electroencephalography (EEG) from which the cross-spectrum between different sensor locations is derived. We wish to test the efficacy of tensor factorisation in the estimation of brain connectivity. Read More
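
As a small illustration of the cross-spectrum estimation mentioned here (the tensor factorisation step is not shown), two synthetic "sensor" signals sharing a 10 Hz component yield a cross-spectral peak at that frequency. The sampling rate and signal construction below are illustrative assumptions.

```python
# Sketch of a cross-spectrum between two sensor signals via Welch's method.
# Two synthetic "sensors" share a 10 Hz component, which appears as a peak
# in the magnitude of the cross-spectrum.
import numpy as np
from scipy.signal import csd

fs = 250.0                                      # sampling rate (Hz), illustrative
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(7)
shared = np.sin(2 * np.pi * 10 * t)
x = shared + rng.standard_normal(t.size)
y = 0.8 * shared + rng.standard_normal(t.size)

f, Pxy = csd(x, y, fs=fs, nperseg=1024)
print("peak of |cross-spectrum| at %.1f Hz" % f[np.argmax(np.abs(Pxy))])
```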


What are the roles of central and peripheral vision in human scene recognition? Larson and Loschky (2009) showed that peripheral vision contributes more than central vision in obtaining maximum scene recognition accuracy. However, central vision is more efficient for scene recognition than peripheral, based on the amount of visual area needed for accurate recognition. In this study, we model and explain the results of Larson and Loschky (2009) using a neurocomputational modeling approach. Read More


In spite of many years of research, the mechanism of avian magnetoreception remains a mystery due to its seemingly insurmountable intricacies. Recently Xie and colleagues proposed that IscA1 can act as a protein biocompass due to its measured intrinsic ferromagnetism, and thus named it MagR. However, Meister's calculations showed that the interaction energy of the magnetic moment of IscA1 with Earth's magnetic field is five orders of magnitude smaller than thermal fluctuation at room temperature. Read More


The distribution of acetylcholinesterase (AChE) activity in the septum of the telencephalon in man was studied in 15 human brains using the acetylthiocholine method. The highest activity of AChE was found in the nucleus of the diagonal band and the nucleus accumbens, and the lowest in the lateral nucleus. Comparison of histochemical results with cellular structure and wi. Read More


Experience of time is one of the primordial human experiences which is deeply tied to human consciousness. But despite this intimate relation of time with human conscious experience, time has proved to be very elusive. Particularly in physics, though there is already some understanding of time, there are still so many paradoxes that plague this understanding. Read More


The present study reports interesting findings regarding emotional-arousal-based activities while listening to two Hindustani classical ragas of contrasting emotion. EEG data were taken from 5 naive listeners while they listened to two ragas, Bahar and Mia ki Malhar, which are conventionally known to portray contrasting emotions. The EEG data were analyzed with the help of two robust non-linear tools, viz. Read More