Frontiers in Computational Neuroscience

https://read.qxmd.com/read/30733673/modeling-emotions-associated-with-novelty-at-variable-uncertainty-levels-a-bayesian-approach
#1
Hideyoshi Yanagisawa, Oto Kawamata, Kazutaka Ueda
Acceptance of novelty depends on the receiver's emotional state. This paper proposes a novel mathematical model for predicting emotions elicited by the novelty of an event under different conditions. It models two emotion dimensions, arousal and valence, and considers different uncertainty levels. A state transition from before experiencing an event to afterwards is assumed, and a Bayesian model estimates a posterior distribution as being proportional to the product of a prior distribution and a likelihood function...
2019: Frontiers in Computational Neuroscience
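The Bayesian update described in entry #1 can be illustrated with a minimal sketch: a Gaussian prior (whose variance stands in for the uncertainty level) is combined with a Gaussian likelihood, so the posterior is proportional to their product. The KL divergence printed at the end is only an assumed proxy for novelty-driven arousal, not the authors' model, and all numbers are illustrative.

    import numpy as np

    def gaussian_posterior(mu_prior, var_prior, obs, var_lik):
        """Posterior mean and variance for a Gaussian prior and likelihood."""
        var_post = 1.0 / (1.0 / var_prior + 1.0 / var_lik)
        mu_post = var_post * (mu_prior / var_prior + obs / var_lik)
        return mu_post, var_post

    def kl_gaussian(mu1, var1, mu2, var2):
        """KL divergence KL(N1 || N2) between two univariate Gaussians."""
        return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

    mu0, obs, var_lik = 0.0, 2.0, 1.0            # prior mean, observed event, sensory noise
    for var_prior in (0.5, 1.0, 4.0):            # low to high prior uncertainty
        mu_p, var_p = gaussian_posterior(mu0, var_prior, obs, var_lik)
        arousal_proxy = kl_gaussian(mu_p, var_p, mu0, var_prior)  # assumed arousal proxy
        print(f"prior var={var_prior:.1f}  posterior=N({mu_p:.2f}, {var_p:.2f})  "
              f"KL proxy={arousal_proxy:.3f}")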
https://read.qxmd.com/read/30760994/a-simplified-model-of-communication-between-time-cells-accounting-for-the-linearly-increasing-timing-imprecision
#2
Mustafa Zeki, Fuat Balcı
Many organisms can flexibly time intervals with high accuracy on average but with substantial variability between trials. One of the core psychophysical features of interval timing functions relates to the signatures of this timing variability; for a given individual, the standard deviation of timed responses/time estimates is nearly proportional to their central tendency (scalar property). Many studies have aimed at elucidating the neural basis of interval timing based on neurocomputational principles in a fashion that would explain the scalar property...
2018: Frontiers in Computational Neuroscience
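The scalar property mentioned in entry #2 is easy to visualize: if the standard deviation of timed responses is proportional to the target interval, the coefficient of variation stays constant across intervals. The sketch below simulates this with an arbitrary Weber-like fraction; it only illustrates the property, not the paper's model.

    import numpy as np

    rng = np.random.default_rng(0)
    cv = 0.15                                    # assumed Weber-like fraction
    for target in (2.0, 4.0, 8.0):               # target intervals in seconds
        responses = rng.normal(target, cv * target, size=10_000)
        print(f"target={target:.1f}s  mean={responses.mean():.2f}  "
              f"sd={responses.std():.2f}  cv={responses.std() / responses.mean():.3f}")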
https://read.qxmd.com/read/30745868/balanced-active-core-in-heterogeneous-neuronal-networks
#3
Qing-Long L Gu, Songting Li, Wei P Dai, Douglas Zhou, David Cai
It is hypothesized that cortical neuronal circuits operate in a global balanced state, i.e., the majority of neurons fire irregularly by receiving balanced inputs of excitation and inhibition. Meanwhile, it has been observed in experiments that sensory information is often sparsely encoded by only a small set of firing neurons, while neurons in the rest of the network are silent. The phenomenon of sparse coding challenges the hypothesis of a global balanced state in the brain. To reconcile this, here we address the issue of whether a balanced state can exist in a small number of firing neurons by taking into account the heterogeneity of network structure, such as scale-free and small-world networks...
2018: Frontiers in Computational Neuroscience
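Entry #3 contrasts balanced dynamics on heterogeneous topologies such as scale-free and small-world networks. The snippet below only generates the two topologies with networkx and compares their degree heterogeneity; it is a starting point, not the balanced-network simulation itself, and the sizes and parameters are arbitrary.

    import networkx as nx
    import numpy as np

    sf = nx.barabasi_albert_graph(1000, 5, seed=0)          # scale-free
    sw = nx.watts_strogatz_graph(1000, 10, 0.1, seed=0)     # small-world

    for name, g in (("scale-free", sf), ("small-world", sw)):
        deg = np.array([d for _, d in g.degree()])
        print(f"{name:12s} mean degree={deg.mean():.1f}  max degree={deg.max()}")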
https://read.qxmd.com/read/30723401/a-general-model-of-ion-passive-transmembrane-transport-based-on-ionic-concentration
#4
Vincent Qiqian Wang, Shenquan Liu
Current mainstream neural computing is based on the electrical model proposed by Hodgkin and Huxley in 1952, the core of which is ion passive transmembrane transport controlled by ion channels. However, studies on the evolutionary history of ion channels have shown that some neuronal ion channels predate neurons. Thus, to deepen our understanding of neuronal activities, ion channel models should be applied to other cells. Expanding the scope of electrophysiological experiments from nerve to muscle, animal to plant, and metazoa to protozoa has led to the discovery of a number of ion channels...
2018: Frontiers in Computational Neuroscience
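As background for entry #4, the Nernst equation is the textbook description of passive transmembrane transport driven by an ionic concentration gradient. The sketch below uses typical textbook concentrations, not the paper's parameters.

    import math

    R, F = 8.314, 96485.0        # gas constant (J/mol/K), Faraday constant (C/mol)
    T = 310.0                    # temperature in kelvin

    def nernst(z, c_out, c_in):
        """Equilibrium potential in volts for an ion of valence z."""
        return (R * T) / (z * F) * math.log(c_out / c_in)

    print(f"E_K  = {1000 * nernst(+1, 5.0, 140.0):6.1f} mV")    # potassium
    print(f"E_Na = {1000 * nernst(+1, 145.0, 12.0):6.1f} mV")   # sodium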
https://read.qxmd.com/read/30713494/predicting-illusory-contours-without-extracting-special-image-features
#5
Albert Yankelovich, Hedva Spitzer
Boundary completion is one of the desired properties of a robust object boundary detection model, since in real-world images the object boundaries are commonly not fully and clearly seen. An extreme example of boundary completion occurs in images with illusory contours, where the visual system completes boundaries in locations without an intensity gradient. Most illusory contour models extract special image features, such as L and T junctions, a task known to be difficult in real-world images...
2018: Frontiers in Computational Neuroscience
https://read.qxmd.com/read/30687055/modeling-the-encoding-of-saccade-kinematic-metrics-in-the-purkinje-cell-layer-of-the-cerebellar-vermis
#6
Hari Teja Kalidindi, Thomas George Thuruthel, Cecilia Laschi, Egidio Falotico
Recent electrophysiological observations related to saccadic eye movements in rhesus monkeys suggest that the sensory consequences of movement are predicted in the Purkinje cell layer of the cerebellar oculomotor vermis (OMV). A definite encoding of real-time motion of the eye has been observed in simple-spike responses of the combined burst-pause Purkinje cell populations, organized based upon their complex-spike directional tuning. However, the underlying control mechanisms that could lead to such action encoding are still unclear...
2018: Frontiers in Computational Neuroscience
https://read.qxmd.com/read/30687054/saccade-velocity-driven-oscillatory-network-model-of-grid-cells
#7
Ankur Chauhan, Karthik Soman, V Srinivasa Chakravarthy
Grid cells and place cells are believed to be the cellular substrates of the hippocampal spatial navigation functions observed as experimental animals physically navigate in 2D and 3D spaces. However, a recent saccade study on a head-fixed monkey has also reported grid-like representations along the saccadic trajectory while the animal scanned images on a computer screen. We present two computational models that explain the formation of grid patterns along saccadic trajectories formed over novel images. The first model, named Saccade Velocity Driven Oscillatory Network - Direct PCA (SVDON-DPCA), explains how grid patterns can be generated in saccadic space using a Principal Component Analysis (PCA)-like learning rule...
2018: Frontiers in Computational Neuroscience
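Entry #7 relies on a "PCA-like learning rule". As a generic illustration, and not the SVDON-DPCA model itself, Oja's Hebbian rule below extracts the leading principal component of a toy data set, which is the standard example of PCA emerging from a local learning rule.

    import numpy as np

    # Oja's rule: a Hebbian update with weight decay that converges to the
    # first principal component of the input data (up to sign).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(5000, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])  # anisotropic toy data
    X -= X.mean(axis=0)

    w = rng.normal(size=2)
    eta = 1e-3
    for x in X:
        y = w @ x
        w += eta * y * (x - y * w)

    pc1 = np.linalg.svd(X, full_matrices=False)[2][0]   # reference first PC
    print("Oja weight (normalized):", w / np.linalg.norm(w))
    print("First principal component:", pc1)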
https://read.qxmd.com/read/30687053/bio-inspired-analysis-of-deep-learning-on-not-so-big-data-using-data-prototypes
#8
Thalita F Drumond, Thierry Viéville, Frédéric Alexandre
Deep artificial neural networks are feed-forward architectures capable of very impressive performance in diverse domains. Indeed, stacking multiple layers allows a hierarchical composition of local functions, providing efficient compact mappings. Compared to the brain, however, such architectures are closer to a single pipeline and require huge amounts of data, while concrete cases for either human or machine learning systems are often restricted to not-so-big data sets. Furthermore, interpretability of the obtained results is a key issue: since deep learning applications are increasingly present in society, it is important that the underlying processes be accessible and understandable to everyone...
2018: Frontiers in Computational Neuroscience
https://read.qxmd.com/read/30670958/on-the-influence-of-structural-connectivity-on-the-correlation-patterns-and-network-synchronization
#9
Parisa Sadat Nazemi, Yousef Jamali
Since brain structural connectivity is the foundation of its functionality, studying the relation between structural and functional connectivity is essential to understanding brain abilities. Several approaches have been applied to measure the role of structural connectivity in the emergent correlation/synchronization patterns. In this study, we investigate the sensitivity of cross-correlation and synchronization to the coupling strength between neural regions for different network topologies. We model the neural populations with a neural mass model that expresses oscillatory dynamics...
2018: Frontiers in Computational Neuroscience
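Entry #9 measures how synchronization depends on coupling strength. As a deliberately simplified stand-in for the neural mass model, the Kuramoto sketch below shows the same qualitative effect: the order parameter r rises as the coupling K increases. All parameters are arbitrary.

    import numpy as np

    rng = np.random.default_rng(4)
    n, dt, steps = 100, 0.01, 3000
    omega = rng.normal(1.0, 0.2, n)                     # natural frequencies

    for K in (0.1, 0.5, 1.5):
        theta = rng.uniform(0, 2 * np.pi, n)
        for _ in range(steps):
            # mean-field coupling: mean_j sin(theta_j - theta_i) for each i
            coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
            theta += dt * (omega + K * coupling)
        r = np.abs(np.exp(1j * theta).mean())           # Kuramoto order parameter
        print(f"K={K:.1f}  order parameter r={r:.2f}")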
https://read.qxmd.com/read/30666194/hippocampal-neurogenesis-reduces-the-dimensionality-of-sparsely-coded-representations-to-enhance-memory-encoding
#10
Anthony J DeCostanzo, Chi Chung Alan Fung, Tomoki Fukai
Adult neurogenesis in the hippocampal dentate gyrus (DG) of mammals is known to contribute to memory encoding in many tasks. The DG also exhibits exceptionally sparse activity compared to other systems; however, whether sparseness and neurogenesis interact during memory encoding remains elusive. We implement a novel learning rule consistent with experimental findings of competition among adult-born neurons in a supervised multilayer feedforward network trained to discriminate between contexts. From this rule, the DG population partitions into neuronal ensembles, each of which is biased to represent one of the contexts...
2018: Frontiers in Computational Neuroscience
https://read.qxmd.com/read/30622467/highly-heterogeneous-excitatory-connections-require-less-amount-of-noise-to-sustain-firing-activities-in-cortical-networks
#11
Hisashi Kada, Jun-Nosuke Teramae, Isao T Tokuda
Cortical networks both in vivo and in vitro sustain asynchronous irregular firings with extremely low frequency. To realize such self-sustained activity in neural network models, balance between excitatory and inhibitory activities is known to be one of the keys. In addition, recent theoretical studies have revealed that another feature commonly observed in cortical networks, i.e., sparse but strong connections and dense weak connections, plays an essential role. The previous studies, however, have not thoroughly considered the cooperative dynamics between a network of such heterogeneous synaptic connections and intrinsic noise...
2018: Frontiers in Computational Neuroscience
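The "sparse but strong connections and dense weak connections" of entry #11 are commonly captured by a lognormal weight distribution (an assumption here; the paper's exact distribution may differ). The sketch below shows how a small tail of synapses carries a disproportionate share of the total weight.

    import numpy as np

    rng = np.random.default_rng(5)
    w = rng.lognormal(mean=-1.0, sigma=1.0, size=100_000)   # mostly weak synapses
    strong = w > np.percentile(w, 99)                       # top 1% of synapses
    print(f"top 1% of synapses carry {w[strong].sum() / w.sum():.1%} of total weight")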
https://read.qxmd.com/read/30622466/hough-transform-implementation-for-event-based-systems-concepts-and-challenges
#12
Sajjad Seifozzakerini, Wei-Yun Yau, Kezhi Mao, Hossein Nejati
The Hough transform (HT) is one of the most well-known techniques in computer vision and has been the basis of many practical image processing algorithms. HT, however, is designed to work with frame-based systems such as conventional digital cameras. Recently, event-based systems such as Dynamic Vision Sensor (DVS) cameras have become popular among researchers. Event-based cameras have a significantly higher temporal resolution (1 μs), but each pixel can only detect change and not color. As such, conventional image processing algorithms cannot be readily applied to event-based output streams...
2018: Frontiers in Computational Neuroscience
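The core idea behind entry #12 can be sketched as a Hough accumulator updated one event at a time: each incoming (x, y) event votes for every line passing through its pixel. The exponential decay used below to forget old events is an assumption for illustration, not a detail taken from the paper.

    import numpy as np

    W, H = 64, 64
    thetas = np.deg2rad(np.arange(0, 180))              # candidate line orientations
    rho_max = int(np.hypot(W, H))
    acc = np.zeros((2 * rho_max, thetas.size))

    def process_event(x, y, decay=0.999):
        """Vote for all lines passing through the pixel of one event."""
        acc[:] *= decay                                 # forget old events (assumed)
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + rho_max, np.arange(thetas.size)] += 1.0

    for x in range(W):                                  # synthetic events on the line y = x
        process_event(x, x)

    rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
    print(f"best line: rho={rho_idx - rho_max}, "
          f"theta={np.rad2deg(thetas[theta_idx]):.0f} deg")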
https://read.qxmd.com/read/30618694/biological-mechanisms-for-learning-a-computational-model-of-olfactory-learning-in-the-manduca-sexta-moth-with-applications-to-neural-nets
#13
Charles B Delahunt, Jeffrey A Riffell, J Nathan Kutz
The insect olfactory system, which includes the antennal lobe (AL), mushroom body (MB), and ancillary structures, is a relatively simple neural system capable of learning. Its structural features, which are widespread in biological neural systems, process olfactory stimuli through a cascade of networks where large dimension shifts occur from stage to stage and where sparsity and randomness play a critical role in coding. Learning is partly enabled by a neuromodulatory reward mechanism of octopamine stimulation of the AL, whose increased activity induces synaptic weight updates in the MB through Hebbian plasticity...
2018: Frontiers in Computational Neuroscience
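The reward-gated plasticity described in entry #13 boils down to a Hebbian update modulated by an octopamine-like reward signal. The toy sketch below uses arbitrary layer sizes, sparsity, and learning rate; it only illustrates the update rule, not the moth model itself.

    import numpy as np

    rng = np.random.default_rng(2)
    n_in, n_out = 50, 200
    W = 0.01 * rng.random((n_out, n_in))

    def reward_hebbian_step(W, x, reward, eta=0.05, sparsity=0.05):
        """One reward-modulated Hebbian update over a sparse winner set."""
        h = W @ x
        k = max(1, int(sparsity * h.size))
        y = np.zeros_like(h)
        y[np.argsort(h)[-k:]] = 1.0                  # only the top-k units fire
        W += eta * reward * np.outer(y, x)           # Hebb, gated by reward
        return W, y

    odor = (rng.random(n_in) < 0.2).astype(float)    # a sparse odor pattern
    for trial in range(20):
        W, y = reward_hebbian_step(W, odor, reward=1.0)
    print("strongest response to the odor after training:", (W @ odor).max().round(3))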
https://read.qxmd.com/read/30618693/transfer-of-spatial-contact-information-among-limbs-and-the-notion-of-peripersonal-space-in-insects
#14
Volker Dürr, Malte Schilling
Internal representation of far-range space in insects is well established, as it is necessary for navigation behavior. Although it is likely that insects also have an internal representation of near-range space, the behavioral evidence for the latter is much weaker. Here, we estimate the size and shape of the spatial equivalent of a near-range representation that is constituted by somatosensory sampling events. To do so, we use a large set of experimental whole-body motion capture data on unrestrained walking, climbing and searching behavior in stick insects of the species Carausius morosus to delineate 'action volumes' and 'contact volumes' for both antennae and all six legs...
2018: Frontiers in Computational Neuroscience
https://read.qxmd.com/read/30618692/suppression-of-parkinsonian-beta-oscillations-by-deep-brain-stimulation-determination-of-effective-protocols
#15
Eli J Müller, Peter A Robinson
A neural field model of the corticothalamic-basal ganglia system is developed that describes enhanced beta activity within subthalamic and pallidal circuits in Parkinson's disease (PD) via system resonances. A model of deep brain stimulation (DBS) of typical clinical targets, the subthalamic nucleus (STN) and globus pallidus internus (GPi), is added and studied for several distinct stimulation protocols that are used for treatment of the motor symptoms of PD and that reduce pathological beta band activity (13-30 Hz) in the corticothalamic-basal ganglia network...
2018: Frontiers in Computational Neuroscience
https://read.qxmd.com/read/30618691/entropy-uncertainty-and-the-depth-of-implicit-knowledge-on-musical-creativity-computational-study-of-improvisation-in-melody-and-rhythm
#16
Tatsuya Daikoku
Recent neurophysiological and computational studies have proposed the hypothesis that our brain automatically codes the nth-order transitional probabilities (TPs) embedded in sequential phenomena such as music and language (i.e., local statistics at the nth-order level), grasps the entropy of the TP distribution (i.e., global statistics), and predicts the future state based on the internalized nth-order statistical model. This mechanism is called statistical learning (SL). SL is also believed to contribute to the creativity involved in musical improvisation...
2018: Frontiers in Computational Neuroscience
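The quantities named in entry #16, transitional probabilities and the entropy of their distribution, can be computed directly from a symbol sequence. The sketch below uses first-order TPs and an invented toy melody for brevity.

    from collections import Counter, defaultdict
    import math

    melody = list("CDECDEGGAGFEDC")                 # arbitrary toy sequence

    transitions = defaultdict(Counter)
    for a, b in zip(melody, melody[1:]):
        transitions[a][b] += 1

    for context, counts in sorted(transitions.items()):
        total = sum(counts.values())
        probs = {sym: c / total for sym, c in counts.items()}
        entropy = -sum(p * math.log2(p) for p in probs.values())
        print(f"P(next | {context}) = {probs}   H = {entropy:.2f} bits")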
https://read.qxmd.com/read/30618690/brain-network-analysis-and-classification-based-on-convolutional-neural-network
#17
Lu Meng, Jing Xiang
Background: Convolutional neural networks (CNNs) are increasingly used in computer science and find more and more applications in different fields. However, analyzing brain networks with a CNN is not trivial, due to the non-Euclidean characteristics of brain networks built with graph theory. Method: To address this problem, we used the well-known "word2vec" algorithm from the field of natural language processing (NLP) to represent the vertices of the graph in a node embedding space and transform the brain network into images, which can bridge the gap between brain networks and CNNs...
2018: Frontiers in Computational Neuroscience
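One common way to obtain word2vec-style node embeddings for a graph, assumed here and not necessarily the exact pipeline of entry #17, is to generate random walks over the network and treat each walk as a "sentence" of node tokens for a word2vec trainer (e.g., gensim).

    import random

    random.seed(0)
    # Toy brain network as an adjacency list (arbitrary, for illustration only).
    graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}

    def random_walk(start, length=8):
        walk = [start]
        for _ in range(length - 1):
            walk.append(random.choice(graph[walk[-1]]))
        return walk

    # Each node contributes several walks; the resulting token sequences would
    # then be passed to a word2vec implementation to get node embeddings.
    sentences = [list(map(str, random_walk(node))) for node in graph for _ in range(3)]
    print(sentences[:2])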
https://read.qxmd.com/read/30574083/a-novel-approach-for-modeling-neural-responses-to-joint-perturbations-using-the-narmax-method-and-a-hierarchical-neural-network
#18
Runfeng Tian, Yuan Yang, Frans C T van der Helm, Julius P A Dewald
The human nervous system is an ensemble of connected neuronal networks. Modeling and system identification of the human nervous system helps us understand how the brain processes sensory input and controls responses at the systems level. This study aims to propose an advanced approach based on a hierarchical neural network and non-linear system identification method to model neural activity in the nervous system in response to an external somatosensory input. The proposed approach incorporates basic concepts of Non-linear AutoRegressive Moving Average Model with eXogenous input (NARMAX) and neural network to acknowledge non-linear closed-loop neural interactions...
2018: Frontiers in Computational Neuroscience
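The regressor idea behind NARMAX-style identification in entry #18 can be sketched with a polynomial NARX model fitted by least squares. The toy system below is invented, and the paper's NARMAX-plus-neural-network pipeline is considerably more elaborate.

    import numpy as np

    rng = np.random.default_rng(3)
    N = 500
    u = rng.normal(size=N)
    y = np.zeros(N)
    for t in range(1, N):
        # Toy non-linear system: output depends on its past value and the past input.
        y[t] = (0.6 * y[t - 1] + 0.4 * u[t - 1]
                + 0.2 * y[t - 1] * u[t - 1] + 0.01 * rng.normal())

    # Candidate regressors: linear and quadratic terms of lagged signals.
    Y1, U1 = y[:-1], u[:-1]
    Phi = np.column_stack([Y1, U1, Y1 ** 2, U1 ** 2, Y1 * U1])
    coeffs, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
    print("estimated coefficients:", coeffs.round(3))   # roughly [0.6, 0.4, 0, 0, 0.2]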
https://read.qxmd.com/read/30555315/a-compartmental-model-to-investigate-local-and-global-ca-2-dynamics-in-astrocytes
#19
Evan Cresswell-Clay, Nathan Crock, Joël Tabak, Gordon Erlebacher
Intracellular Ca2+ dynamics in astrocytes can be triggered by neuronal activity and in turn regulate a variety of downstream processes that modulate neuronal function. In this fashion, astrocytic Ca2+ signaling is regarded as a processor of neural network activity by means of complex spatial and temporal Ca2+ dynamics. Accordingly, a key step is to understand how different patterns of neural activity translate into spatiotemporal dynamics of intracellular Ca2+ in astrocytes. Here, we introduce a minimal compartmental model for astrocytes that can qualitatively reproduce essential hierarchical features of spatiotemporal Ca2+ dynamics in astrocytes...
2018: Frontiers in Computational Neuroscience
https://read.qxmd.com/read/30555314/why-do-durations-in-musical-rhythms-conform-to-small-integer-ratios
#20
Andrea Ravignani, Bill Thompson, Massimo Lumaca, Manon Grube
One curious aspect of human timing is the organization of rhythmic patterns in small integer ratios. Behavioral and neural research has shown that adjacent time intervals in rhythms tend to be perceived and reproduced as approximate fractions of small numbers (e.g., 3/2). Recent work on iterated learning and reproduction further supports this: given a randomly timed drum pattern to reproduce, participants subconsciously transform it toward small integer ratios. The mechanisms accounting for this "attractor" phenomenon are little understood, but might be explained by combining two theoretical frameworks from psychophysics...
2018: Frontiers in Computational Neuroscience
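The small-integer-ratio attractor of entry #20 can be probed by taking ratios of adjacent inter-onset intervals and snapping each to the closest fraction with a small denominator. The drum pattern below is made up for illustration.

    from fractions import Fraction

    intervals = [0.48, 0.33, 0.65, 0.34]          # inter-onset intervals in seconds
    for a, b in zip(intervals, intervals[1:]):
        ratio = a / b
        approx = Fraction(ratio).limit_denominator(4)   # allow denominators up to 4
        print(f"{a:.2f}/{b:.2f} = {ratio:.2f}  ~  {approx}")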