https://read.qxmd.com/read/37432867/graph-regularized-tensor-regression-a-domain-aware-framework-for-interpretable-modeling-of-multiway-data-on-graphs
#61
JOURNAL ARTICLE
Yao Lei Xu, Kriton Konstantinidis, Danilo P Mandic
Modern data analytics applications are increasingly characterized by exceedingly large and multidimensional data sources. This represents a challenge for traditional machine learning models, as the number of model parameters needed to process such data grows exponentially with the data dimensions, an effect known as the curse of dimensionality. Recently, tensor decomposition (TD) techniques have shown promising results in reducing the computational costs associated with large-dimensional models while achieving comparable performance...
June 20, 2023: Neural Computation
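To make the parameter savings concrete: a rank-R CP decomposition stores one small factor matrix per tensor mode instead of a dense multiway weight array, so the parameter count grows linearly rather than exponentially with the number of modes. A minimal numpy sketch (illustrative only; not the authors' graph-regularized framework, and the dimensions and rank below are made up):

```python
import numpy as np

# Rank-R CP decomposition: one (d_k x R) factor matrix per mode.
dims, R = (20, 30, 40), 5
rng = np.random.default_rng(0)
factors = [rng.standard_normal((d, R)) for d in dims]

# Reconstruct the full tensor as a sum of R rank-1 terms.
full = np.einsum('ir,jr,kr->ijk', *factors)

dense_params = int(np.prod(dims))        # 24000 entries for a dense tensor
cp_params = sum(d * R for d in dims)     # 450 entries for the CP factors
print(dense_params, cp_params)
```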
https://read.qxmd.com/read/37432864/maximal-memory-capacity-near-the-edge-of-chaos-in-balanced-cortical-e-i-networks
#62
JOURNAL ARTICLE
Takashi Kanamaru, Takao K Hensch, Kazuyuki Aihara
We examine the efficiency of information processing in a balanced excitatory and inhibitory (E-I) network during the developmental critical period, when network plasticity is heightened. A multimodule network composed of E-I neurons was defined, and its dynamics were examined by regulating the balance between their activities. When adjusting E-I activity, both transitive chaotic synchronization with a high Lyapunov dimension and conventional chaos with a low Lyapunov dimension were found. In between, the edge of high-dimensional chaos was observed...
June 20, 2023: Neural Computation
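The Lyapunov dimension mentioned here is usually the Kaplan-Yorke dimension, computed from the ordered spectrum of Lyapunov exponents: take the largest k whose partial sum of exponents is still nonnegative, then interpolate into the next exponent. A sketch with made-up spectra (not values from the paper):

```python
import numpy as np

def kaplan_yorke_dimension(exponents):
    """Kaplan-Yorke (Lyapunov) dimension from a spectrum sorted descending."""
    lam = np.sort(np.asarray(exponents))[::-1]
    csum = np.cumsum(lam)
    nonneg = np.nonzero(csum >= 0)[0]
    if len(nonneg) == 0:
        return 0.0          # fully contracting: dimension 0
    k = nonneg[-1] + 1      # largest k with a nonnegative partial sum
    if k == len(lam):
        return float(k)     # no contracting direction left to interpolate
    return k + csum[k - 1] / abs(lam[k])

# Illustrative spectra: higher- vs. lower-dimensional chaos.
print(kaplan_yorke_dimension([0.4, 0.3, 0.1, -0.05, -0.2, -0.9]))  # ~5.61
print(kaplan_yorke_dimension([0.1, -0.3, -0.8]))                   # ~1.33
```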
https://read.qxmd.com/read/37432863/attention-in-a-family-of-boltzmann-machines-emerging-from-modern-hopfield-networks
#63
JOURNAL ARTICLE
Toshihiro Ota, Ryo Karakida
Hopfield networks and Boltzmann machines (BMs) are fundamental energy-based neural network models. Recent studies on modern Hopfield networks have broadened the class of energy functions and led to a unified perspective on general Hopfield networks, including an attention module. In this letter, we consider the BM counterparts of modern Hopfield networks using the associated energy functions and study their salient properties from a trainability perspective. In particular, the energy function corresponding to the attention module naturally introduces a novel BM, which we refer to as the attentional BM (AttnBM)...
June 20, 2023: Neural Computation
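For reference, the modern-Hopfield retrieval update that the attention module corresponds to is q ← Xᵀ softmax(β X q), where the rows of X are stored patterns; iterating it recovers a stored pattern from a noisy cue. A minimal sketch of this update (not the AttnBM training procedure itself):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hopfield_retrieve(X, q, beta=2.0, steps=5):
    """Modern Hopfield update: q <- X^T softmax(beta * X q).
    X has one stored pattern per row; this fixed-point iteration is the
    same map as (single-query) softmax attention over the patterns."""
    for _ in range(steps):
        q = X.T @ softmax(beta * X @ q)
    return q

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 8))          # 4 stored patterns
q = X[2] + 0.3 * rng.standard_normal(8)  # noisy cue near pattern 2
print(np.argmax(X @ hopfield_retrieve(X, q)))  # -> 2, pattern recovered
```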
https://read.qxmd.com/read/37432862/optimal-burstiness-in-populations-of-spiking-neurons-facilitates-decoding-of-decreases-in-tonic-firing
#64
JOURNAL ARTICLE
Sylvia C L Durian, Mark Agrios, Gregory W Schwartz
A stimulus can be encoded in a population of spiking neurons through any change in the statistics of the joint spike pattern, yet we commonly summarize single-trial population activity by the summed spike rate across cells: the population peristimulus time histogram (pPSTH). For neurons with a low baseline spike rate that encode a stimulus with a rate increase, this simplified representation works well, but for populations with high baseline rates and heterogeneous response patterns, the pPSTH can obscure the response...
June 20, 2023: Neural Computation
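The pPSTH the authors refer to is simply the histogram of spike times pooled across all cells in the population; the sketch below shows why it discards cell identity. Toy spike times, not real data:

```python
import numpy as np

def ppsth(spike_times_per_cell, t_max, bin_ms=10.0):
    """Population PSTH: pool spike times across all cells into one
    histogram of summed spike counts per time bin (the pPSTH)."""
    pooled = np.concatenate(spike_times_per_cell)
    edges = np.arange(0.0, t_max + bin_ms, bin_ms)
    counts, _ = np.histogram(pooled, bins=edges)
    return counts, edges

# Toy data (ms): three cells; note how summing hides which cell fired.
cells = [np.array([12.0, 55.0, 61.0]),
         np.array([8.0, 57.0]),
         np.array([33.0, 59.0, 95.0])]
counts, edges = ppsth(cells, t_max=100.0)
print(counts)  # per-bin summed spike count across the population
```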
https://read.qxmd.com/read/37187163/deep-clustering-with-a-constraint-for-topological-invariance-based-on-symmetric-infonce
#65
JOURNAL ARTICLE
Yuhui Zhang, Yuichiro Wada, Hiroki Waida, Kaito Goto, Yusaku Hino, Takafumi Kanamori
We consider the scenario of deep clustering, in which the available prior knowledge is limited. In this scenario, few existing state-of-the-art deep clustering methods can perform well on both noncomplex topology and complex topology data sets. To address the problem, we propose a constraint utilizing symmetric InfoNCE, which aids the objective of the deep clustering method by training the model to be effective not only on noncomplex topology but also on complex topology data sets. Additionally, we provide several theoretical explanations of why the constraint can enhance the performance of deep clustering methods...
June 12, 2023: Neural Computation
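For orientation, a symmetrized InfoNCE objective over N paired embeddings scores the N×N similarity matrix with a cross-entropy whose targets sit on the diagonal, averaged over both matching directions. A minimal numpy sketch of that quantity (the authors' constraint builds on symmetric InfoNCE; the exact usage in their method differs):

```python
import numpy as np

def symmetric_infonce(za, zb, tau=0.1):
    """Symmetrized InfoNCE over N paired embeddings: cross-entropy of
    matching i<->i under a similarity softmax, averaged over both
    directions (a->b and b->a)."""
    za = za / np.linalg.norm(za, axis=1, keepdims=True)
    zb = zb / np.linalg.norm(zb, axis=1, keepdims=True)
    logits = za @ zb.T / tau                      # N x N similarities
    def ce(lg):  # mean cross-entropy with targets on the diagonal
        lse = np.log(np.exp(lg - lg.max(1, keepdims=True)).sum(1)) + lg.max(1)
        return np.mean(lse - np.diag(lg))
    return 0.5 * (ce(logits) + ce(logits.T))

rng = np.random.default_rng(0)
za = rng.standard_normal((8, 16))
zb = za + 0.1 * rng.standard_normal((8, 16))      # positives: same index
print(symmetric_infonce(za, zb))
```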
https://read.qxmd.com/read/37187169/dynamic-modeling-of-spike-count-data-with-conway-maxwell-poisson-variability
#66
JOURNAL ARTICLE
Ganchao Wei, Ian H Stevenson
In many areas of the brain, neural spiking activity covaries with features of the external world, such as sensory stimuli or an animal's movement. Experimental findings suggest that the variability of neural activity changes over time and may provide information about the external world beyond the information provided by the average neural activity. To flexibly track time-varying neural response properties, we developed a dynamic model with Conway-Maxwell Poisson (CMP) observations. The CMP distribution can flexibly describe firing patterns that are both under- and overdispersed relative to the Poisson distribution...
May 15, 2023: Neural Computation
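The CMP distribution adds one shape parameter ν to the Poisson: P(k) ∝ λᵏ/(k!)^ν, with ν = 1 recovering Poisson, ν > 1 underdispersion, and ν < 1 overdispersion. A sketch that computes the (truncated) pmf and the resulting Fano factor:

```python
import numpy as np

def cmp_pmf(k_max, lam, nu):
    """Conway-Maxwell-Poisson pmf on 0..k_max (truncated normalization):
    P(k) proportional to lam^k / (k!)^nu."""
    k = np.arange(k_max + 1)
    log_w = k * np.log(lam) - nu * np.cumsum(np.log(np.maximum(k, 1)))
    w = np.exp(log_w - log_w.max())
    return w / w.sum()

for nu in (0.5, 1.0, 2.0):
    p = cmp_pmf(60, lam=5.0, nu=nu)
    k = np.arange(61)
    mean = (k * p).sum()
    var = ((k - mean) ** 2 * p).sum()
    print(f"nu={nu}: Fano factor var/mean = {var / mean:.2f}")
```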
https://read.qxmd.com/read/37187168/optimization-and-learning-with-randomly-compressed-gradient-updates
#67
JOURNAL ARTICLE
Zhanliang Huang, Yunwen Lei, Ata Kabán
Gradient descent methods are simple and efficient optimization algorithms with widespread applications. To handle high-dimensional problems, we study compressed stochastic gradient descent (CompSGD), which uses low-dimensional gradient updates. We provide a detailed analysis in terms of both optimization rates and generalization rates. To this end, we develop uniform stability bounds for CompSGD for both smooth and nonsmooth problems, from which we derive almost optimal population risk bounds. We then extend our analysis to two variants of SGD: batch and mini-batch gradient descent...
May 15, 2023: Neural Computation
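The core mechanic of compressed SGD is easy to state: project each stochastic gradient through a low-dimensional random matrix and lift it back before the update, giving an unbiased but noisier step. A toy least-squares sketch of this idea (our construction for illustration; the paper's CompSGD analysis concerns rates and stability, not this specific recipe):

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, n = 100, 20, 500                  # ambient dim, compressed dim, samples
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
y = A @ x_true                          # realizable least-squares problem

x, lr = np.zeros(d), 1e-3
for _ in range(10000):
    i = rng.integers(n)                             # stochastic sample
    g = (A[i] @ x - y[i]) * A[i]                    # per-sample gradient
    R = rng.standard_normal((m, d)) / np.sqrt(m)    # fresh random projection
    x -= lr * (R.T @ (R @ g))                       # compress, then lift back
print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # well below 1
```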
https://read.qxmd.com/read/37187167/conductance-based-phenomenological-nonspiking-model-a-dimensionless-and-simple-model-that-reliably-predicts-the-effects-of-conductance-variations-on-nonspiking-neuronal-dynamics
#68
JOURNAL ARTICLE
Loïs Naudin, Laetitia Raison-Aubry, Laure Buhry
The modeling of single neurons has proven to be an indispensable tool in deciphering the mechanisms underlying neural dynamics and signal processing. In that sense, two types of single-neuron models are extensively used: the conductance-based models (CBMs) and the so-called phenomenological models, which are often opposed in their objectives and their use. Indeed, the first type aims to describe the biophysical properties of the neuron cell membrane that underlie the evolution of its potential, while the second one describes the macroscopic behavior of the neuron without taking into account all of its underlying physiological processes...
May 15, 2023: Neural Computation
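As a point of reference for the CBM side of the comparison, a minimal conductance-based nonspiking membrane has a leak plus one voltage-gated current, C dV/dt = -g_L(V - E_L) - g_K n (V - E_K) + I_ext, with a gating variable n relaxing to a sigmoidal steady state. All parameter values below are illustrative, not the authors':

```python
import numpy as np

C, g_L, E_L, g_K, E_K = 1.0, 0.3, -60.0, 2.0, -80.0
n_inf = lambda V: 1.0 / (1.0 + np.exp(-(V + 45.0) / 5.0))
tau_n = 10.0  # ms

def simulate(I_ext, T=500.0, dt=0.01):
    """Euler-integrate the membrane equation to its steady state."""
    V, n = E_L, n_inf(E_L)
    for _ in range(int(T / dt)):
        dV = (-g_L * (V - E_L) - g_K * n * (V - E_K) + I_ext) / C
        dn = (n_inf(V) - n) / tau_n
        V, n = V + dt * dV, n + dt * dn
    return V  # graded (nonspiking) response, no Na spike current

for I in (0.0, 1.0, 3.0):
    print(f"I = {I}: V_ss = {simulate(I):.1f} mV")
```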
https://read.qxmd.com/read/37187166/posterior-covariance-information-criterion-for-weighted-inference
#69
JOURNAL ARTICLE
Yukito Iba, Keisuke Yano
For predictive evaluation based on quasi-posterior distributions, we develop a new information criterion, the posterior covariance information criterion (PCIC). PCIC generalizes the widely applicable information criterion (WAIC) so as to effectively handle predictive scenarios where likelihoods for the estimation and the evaluation of the model may be different. A typical example of such scenarios is the weighted likelihood inference, including prediction under covariate shift and counterfactual prediction...
May 15, 2023: Neural Computation
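Since PCIC generalizes WAIC, the WAIC baseline is worth having in front of you: it is the log pointwise predictive density minus a variance-based penalty, both computed from posterior draws (PCIC replaces this penalty with a posterior covariance term so that estimation and evaluation likelihoods may differ). A sketch of WAIC from a matrix of pointwise log-likelihoods:

```python
import numpy as np

def waic(loglik):
    """WAIC from an (S samples x N observations) matrix of pointwise
    log-likelihoods evaluated at posterior draws:
    WAIC = -2 * (lppd - p_waic), with a variance-based penalty."""
    lppd = np.sum(np.log(np.mean(np.exp(loglik), axis=0)))
    p_waic = np.sum(np.var(loglik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)

# Toy check: normal model with sigma = 1, posterior draws of mu.
rng = np.random.default_rng(0)
y = rng.normal(0.3, 1.0, size=50)
mu_draws = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=1000)
loglik = -0.5 * np.log(2 * np.pi) - 0.5 * (y[None, :] - mu_draws[:, None]) ** 2
print(waic(loglik))
```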
https://read.qxmd.com/read/37187162/efficient-decoding-of-compositional-structure-in-holistic-representations
#70
JOURNAL ARTICLE
Denis Kleyko, Connor Bybee, Ping-Chen Huang, Christopher J Kymn, Bruno A Olshausen, E Paxon Frady, Friedrich T Sommer
We investigate the task of retrieving information from compositional distributed representations formed by hyperdimensional computing/vector symbolic architectures and present novel techniques that achieve new information rate bounds. First, we provide an overview of the decoding techniques that can be used to approach the retrieval task. The techniques are categorized into four groups. We then evaluate the considered techniques in several settings that involve, for example, inclusion of external noise and storage elements with reduced precision...
May 15, 2023: Neural Computation
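The simplest decoding technique in this family retrieves the components of a superposition (bundle) of random bipolar codevectors by nearest-neighbor matching against the codebook. A sketch of that baseline (the letter's contribution is the broader taxonomy and new information-rate bounds, not this recipe):

```python
import numpy as np

rng = np.random.default_rng(0)
D, M, k = 1000, 64, 5                         # dimension, codebook size, items
codebook = rng.choice([-1, 1], size=(M, D))   # random bipolar codevectors
items = rng.choice(M, size=k, replace=False)
s = codebook[items].sum(axis=0)               # holistic vector: superposition

# Decode by matching: the k largest codebook similarities recover the items.
sims = codebook @ s
decoded = np.argsort(sims)[-k:]
print(sorted(items.tolist()), sorted(decoded.tolist()))
```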
https://read.qxmd.com/read/37037041/generalization-analysis-of-pairwise-learning-for-ranking-with-deep-neural-networks
#71
JOURNAL ARTICLE
Shuo Huang, Junyu Zhou, Han Feng, Ding-Xuan Zhou
Pairwise learning is widely employed in ranking, similarity and metric learning, area under the ROC curve (AUC) maximization, and many other learning tasks involving sample pairs. Pairwise learning with deep neural networks has been considered for ranking, but a sufficient theoretical understanding of this topic is lacking. In this letter, we apply symmetric deep neural networks to pairwise learning for ranking with a hinge loss ϕh and carry out a generalization analysis for this algorithm. A key step in our analysis is to characterize a function that minimizes the risk...
May 12, 2023: Neural Computation
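The pairwise hinge loss in question penalizes every misordered pair: for each pair with labels[i] > labels[j], it adds ϕh(t) = max(0, 1 - t), where t is the predicted score margin. A sketch using plain scores in place of the paper's symmetric DNN:

```python
import numpy as np

def pairwise_hinge(scores, labels):
    """Mean hinge loss max(0, 1 - margin) over all pairs (i, j)
    with labels[i] > labels[j]."""
    total, count = 0.0, 0
    for i in range(len(scores)):
        for j in range(len(scores)):
            if labels[i] > labels[j]:
                total += max(0.0, 1.0 - (scores[i] - scores[j]))
                count += 1
    return total / max(count, 1)

scores = np.array([2.1, 0.4, 1.2, -0.3])   # model outputs g(x)
labels = np.array([2, 0, 1, 0])            # relevance grades
print(pairwise_hinge(scores, labels))
```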
https://read.qxmd.com/read/36944243/automatic-hyperparameter-tuning-in-sparse-matrix-factorization
#72
JOURNAL ARTICLE
Ryota Kawasumi, Koujin Takeda
We study the problem of hyperparameter tuning in sparse matrix factorization under a Bayesian framework. In prior work, an analytical solution of sparse matrix factorization with Laplace prior was obtained by a variational Bayes method under several approximations. Based on this solution, we propose a novel numerical method of hyperparameter tuning by evaluating the zero point of the normalization factor in a sparse matrix prior. We also verify that our method shows excellent performance for ground-truth sparse matrix reconstruction by comparing it with the widely used algorithm of sparse principal component analysis...
May 12, 2023: Neural Computation
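The Laplace prior referenced here induces an ℓ1 penalty on the factor entries, whose proximal map is soft thresholding; alternating proximal-gradient updates then yield sparse factors. A sketch of one such update loop (illustrative; the paper's contribution is tuning the hyperparameter via the zero point of the prior's normalization factor, which is not shown here):

```python
import numpy as np

def soft_threshold(X, lam):
    """Proximal map of the l1 penalty induced by a Laplace prior."""
    return np.sign(X) * np.maximum(np.abs(X) - lam, 0.0)

# Proximal-gradient (ISTA) updates for a sparse factor B in Y ~ A @ B.
rng = np.random.default_rng(0)
Y = rng.standard_normal((30, 40))
A = rng.standard_normal((30, 5))
B = np.zeros((5, 40))
step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant
for _ in range(100):
    B = soft_threshold(B - step * A.T @ (A @ B - Y), lam=step * 0.5)
print(f"fraction of zeros in B: {(B == 0).mean():.2f}")
```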
https://read.qxmd.com/read/36944238/modern-artificial-neural-networks-is-evolution-cleverer
#73
REVIEW
Andreas Bahmer, Daya Gupta, Felix Effenberger
Machine learning tools, particularly artificial neural networks (ANN), have become ubiquitous in many scientific disciplines. Machine learning-based techniques flourish not only because of expanding computational power and the increasing availability of labeled data sets but also because of increasingly powerful training algorithms and refined topologies of ANN. Some refined topologies were initially motivated by neuronal network architectures found in the brain, such as convolutional ANN. Later topologies of neuronal networks departed from the biological substrate and began to be developed independently, as biological processing units are not well understood or are not transferable to in silico architectures...
April 18, 2023: Neural Computation
https://read.qxmd.com/read/37037043/scalable-variational-inference-for-low-rank-spatiotemporal-receptive-fields
#74
JOURNAL ARTICLE
Lea Duncker, Kiersten M Ruda, Greg D Field, Jonathan W Pillow
An important problem in systems neuroscience is to characterize how a neuron integrates sensory inputs across space and time. The linear receptive field provides a mathematical characterization of this weighting function and is commonly used to quantify neural response properties and classify cell types. However, estimating receptive fields is difficult in settings with limited data and correlated or high-dimensional stimuli. To overcome these difficulties, we propose a hierarchical model designed to flexibly parameterize low-rank receptive fields...
April 6, 2023: Neural Computation
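"Low-rank" here means the space-time receptive field factorizes into a few spatial maps and temporal filters, k = Σ_r u_r v_rᵀ, cutting parameters from n_x·n_t to rank·(n_x + n_t). A sketch of the parameterization and a toy LNP-style response (not the authors' hierarchical variational inference):

```python
import numpy as np

rng = np.random.default_rng(0)
n_space, n_time, rank = 100, 25, 2

# Low-rank receptive field: sum of outer products K = sum_r u_r v_r^T.
U = rng.standard_normal((n_space, rank))    # spatial components
V = rng.standard_normal((n_time, rank))     # temporal components
K = U @ V.T                                 # full n_space x n_time filter

full_params = n_space * n_time              # 2500
lowrank_params = rank * (n_space + n_time)  # 250
print(full_params, lowrank_params)

# Response: filter a stimulus snippet with K, pass through a nonlinearity.
stim = rng.standard_normal((n_space, n_time))
rate = np.exp(0.1 * np.sum(K * stim))       # toy LNP-style drive
print(rate)
```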
https://read.qxmd.com/read/37037042/sensitivity-to-control-signals-in-triphasic-rhythmic-neural-systems-a-comparative-mechanistic-analysis-via-infinitesimal-local-timing-response-curves
#75
JOURNAL ARTICLE
Zhuojun Yu, Jonathan E Rubin, Peter J Thomas
Similar activity patterns may arise from model neural networks with distinct coupling properties and individual unit dynamics. These similar patterns may, however, respond differently to parameter variations and specifically to tuning of inputs that represent control signals. In this work, we analyze the responses resulting from modulation of a localized input in each of three classes of model neural networks that have been recognized in the literature for their capacity to produce robust three-phase rhythms: coupled fast-slow oscillators, near-heteroclinic oscillators, and threshold-linear networks...
April 6, 2023: Neural Computation
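A finite-perturbation cousin of the infinitesimal timing response curve can be measured directly: run an oscillator to a reference crossing, kick one state variable by a small ε at a chosen phase, and record how much a later crossing shifts. A sketch on the van der Pol oscillator (our stand-in system; the paper derives adjoint-based infinitesimal curves for its three network classes):

```python
import numpy as np

mu, dt, eps = 1.0, 1e-3, 1e-3

def step(x, y):
    """One Euler step of the van der Pol oscillator."""
    return x + dt * y, y + dt * (mu * (1 - x * x) * y - x)

def crossing_times(n_cross, kick_at=None):
    """Times of upward zero crossings of x; optionally kick y at kick_at."""
    x, y, t, times, kicked = 2.0, 0.0, 0.0, [], False
    while len(times) < n_cross:
        if kick_at is not None and not kicked and t >= kick_at:
            y += eps
            kicked = True
        x_new, y_new = step(x, y)
        if x < 0 <= x_new:
            times.append(t)
        x, y, t = x_new, y_new, t + dt
    return np.array(times)

base = crossing_times(6)
T = np.mean(np.diff(base))               # unperturbed period
for frac in (0.1, 0.3, 0.5, 0.7, 0.9):   # perturbation phase within a cycle
    pert = crossing_times(6, kick_at=base[2] + frac * T)
    shift = base[-1] - pert[-1]          # positive = phase advance
    print(f"phase {frac}: timing shift/eps = {shift / eps:+.2f}")
```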
https://read.qxmd.com/read/37037040/deep-learning-solution-of-the-eigenvalue-problem-for-differential-operators
#76
JOURNAL ARTICLE
Ido Ben-Shaul, Leah Bar, Dalia Fishelov, Nir Sochen
Solving the eigenvalue problem for differential operators is a common problem in many scientific fields. Classical numerical methods rely on intricate domain discretization and yield nonanalytic or nonsmooth approximations. We introduce a novel neural network-based solver for the eigenvalue problem of differential self-adjoint operators, where the eigenpairs are learned in an unsupervised end-to-end fashion. We propose several training procedures for solving increasingly challenging tasks toward the general eigenvalue problem...
April 6, 2023: Neural Computation
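The unsupervised objective underlying such solvers is essentially the Rayleigh quotient: minimizing uᵀLu/uᵀu over a trial function yields the lowest eigenpair without labeled data. A stand-in sketch with a plain parameter vector on a discretized 1D Laplacian in place of a neural network (the recovered λ should approach π²):

```python
import numpy as np

# Finite-difference 1D Laplacian -d^2/dx^2 with Dirichlet BCs on (0, 1).
N = 100
h = 1.0 / (N + 1)
main = 2.0 / h**2 * np.ones(N)
off = -1.0 / h**2 * np.ones(N - 1)
L = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

u = np.random.default_rng(0).standard_normal(N)
for _ in range(5000):
    u /= np.linalg.norm(u)
    lam = u @ L @ u                       # Rayleigh quotient at unit norm
    grad = 2.0 * (L @ u - lam * u)        # gradient on the unit sphere
    u -= 2e-5 * grad
lam = u @ L @ u / (u @ u)
print(f"learned lambda = {lam:.2f}, exact pi^2 = {np.pi**2:.2f}")
```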
https://read.qxmd.com/read/36944244/echo-enhanced-embodied-visual-navigation
#77
JOURNAL ARTICLE
Yinfeng Yu, Lele Cao, Fuchun Sun, Chao Yang, Huicheng Lai, Wenbing Huang
Visual navigation involves a movable robotic agent striving to reach a point goal (target location) using vision sensory input. While navigation with ideal visibility has seen plenty of success, it becomes challenging in suboptimal visual conditions like poor illumination, where traditional approaches suffer from severe performance degradation. To mitigate this problem, we propose E3VN (echo-enhanced embodied visual navigation), which effectively perceives the surroundings even under poor visibility. This is made possible by adopting an echoer that actively perceives the environment via auditory signals...
March 16, 2023: Neural Computation
https://read.qxmd.com/read/36944241/classification-from-positive-and-biased-negative-data-with-skewed-labeled-posterior-probability
#78
JOURNAL ARTICLE
Shotaro Watanabe, Hidetoshi Matsui
In binary classification, it can happen that only biased data are observed for one of the classes. In this letter, we propose a new method for the positive and biased negative (PbN) classification problem: a weakly supervised learning method that learns a binary classifier from positive data and negative data with biased observations. We incorporate a method to correct the negative influence of skewed confidence, which is represented by the posterior probability that the observed data are positive...
March 16, 2023: Neural Computation
https://read.qxmd.com/read/36944240/reward-maximization-through-discrete-active-inference
#79
JOURNAL ARTICLE
Lancelot Da Costa, Noor Sajid, Thomas Parr, Karl Friston, Ryan Smith
Active inference is a probabilistic framework for modeling the behavior of biological and artificial agents, which derives from the principle of minimizing free energy. In recent years, this framework has been applied successfully to a variety of situations where the goal was to maximize reward, often offering comparable and sometimes superior performance to alternative approaches. In this article, we clarify the connection between reward maximization and active inference by demonstrating how and when active inference agents execute actions that are optimal for maximizing reward...
March 16, 2023: Neural Computation
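The quantity active inference agents minimize when selecting actions is the expected free energy, which splits into risk (KL divergence between predicted and preferred outcomes) and ambiguity (expected entropy of the likelihood). A toy discrete sketch (the generative model and preferences below are made up for illustration):

```python
import numpy as np

def expected_free_energy(q_s, A, log_c):
    """One-step expected free energy of a policy:
    G = KL[q(o) || p(o)] (risk) + E_{q(s)} H[p(o|s)] (ambiguity),
    with A[o, s] = p(o|s), q_s the predicted state distribution,
    and log_c the log of the preferred outcome distribution."""
    q_o = A @ q_s                                   # predicted outcomes
    risk = np.sum(q_o * (np.log(q_o + 1e-16) - log_c))
    H = -np.sum(A * np.log(A + 1e-16), axis=0)      # entropy of p(o|s)
    return risk + H @ q_s

A = np.array([[0.9, 0.1],                # p(o|s): fairly unambiguous mapping
              [0.1, 0.9]])
log_c = np.log(np.array([0.99, 0.01]))   # agent strongly prefers outcome 0

# Two candidate policies, summarized by the states they should reach;
# the policy leading to the preferred outcome has lower G.
for name, q_s in [("to state 0", np.array([0.95, 0.05])),
                  ("to state 1", np.array([0.05, 0.95]))]:
    print(name, expected_free_energy(q_s, A, log_c))
```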
https://read.qxmd.com/read/36944237/strong-allee-effect-synaptic-plasticity-rule-in-an-unsupervised-learning-environment
#80
JOURNAL ARTICLE
Eddy Kwessi
Synaptic plasticity, or the ability of a brain to change one or more of its functions or structures at the synaptic level, has generated and continues to generate substantial interest in the scientific community, especially among neuroscientists. This interest intensified after empirical evidence was collected that challenged the established paradigm that human brain structures and functions are set from childhood, with only modest changes expected beyond. Early synaptic plasticity rules or laws in this regard include the basic Hebbian rule, which proposed a mechanism for the strengthening or weakening of synapses (weights) during learning and memory...
March 16, 2023: Neural Computation
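In population dynamics, a strong Allee effect makes growth negative below a critical threshold. Read as a weight-update rule, this yields pruning of weak synapses and saturation of strong ones. The specific form below, dw/dt = r·w·(w/a - 1)(1 - w/K), is our illustrative assumption, not necessarily the author's rule:

```python
# Strong-Allee-effect growth applied to a synaptic weight w (illustrative
# form, not necessarily the author's rule): weights below the Allee
# threshold a decay to 0 (pruning); weights above it saturate at K.
r, a, K, dt = 1.0, 0.2, 1.0, 0.01

def evolve(w, steps=3000):
    for _ in range(steps):
        w += dt * r * w * (w / a - 1.0) * (1.0 - w / K)
    return w

for w0 in (0.1, 0.19, 0.21, 0.5):
    print(f"w0 = {w0}: w -> {evolve(w0):.3f}")
```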