Nonsynaptic Error Backpropagation in Long-Term Cognitive Networks.
We introduce a neural cognitive mapping technique named long-term cognitive network (LTCN) that is able to memorize long-term dependencies between a sequence of input and output vectors, especially in those scenarios that require predicting the values of multiple dependent variables at the same time. The proposed technique is an extension of a recently proposed method named short-term cognitive network that aims at preserving the expert knowledge encoded in the weight matrix while optimizing the nonlinear mappings provided by the transfer function of each neuron. A nonsynaptic, backpropagation-based learning algorithm powered by stochastic gradient descent is put forward to iteratively optimize four parameters of the generalized sigmoid transfer function associated with each neuron. Numerical simulations over 35 multivariate regression and pattern completion data sets confirm that the proposed LTCN algorithm attains statistically significant performance differences with respect to other well-known state-of-the-art methods.
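The nonsynaptic learning idea described above can be sketched in code: the weight matrix stays frozen (preserving expert knowledge) while stochastic gradient descent tunes only the four parameters of each neuron's generalized sigmoid. The parameterization below, f(x) = q + (v − q)·σ(λ(x − h)), and all names (`generalized_sigmoid`, `nonsynaptic_sgd_step`, the synthetic data) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def generalized_sigmoid(x, q, v, lam, h):
    """Generalized sigmoid with four tunable parameters per neuron:
    lower bound q, upper bound v, slope lam, and offset h (assumed form)."""
    s = 1.0 / (1.0 + np.exp(-lam * (x - h)))
    return q + (v - q) * s, s  # also return s for gradient reuse

def nonsynaptic_sgd_step(X, Y, W, params, lr=0.5):
    """One SGD step on the MSE that updates ONLY the transfer-function
    parameters; the weight matrix W is never modified (nonsynaptic)."""
    q, v, lam, h = params
    raw = X @ W                       # fixed synaptic aggregation
    y_hat, s = generalized_sigmoid(raw, q, v, lam, h)
    err = 2.0 * (y_hat - Y) / Y.size  # d(MSE)/d(y_hat)
    ds = s * (1.0 - s)                # sigmoid derivative
    # Partial derivatives of f with respect to each parameter
    g_q = np.sum(err * (1.0 - s), axis=0)
    g_v = np.sum(err * s, axis=0)
    g_lam = np.sum(err * (v - q) * ds * (raw - h), axis=0)
    g_h = np.sum(err * (v - q) * ds * (-lam), axis=0)
    return (q - lr * g_q, v - lr * g_v, lam - lr * g_lam, h - lr * g_h)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (64, 3))
W = rng.uniform(-1.0, 1.0, (3, 2))   # "expert knowledge", kept frozen
# Synthetic targets with a steeper, shifted sigmoid the learner must match
Y = 1.0 / (1.0 + np.exp(-2.0 * (X @ W - 0.25)))
# One (q, v, lam, h) tuple per output neuron
params = (np.zeros(2), np.ones(2), np.ones(2), np.zeros(2))
for _ in range(300):
    params = nonsynaptic_sgd_step(X, Y, W, params)
```

Because only 4 × (number of neurons) parameters are optimized, each step is cheap relative to full synaptic backpropagation, and the expert-encoded causal relations in W survive training intact.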