Mirror Descent of Hopfield Model.
Neural Computation 2023 July 13
Mirror descent is an elegant optimization technique that leverages a dual space of parametric models to perform gradient descent. While originally developed for convex optimization, it has increasingly been applied in the field of machine learning. In this study, we propose a novel approach for using mirror descent to initialize the parameters of neural networks. Specifically, we demonstrate that by using the Hopfield model as a prototype for neural networks, mirror descent can effectively train the model with significantly improved performance compared to traditional gradient descent methods that rely on random parameter initialization. Our findings highlight the potential of mirror descent as a promising initialization technique for enhancing the optimization of machine learning models.
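The abstract's core idea — taking a gradient step in a dual space defined by a mirror map, then mapping back to the primal space — can be illustrated with a minimal sketch. The example below is not the paper's method; it is the classic entropic mirror descent (exponentiated gradient) update on the probability simplex, chosen as the simplest concrete instance of the dual-space update the abstract describes. The function names and the linear objective are illustrative assumptions.

```python
import numpy as np

def mirror_descent_simplex(grad_f, x0, lr=0.1, steps=200):
    """Entropic mirror descent (exponentiated gradient) on the simplex.

    Illustrative sketch, not the paper's algorithm. With the mirror map
    psi(x) = sum_i x_i * log(x_i), the dual-space gradient step becomes a
    multiplicative update exp(-lr * grad) followed by renormalization,
    which maps the iterate back onto the probability simplex.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-lr * grad_f(x))  # gradient step in the dual space
        x = x / x.sum()                  # mirror back to the primal simplex
    return x

# Toy objective: minimize f(x) = <c, x> over the simplex. The minimizer
# concentrates all mass on the coordinate with the smallest cost c_i.
c = np.array([3.0, 1.0, 2.0])
x = mirror_descent_simplex(lambda x: c, np.ones(3) / 3)
```

Note that the update never leaves the simplex, so no explicit projection is needed — a key practical appeal of mirror descent over plain projected gradient descent.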