Hopfield network


Hopfield Network is a form of recurrent artificial neural network popularized by John Hopfield in 1982. It serves as a content-addressable memory system with binary threshold nodes, meaning it can recover a complete stored memory from a partial or noisy input. The network is fully interconnected: each neuron is connected to every other neuron. This structure allows the network to converge to a stable state, which corresponds to a memory or pattern stored within the network.

Overview

A Hopfield network consists of a single layer of neurons in which each neuron is connected to every other neuron, but not to itself. The connections between the neurons are symmetric: the weight from neuron i to neuron j equals the weight from neuron j to neuron i. This symmetry, together with the absence of self-connections, guarantees that an energy function exists that never increases as the network updates, so the dynamics settle into a stable state. The short check below illustrates the two structural constraints on the weight matrix.
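A minimal NumPy sketch (not part of the original article; the network size n is arbitrary) showing how the two constraints, symmetry and a zero diagonal, look on a weight matrix:

```python
import numpy as np

n = 4                                # number of neurons (illustrative)
W = np.random.randn(n, n)
W = (W + W.T) / 2                    # enforce symmetry: w_ij == w_ji
np.fill_diagonal(W, 0)               # no self-connections: w_ii == 0

assert np.allclose(W, W.T)           # symmetric
assert np.all(np.diag(W) == 0)       # zero diagonal
```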

Functioning

The network operates in two modes: storage and retrieval. During the storage phase, the network learns patterns by adjusting the connection weights based on the input patterns, typically using the Hebbian learning rule, often summarized as "neurons that fire together, wire together." In the retrieval phase, the network recovers a stored pattern from incomplete or noisy input: the neuron states are updated iteratively, usually asynchronously, with each neuron taking the sign of its weighted input, until the network converges to a stable state representing the retrieved memory. Both phases are sketched in the code below.
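A minimal sketch of both phases in NumPy (the function names train and recall, the example pattern, and the step count are illustrative assumptions, not from the article); it stores one bipolar pattern with the Hebbian rule and recovers it from a corrupted copy via asynchronous updates:

```python
import numpy as np

def train(patterns):
    """Hebbian storage: accumulate the outer product p p^T of each
    bipolar (+1/-1) pattern, then zero the diagonal (no self-connections)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / n

def recall(W, state, steps=200, seed=0):
    """Asynchronous retrieval: pick a neuron at random and set it to the
    sign of its weighted input; repeat until (in practice) a fixed point."""
    rng = np.random.default_rng(seed)
    s = state.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

p = np.array([1, -1, 1, -1, 1, -1, 1, -1])   # one stored pattern
W = train(p[None, :])
noisy = p.copy()
noisy[:2] *= -1                              # corrupt two entries
print(recall(W, noisy))                      # converges back to p
```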

Energy Function

The Hopfield network is governed by an energy function that never increases as the network updates its state asynchronously, which guarantees that the network eventually converges to a stable state, a local minimum of the energy. This energy function is the key concept for understanding the dynamics of Hopfield networks and their ability to function as associative memory systems.
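In the standard formulation, for bipolar states $s_i \in \{-1, +1\}$, symmetric weights $w_{ij} = w_{ji}$ with $w_{ii} = 0$, and thresholds $\theta_i$, the energy is

$$E = -\frac{1}{2} \sum_{i \neq j} w_{ij}\, s_i s_j + \sum_i \theta_i s_i$$

Updating a single neuron to the sign of its net input $\sum_j w_{ij} s_j - \theta_i$ can only lower or preserve $E$, which is why asynchronous updates must eventually reach a fixed point.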

Applications

Hopfield networks have been applied in various fields, including pattern recognition, optimization problems, and associative memory. Their ability to recover full patterns from partial or corrupted inputs makes them particularly useful in tasks that require robustness to noise and incomplete data.

Limitations

Despite their advantages, Hopfield networks have limitations. With Hebbian storage of random patterns, they can reliably store only about 0.14 times the number of neurons (the classical capacity estimate is approximately 0.138N) before retrieval performance degrades. Additionally, they can converge to spurious states: stable states that do not correspond to any of the patterns stored during the training phase. The sketch below illustrates how recall degrades as more patterns are stored.
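A small standalone experiment (the network size, pattern counts, and number of flipped bits are arbitrary choices, not from the article) that stores increasing numbers of random patterns in a 100-neuron network and measures how often a corrupted pattern is recovered exactly; recall typically drops sharply once the load passes roughly 0.14 patterns per neuron:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100                                    # neurons

def recall_ok(W, p, flips=5, steps=2000):
    """Corrupt `flips` entries of p, run asynchronous updates,
    and test whether the original pattern is recovered exactly."""
    s = p.copy()
    s[rng.choice(n, flips, replace=False)] *= -1
    for _ in range(steps):
        i = rng.integers(n)
        s[i] = 1 if W[i] @ s >= 0 else -1
    return np.array_equal(s, p)

for m in (5, 10, 14, 20, 30):              # number of stored patterns
    P = rng.choice([-1, 1], size=(m, n))   # random bipolar patterns
    W = P.T @ P / n                        # Hebbian weight matrix
    np.fill_diagonal(W, 0)                 # no self-connections
    rate = np.mean([recall_ok(W, p) for p in P])
    print(f"{m} patterns ({m/n:.2f} per neuron): recall rate {rate:.2f}")
```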

Recent Developments

With the advent of deep learning and more complex neural network architectures, the use of classical Hopfield networks has declined. However, they remain an area of interest for researchers exploring the fundamentals of neural computation and associative memory. Recent developments have extended the original model to continuous states and deeper architectures (so-called modern Hopfield networks), broadening the scope of their applications.

