Artificial neural networks learn better when they spend time not learning at all: Research – ET HealthWorld


Washington (US): Researchers discuss how mimicking the sleep patterns of the human brain in artificial neural networks may help reduce the risk of catastrophic forgetting, boosting their utility across a spectrum of research interests.

“When we sleep, the brain is very busy, repeating what we learned during the day,” said Maxim Bazhenov, PhD, professor of medicine and sleep researcher at the University of California San Diego School of Medicine. “Sleep helps reorganize memories and presents them in the most effective way.”

In previously published work, Bazhenov and colleagues noted how sleep builds rational memory, the ability to remember arbitrary or implicit associations between things, people, or events, and protects against forgetting old memories.

Artificial neural networks leverage the architecture of the human brain to improve numerous technologies and systems, from basic science and medicine to finance and social media. In some ways, they have achieved superhuman performance, such as computational speed, but they fail in one key aspect: when artificial neural networks learn sequentially, new information overwrites previous information, a phenomenon called catastrophic forgetting.
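Catastrophic forgetting is easy to reproduce in miniature. The sketch below is a purely illustrative toy, not the authors' model: a single-weight "network" is trained by gradient descent on one task and then, sequentially, on a second task, and the first task is overwritten.

```python
# Toy demonstration of catastrophic forgetting with a single-weight model.
# Training on task B with no access to task A's data erases task A.

def train(w, target, lr=0.1, steps=200):
    """Gradient descent on the loss (w - target)^2."""
    for _ in range(steps):
        w -= lr * 2 * (w - target)  # d/dw (w - target)^2
    return w

w = 0.0
w = train(w, target=1.0)            # learn task A
error_a_before = abs(w - 1.0)       # near zero: task A is learned

w = train(w, target=-1.0)           # then learn task B sequentially
error_a_after = abs(w - 1.0)        # large: task A has been overwritten

print(error_a_before, error_a_after)
```

The weight converges to whichever target it saw last; nothing in plain gradient descent protects the earlier solution.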

“In contrast, the human brain is constantly learning and incorporating new data into existing knowledge,” Bazhenov said, “and it usually learns best when new training is combined with periods of sleep for memory consolidation.”

Writing in the November 18, 2022 issue of PLOS Computational Biology, senior author Bazhenov and colleagues discuss how biological models can help reduce the risk of catastrophic forgetting in artificial neural networks, boosting their utility across a spectrum of research interests.

The scientists used a spiking neural network that artificially mimics the natural neural system: instead of information being transmitted continuously, it is transmitted as discrete events (spikes) at specific time points.
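The discrete-event behavior described above can be sketched with a simple leaky integrate-and-fire neuron, a standard textbook model (the parameter values here are illustrative assumptions, not taken from the paper): input is integrated into a membrane potential that decays over time, and when the potential crosses a threshold, the neuron emits a spike at that time point and resets.

```python
# Minimal leaky integrate-and-fire neuron: continuous input drive is
# converted into discrete spike events at specific time points.

def lif_neuron(inputs, decay=0.9, threshold=1.0):
    """Return the time steps at which the neuron spikes."""
    v = 0.0
    spike_times = []
    for t, i in enumerate(inputs):
        v = decay * v + i          # leaky integration of the input
        if v >= threshold:         # threshold crossing -> discrete spike
            spike_times.append(t)
            v = 0.0                # reset after the spike
    return spike_times

# A constant weak drive produces sparse, discrete spikes rather than a
# continuous output signal.
print(lif_neuron([0.3] * 20))      # -> [3, 7, 11, 15, 19]
```

This is the key contrast with conventional artificial neurons, whose outputs are continuous values at every step.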


They found that when spiking networks were trained on a new task, but with occasional off-line periods that mimicked sleep, catastrophic forgetting was reduced. Like the human brain, the study authors said, “sleep” for networks allows them to replay old memories without explicitly using old training data.
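One simple analogue of this replay idea can be shown in code. The sketch below uses pseudo-rehearsal on a two-weight linear model, which is an illustrative stand-in for, not a reproduction of, the paper's spiking mechanism: during "sleep," the network replays a task A input against its own stored response, with no access to task A's true labels.

```python
# Toy comparison: sequential training vs. training interleaved with
# "sleep" replay of the network's own earlier response (pseudo-rehearsal).

def predict(w, x):
    return w[0] * x[0] + w[1] * x[1]

def step(w, x, target, lr=0.1):
    err = predict(w, x) - target
    return [w[0] - lr * 2 * err * x[0], w[1] - lr * 2 * err * x[1]]

def train(w, x, target, steps=300):
    for _ in range(steps):
        w = step(w, x, target)
    return w

x_a, y_a = [1.0, 0.0], 1.0   # task A
x_b, y_b = [1.0, 1.0], 0.0   # task B (shares a weight with task A)

# Plain sequential training: task B overwrites task A.
w = train([0.0, 0.0], x_a, y_a)
w = train(w, x_b, y_b)
forgot = abs(predict(w, x_a) - y_a)

# With "sleep": replay task A's input using the network's own stored
# response as the target, interleaved with task B training.
w = train([0.0, 0.0], x_a, y_a)
replay_target = predict(w, x_a)       # the memory the network replays
for _ in range(300):
    w = step(w, x_b, y_b)
    w = step(w, x_a, replay_target)   # offline replay step
kept = abs(predict(w, x_a) - y_a)

print(forgot, kept)                   # error on task A: large vs. small
```

The replayed target comes from the trained network itself, mirroring the study's point that "sleep" lets a network rehearse old memories without explicitly reusing the old training data.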

Memories are represented in the human brain by patterns of synaptic weight—the strength or amplitude of the connection between two neurons.

Bazhenov said, “When we learn new information, neurons fire in a specific sequence and this strengthens the synapses between them. During sleep, the spiking patterns learned during our waking state are spontaneously repeated. This is called reactivation or replay.

Synaptic plasticity, the capacity to be changed or molded, remains in place during sleep and can further enhance the synaptic weight patterns that represent memory, helping to prevent forgetting or enabling the transfer of knowledge from old to new tasks.”
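The strengthening Bazhenov describes is captured by the classic Hebbian rule ("cells that fire together wire together"): a synaptic weight grows in proportion to the joint activity of its pre- and postsynaptic neurons. The learning rate and activity values below are illustrative assumptions.

```python
# Hebbian plasticity sketch: repeated replay of the same activity
# pattern, as during sleep, strengthens the synapse that encodes it.

def hebbian_update(w, pre, post, lr=0.05):
    """Strengthen the weight in proportion to joint pre/post activity."""
    return w + lr * pre * post

w = 0.1
for _ in range(10):                       # ten replays of the pattern
    w = hebbian_update(w, pre=1.0, post=1.0)
print(round(w, 2))                        # -> 0.6
```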

When Bazhenov and colleagues applied this approach to artificial neural networks, they found that it helped the networks avoid catastrophic forgetting.

“That means these networks can learn continuously, just like humans or animals. Understanding how the human brain processes information during sleep could help improve memory in human subjects; augmenting sleep rhythms could lead to better memory.”

“In other projects, we use computer models to develop optimal strategies for applying stimulation during sleep, such as auditory tones, that enhance sleep rhythms and improve learning. This may be particularly important when memory is non-optimal, such as when memory declines with aging or in conditions such as Alzheimer’s disease.”





