Thursday, January 21, 2016

Brain's Memory Capacity is 10 Times Greater Than Previously Thought

Scientists have discovered that the brain’s capacity for memory storage is far greater than previously hypothesized. How much greater? The Salk Institute research would increase conservative estimates by a factor of 10, to at least a petabyte, putting the brain’s capacity on par with the World Wide Web.

The researchers recently published their work in eLife.
 
Memories and thoughts result from electrical and chemical activity in the brain. Information flows between neurons across junctions called synapses, carried by chemical messengers known as neurotransmitters.

Synapse dysfunction can lead to a myriad of neurological disorders. Synapses also exhibit varying levels of plasticity, which dictates how much influence one neuron has over a neuron it’s connected to. According to the researchers, a signal traveling from one neuron to another activates the second neuron only 10 to 20% of the time.
 
“The number of different strengths can be measured in bits,” write the researchers. “The total storage capacity of the brain therefore depends on both the number of synapses and the number of distinguishable synaptic strengths.”
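As a rough illustration of that quote (my own sketch, not from the study): a synapse that can take one of N reliably distinguishable strengths can encode log2(N) bits, so capacity grows with both the synapse count and the number of resolvable strength levels. A minimal Python example:

import math

# Minimal sketch: a synapse with N reliably distinguishable strength
# levels can encode log2(N) bits of information.
for levels in (2, 4, 16, 26):
    print(f"{levels} distinguishable strengths -> {math.log2(levels):.2f} bits")

# 2 strengths  -> 1.00 bits (a simple on/off synapse)
# 26 strengths -> 4.70 bits (the figure the study reports, below)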
 
The research team built a 3-D reconstruction of rat hippocampal tissue. They noticed that in about 10% of cases, a single axon (a neuron’s output) formed two synapses with the same second neuron. They discovered that these synapses could vary in size by increments as small as 8%, which led the team to determine that there are about 26 distinct categories of synapse size.
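The counting behind those 26 categories can be sketched roughly. Two assumptions below are not stated in this article: the roughly 60-fold range between the smallest and largest synapses (a figure from the Salk study) and the spacing of about two 8% increments per distinguishable category (a simplified, signal-detection-style rule of thumb):

import math

# Rough, hedged reconstruction of the counting argument.
size_range_ratio = 60.0        # assumed largest/smallest synapse size ratio
resolution = 0.08              # ~8% detectable size increment
step = (1 + resolution) ** 2   # assume ~two increments separate categories

categories = math.log(size_range_ratio) / math.log(step)
print(f"distinguishable size categories: about {categories:.0f}")  # ~27, near the reported 26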
 
“This number translates into a storage capacity of roughly 4.7 bits of information per synapse,” the researchers write. “This estimate is markedly higher than previous suggestions. It implies that the total memory capacity of the brain—with its many trillions of synapses—may have been underestimated by an order of magnitude.”
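To see where the petabyte-scale figure could come from, here is a back-of-the-envelope check. The synapse count is an assumption: estimates for the human brain range from roughly 10^14 to 10^15 synapses, and the upper end is used here:

import math

# Back-of-the-envelope check of the headline numbers (my own arithmetic,
# not the study's). Synapse count is an assumed upper-range estimate.
bits_per_synapse = math.log2(26)   # ~4.70 bits, as reported
synapse_count = 1e15               # assumed total synapses in a human brain
total_bytes = bits_per_synapse * synapse_count / 8
print(f"about {total_bytes / 1e15:.1f} petabytes")  # ~0.6 PB, i.e. on the order of a petabyte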
 
The research also sheds light on how the human brain, which runs on only about 20 watts of continuous power, could influence computer design, particularly for computers “that employ ‘deep learning’ and artificial neural nets—techniques capable of sophisticated learning and analysis, such as speech, object recognition, and translation,” according to the Salk Institute.