Reckon you could memorise the entire Internet?

Silly question, but the answer, amazingly, is … er, potentially, yes! I’m not saying anyone actually could, or even would want to, but our understanding of our memory potential just got a major shot in the arm. Researchers have calculated that the brain’s capacity is somewhere in the region of 1 petabyte (which is equal to a quadrillion bytes, but I guess you knew that already, right?).
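For the record, that petabyte figure unpacks like this (a trivial sketch, using the decimal SI prefix, where 1 PB = 10^15 bytes):

```python
# The article's "1 petabyte = a quadrillion bytes" claim, spelled out
# using the decimal SI prefix (1 PB = 10^15 bytes).
petabyte_in_bytes = 10 ** 15
print(f"{petabyte_in_bytes:,} bytes")  # 1,000,000,000,000,000 — one quadrillion
```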

According to recent research, it seems the human brain may well be capable of storing as much information as is currently stored on the entire Internet (an amount that is truly unimaginable)! It has the capacity to store so much information that Terry Sejnowski, of the Salk Institute in La Jolla, California, said: “Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10.”

The last word in efficiency!

What’s more, the brain, that soft, squidgy little thing in your head, can do it all while using just enough power to run a very dim light bulb. Contrast this with a computer of the same capacity; it would need 1 gigawatt of power, or roughly the output of a nuclear power station! And you thought your brain wasn’t up to speed! Hah!

The research team worked on a tiny portion of a rat’s brain (a piece of the hippocampus, actually – see the glossary for details of what that actually is, if you’re unsure). They spent an entire year studying this very, very small piece of tissue microscopically, tracing the connections of each and every brain cell. A monumental task, even on such a tiny piece of brain tissue. To give you an idea how small this sample was, if you put 20 of them side by side they would be about the width of a human hair! Incidentally, all mammals have brains that are biologically quite similar, so using a portion of a rat’s brain wasn’t just a shot in the dark, it was about as close as you can get to a human brain (although it’s unclear why they didn’t just use a sample from a post-mortem human brain).

Brain cells, also called neurons, share information via chemical neurotransmitters.

They weren’t just counting the cells either, which would have been quite a job in itself; they were studying their composition. Let me explain a bit about neurons (brain cells): they look a bit like misshapen balloons, with long tendrils snaking out of them (they’re called dendrites and axons). The axons send out neurotransmitters, in the form of a cocktail of chemical molecules, and tiny spines on the dendrites receive the neurotransmitters coming from other neurons. Information, in the form of those neurotransmitters, crosses a gap between neurons called a synapse. The receiving neuron can then, in turn, fire out its own message to others, though often it doesn’t do anything in response.

Recalling strengthens the pathways between neurons

When you remember something the neurons retread the path that was forged when the memory was formed, strengthening the neural network. It seems the more often a brain cell gets used, the higher the odds are that it will transmit information to another cell. And the point of contact (the synapse) actually gets bigger. The biggest of them is about sixty times the size of the smallest! Which gives you some idea of how valuable it is to repeat things in your head that you’ve memorised – by doing so, you’re actually changing the physical composition of your brain and making the connections between brain cells larger as you do so.

A co-author of the study, Tom Bartol, explains that one brain cell chattering to another across a bigger synapse has, in effect, a louder voice than one communicating across a smaller synapse. But until now researchers hadn’t really understood how many sizes of synapses there were, or how much they changed in response to information received.

Then Bartol, Sejnowski and their colleagues noticed something odd: sometimes (in about 1 case in 10) a single axon would connect with another brain cell at two different points on a dendrite, not just one. It was sending the same information to both, presumably, but across synapses of different sizes. The implication was that the information received was different at each point.

The researchers worked out that, with all the various sizes of neurons and synapses, a far greater volume of information could be generated and stored by brain cells than was previously thought. It was a far more complex setup than the binary system computers use, which can only be either ‘on’ or ‘off’ (i.e. a zero or a one). Their calculations led them to assert that the brain’s capacity is probably ten times greater than previously believed.
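To see why more synapse sizes mean more stored information, here’s a quick sketch: a storage unit that can sit in any of n distinguishable states holds log2(n) bits. The article doesn’t give the exact count of distinguishable synapse sizes, so the figure of 26 below is purely illustrative (it happens to yield roughly 4.7 bits per synapse):

```python
import math

def bits_per_unit(n_states: int) -> float:
    # A unit with n distinguishable states can store log2(n) bits of information.
    return math.log2(n_states)

print(bits_per_unit(2))   # a binary on/off switch: 1.0 bit
print(bits_per_unit(26))  # illustrative: ~26 synapse sizes -> ~4.7 bits each
```

The point of the comparison: a synapse that can take many graded sizes carries several times the information of a simple on/off bit, which is where the “ten times greater” revision comes from.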

Efficiency is the reason for low power consumption

One of the researchers’ findings is that most neurons don’t respond to other neurons’ messages (presumably the transmitting neuron just gets a ‘busy’ signal!). This is the brain acting in a highly efficient way. Most of its component parts, most of the time, are idle and unresponsive!

Even so, if the average brain cell spends most of its time just sitting back and relaxing, that still doesn’t fully explain why a computer of the same immense capacity would require fifty million times more power to do the same work.
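That “fifty million times” figure checks out against the numbers in the piece, if we take the brain’s draw to be about 20 watts. The article only says “a very dim light bulb”, so the 20 W is an assumption here (it’s the commonly quoted estimate):

```python
computer_power_watts = 1e9  # 1 gigawatt, the figure quoted for an equivalent computer
brain_power_watts = 20      # ~20 W assumed; the article says only "a very dim light bulb"

ratio = computer_power_watts / brain_power_watts
print(f"{ratio:,.0f} times")  # 50,000,000 times
```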

Nuclear power plant. No neurons here, just a huge amount of heat and steam.

If your brain ran like a computer, this would need to be the power supply!

Bartol explains that it might have something to do with the fact that a computer uses electrons flowing in a wire, and as is well known, electric current flowing through a wire can produce a lot of heat (that’s the noise you sometimes notice from your computer … it’s the fan, keeping the CPU cool enough not to be damaged beyond repair by the heat of its own frantic processing). As Bartol says, the biochemical pathways in the brain are presumably much more efficient than anything currently in use in computers.

Don’t underplay your brain’s power!

We all tend to think our memory isn’t very good, and I’ve said before on this site, we really need to be more positive in our judgement of it. It seems the more we learn about the brain and the memory the more astonishing it really is. We carry around with us, in our heads, a biological miracle. We are capable of amazing feats of memory, but because we’re born with it and because we use it all the time, we do tend to get blasé about the brain. If it doesn’t seem to be performing particularly well, it’s probably because we aren’t making use of the memory techniques available to us.

Consider some of the outstanding feats of memory you can read about on this site, by such individuals as Harry Lorayne, Dominic O’Brien, Tony Buzan and others. They aren’t aliens from another, more advanced civilisation. They’re just like you and me, but they’re using memory techniques they’ve learned and practised. We need to keep in mind that what they can do, we can do too. I’m not saying it’s easy, but it’s possible. And even if we could only perform a tenth as well as the star memory performers, can you imagine how much of an improvement that would be?




