Intel Fellow Mark Bohr told attendees at an engineering conference in San Francisco this morning to look to biological systems to figure out how to engineer the future of computing.

Bohr said that the human brain is a far more efficient computing machine because it operates on very little power (barely perceptible electrical signals), computes in a massively parallel fashion, and integrates far more senses than computers currently do: sound, sight, touch, smell, and taste. He made the comments at the annual International Solid-State Circuits Conference.

Picking up where Jeff Hawkins, Palm co-founder and brain researcher, left off with a talk a year ago, Bohr said microprocessors should follow an evolutionary path like the brain's, which grew from its reptilian origins into its human form. “We’re trying to emulate nature,” he said.

He noted that brain cells are about 50 microns in diameter, with dendrites taking in data from other neurons and axons sending out the processed result. The cells operate on 10 to 20 millivolts, far less than the voltages in today’s typical computer circuits. And although brain circuits fire at roughly 100 hertz, far slower than a 3.2-gigahertz microprocessor, they are massively parallel: lots of simple processors work on the same problem simultaneously, whereas a computer tackles just a few tasks at a time.

The brain gets by on about 20 watts of power, while a computer system needs 40, Bohr said. There are roughly 100 billion (10^11) neurons and 100 trillion (10^14) synapses in the brain, compared with “only” about 100 million (10^8) transistors and 100 billion (10^11) connections on a chip.
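
Taken at face value, those figures make the gap easy to quantify. Here is a rough back-of-envelope sketch in Python, using only the numbers Bohr cited; the “elements per watt” framing at the end is my own shorthand for the efficiency gap, not something from the talk:

```python
# Back-of-envelope ratios using the figures cited in Bohr's talk.
NEURONS = 1e11        # neurons in the brain
SYNAPSES = 1e14       # synapses in the brain
TRANSISTORS = 1e8     # transistors on a chip (order of magnitude cited)
CONNECTIONS = 1e11    # on-chip connections
BRAIN_WATTS = 20      # power budget of the brain
SYSTEM_WATTS = 40     # power figure cited for a computer system

print(f"processing elements: brain/chip = {NEURONS / TRANSISTORS:,.0f}x")
print(f"connections:         brain/chip = {SYNAPSES / CONNECTIONS:,.0f}x")
print(f"power:               brain uses {BRAIN_WATTS / SYSTEM_WATTS:.0%} of the system's budget")

# Elements per watt -- one crude way to frame the efficiency gap.
print(f"elements per watt:   brain {NEURONS / BRAIN_WATTS:.0e}, chip {TRANSISTORS / SYSTEM_WATTS:.0e}")
```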

All of this suggests, Bohr said, that we have a long way to go before we hit the peak of microprocessor evolution. At the moment, Intel has a new 45-nanometer Nehalem microprocessor with 731 million transistors on it. With every new generation of manufacturing technology, it can fit twice as many transistors on a chip of the same size. This year, it will move to 32-nm chips for the first time.

Then will come 22-nm chips, and then 16-nm. By the time Intel reaches 16-nm, it will need a new way to print circuit patterns onto silicon substrates, and it plans to move to extreme ultraviolet lithography, which uses light at wavelengths approaching the X-ray band. At that generation, it will also make much heavier use of 3-D chip stacking, optical interconnects, and sensors integrated directly into chips. Back in 1971, Intel could put about 2,000 transistors on a chip. By the time it gets to 16-nm in the coming years, the number will be in the billions. That should be enough to pack the power of today’s supercomputers into a package the size of an iPhone. It’ll be fun when we get there.
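
For what it’s worth, the doubling arithmetic is easy to sketch. A small Python example of the trajectory, assuming one full doubling of transistor count at each new node (a simplification for illustration; the node list comes from this article, not an official roadmap):

```python
import math

# Illustrative sketch of the per-generation doubling Bohr describes.
# Assumes one full doubling of transistor count at each new node.
NEHALEM_TRANSISTORS = 731_000_000   # 45-nm Nehalem, as cited
nodes_nm = [45, 32, 22, 16]         # node labels mentioned in the article

count = NEHALEM_TRANSISTORS
for node in nodes_nm:
    print(f"{node:>2}-nm generation: ~{count / 1e9:.1f} billion transistors")
    count *= 2

# Sanity check against the 1971 figure: about 2,000 transistors then,
# 731 million on Nehalem now, is roughly 18 doublings in between.
doublings = math.log2(NEHALEM_TRANSISTORS / 2_000)
print(f"2,000 (1971) -> 731 million (Nehalem) is about {doublings:.0f} doublings")
```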

Of course, Jeff Hawkins had a very different idea of how to build a computer modeled on the human brain, one that goes well beyond the “hardware” considerations Bohr is talking about. It’s one thing to hear a brainiac like Hawkins talk about these things. It’s another to hear the words from a manufacturing-process chief at the world’s biggest chip maker.