Lehrer wrote a different article, which I just found, that explains yet another reason the project will probably fail.
“We have already shown that the model can scale up,” Markram says. “What is holding us back now are the computers.” The numbers speak for themselves. Markram estimates that in order to accurately simulate the trillion synapses in the human brain, you’d need to be able to process about 500 petabytes of data (peta being a million billion, or 10 to the fifteenth power). That’s about 200 times more information than is stored on all of Google’s servers. (Given current technology, a machine capable of such power would be the size of several football fields.) Energy consumption is another huge problem. The human brain requires about 25 watts of electricity to operate. Markram estimates that simulating the brain on a supercomputer with existing microchips would generate an annual electrical bill of about $3 billion. But if computing speeds continue to develop at their current exponential pace, and energy efficiency improves, Markram believes that he’ll be able to model a complete human brain on a single machine in ten years or less.
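To get a sense of how those numbers hang together, here is a minimal back-of-envelope sketch in Python. Only the 500 petabytes, the trillion-synapse figure, the $3 billion bill, and the brain's 25 watts come from the passage above; the electricity price and the per-synapse division are my own assumptions for illustration.

```python
# Back-of-envelope check of the quoted figures. Inputs marked as
# assumptions are mine, not figures from Markram or Lehrer.

PETA = 1e15  # "peta" = 10^15

# Storage: what does 500 PB imply per synapse, taking the article's
# figure of a trillion (10^12) synapses? (Many estimates put the count
# closer to 100 trillion, which would shrink this by ~100x.)
synapses = 1e12
total_bytes = 500 * PETA
print(f"{total_bytes / synapses:,.0f} bytes of state per synapse")  # 500,000

# Energy: what continuous power draw would a $3B annual bill imply?
annual_bill_usd = 3e9
usd_per_kwh = 0.10  # assumed electricity price, USD per kWh
kwh_per_year = annual_bill_usd / usd_per_kwh      # 3e10 kWh per year
avg_watts = kwh_per_year * 1000 / (365 * 24)      # kWh/year -> watts
print(f"~{avg_watts / 1e9:.1f} GW continuous")    # ~3.4 GW
print(f"~{avg_watts / 25:.0e}x the brain's 25 W") # ~1e+08x
```

The roughly eight-orders-of-magnitude gap between the brain's 25 watts and the multi-gigawatt draw implied by that bill is the core of the efficiency problem.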
For now, however, the mind is still the ideal machine. Those intimidating black boxes from IBM in the basement are barely sufficient to model a thin slice of rat brain. The nervous system of an invertebrate exceeds the capabilities of the fastest supercomputer in the world. “If you’re interested in computing,” Schürmann says, “then I don’t see how you can’t be interested in the brain. We have so much to learn from natural selection. It’s really the ultimate engineer.”