


When the balloon of the New Economy burst, it left many millionaires behind, and many broke as well. Stock markets collapsed back to normal proportions. Investors became very careful, bankers no longer lent money to anyone with merely a good idea, and the economy went into a slump: a recession. In the aftermath, underdeveloped countries were left with huge debts they could no longer cope with.

The new (internet) economy came to an abrupt halt, and its developments were no longer the news of the day. It took the economy a few years to pick up speed again, mainly because industrialists and investors had become wary of investing in anything. This over-caution, together with the structure of international public debt, is the main reason the world took so long to recover from this recession.


Our brains are perhaps 10,000 times faster than most contemporary computers. We have about 10 billion neurons, each with roughly 10,000 synaptic connections to other neurons. A frequent guess at the computational power per synapse is on the order of 100 multiplications per second.
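The figures above imply a rough total for the brain's raw computing power. A minimal back-of-the-envelope sketch, using only the numbers quoted in the text:

```python
# Rough estimate of the brain's raw computational power, using the
# figures quoted above (estimates, not measurements):
neurons = 10**10             # ~10 billion neurons
synapses_per_neuron = 10**4  # ~10,000 synaptic connections each
ops_per_synapse = 100        # ~100 multiplications per second per synapse

brain_ops_per_second = neurons * synapses_per_neuron * ops_per_synapse
print(f"{brain_ops_per_second:.0e} operations per second")  # ~1e16 ops/s
```

This yields about 10^16 operations per second, the figure the Moore's-law extrapolations below are usually aimed at.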

On the other hand, computers get roughly 1000 times faster per unit of cost each decade. This statement is a generally accepted, slightly revised variant of Moore's law, first formulated in 1965.

Over the past three decades, many people have extrapolated Moore's law to predict the date when machines will match brains. The most frequent estimate is 2020, plus or minus a few years (underestimating the number of synapses by a factor of 1000 would cost a decade or so). Such an educated guess is what motivated Schmidhuber to study computer science.
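The extrapolation can be reproduced in a few lines. The baseline below (an affordable machine doing about 10^9 operations per second around the year 2000) is an illustrative assumption, not a figure from the text; the brain estimate of 10^16 operations per second comes from the numbers quoted earlier:

```python
import math

# Moore's-law extrapolation: when does an affordable machine match
# the brain's estimated raw computing power?
brain_ops = 1e16          # brain estimate derived from the figures above
machine_ops_2000 = 1e9    # assumed affordable machine, ca. 2000 (illustration)
growth_per_decade = 1000  # ~1000x more speed per cost each decade

decades = math.log10(brain_ops / machine_ops_2000) / math.log10(growth_per_decade)
crossover_year = 2000 + 10 * decades
print(round(crossover_year))  # ~2023, in the ballpark of the 2020 estimate
```

The answer lands within a few years of 2020, consistent with the most frequent estimate mentioned above.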

Calculating machines were introduced by Wilhelm Schickard (1623), Pascal (1640), and Wilhelm Leibniz (1670). The first working general-purpose computer was completed by Konrad Zuse in 1941. In the year 2041, only 100 years later, their fastest descendants will presumably outperform brains by a factor of a million, at least in terms of raw computing power. Compare Schmidhuber's law.

Where are the limits? How much can you compute with the "ultimate laptop" (S. Lloyd, Nature 406, 1047-1054, 2000) of 1 kg of mass and 1 liter of volume? Answer: no more than 10^51 operations per second on no more than 10^32 bits (compare H. J. Bremermann: Minimum energy requirements of information transfer and computing, International Journal of Theoretical Physics, 21, 203-217, 1982). The massively parallel laptop's temperature would be roughly 10^9 kelvin. If we compress it until it approaches its Schwarzschild radius (where it becomes a black hole), it still cannot perform more than 10^51 operations per second. But now it may work in a serial fashion, as the communication time around the black hole's horizon equals the time to flip a bit. Two additional centuries of Moore's law seem necessary to reach the Bremermann limit.
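The "two additional centuries" figure follows from the same growth rate. A sketch under the assumption that brain-level hardware (~10^16 operations per second) is reached around 2020, as in the extrapolation above:

```python
import math

# How many more decades of 1000x-per-decade growth until the
# Bremermann/Lloyd limit of ~1e51 ops/s for a 1 kg laptop?
limit_ops = 1e51          # ultimate-laptop bound (Lloyd 2000)
start_ops = 1e16          # assumed brain-level hardware, ca. 2020
start_year = 2020
growth_per_decade = 1000

decades = math.log10(limit_ops / start_ops) / math.log10(growth_per_decade)
limit_year = start_year + 10 * decades
print(round(limit_year))  # roughly two more centuries, ca. 2140
```

Closing the remaining 35 orders of magnitude at 3 per decade takes about 12 decades, which is how the text arrives at "two additional centuries" as a rough figure.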

Long before that, in 2141, roughly half a millennium after Schickard, and 200 years after Zuse, there should be affordable hardware with a million times the raw computational power of all current human brains combined.

This text is republished with the kind permission of Juergen Schmidhuber (copyright 2001).
url: www.idsia.ch/~juergen/raw.html
Links to other parts of Juergen's site can be found on the original page. In this text, URLs have been replaced by THoCP's own to avoid disrupting our structure.


Last Updated on June 20, 2006
