In 2016, I asked on ai.stackexchange: [...] can $1000.00 buy enough operations per second to be approximately equal to the computational power of a human brain? This led to some interesting follow-ups about the Landauer limit, the relationship between information and energy, and quantum computers. The full thread went as follows:
DJG:
In his 1999 book "The Age of Spiritual Machines", Ray Kurzweil predicted that in 2009, a $1000 computing device would be able to perform a trillion operations per second. Additionally, he claimed that in 2019, a $1000 computing device would be approximately equal to the computational ability of the human brain (due to Moore's Law and exponential growth).
Did Kurzweil's first prediction come true? Are we on pace for his second prediction to come true? If not, how many years off are we?
BlindKungFuMaster:
"The development of CPUs didn't quite keep up with Kurzweil's predictions. But if you also allow for GPUs, his prediction for 2009 is pretty accurate.
I think Moore's law slowed down recently and has now been pretty much abandoned by the industry. How much that will affect the 2019 prediction remains to be seen. Maybe the industry will hit its stride again with non-silicon based chips, maybe not.
And of course whether hitting Kurzweil's estimate of the computing power of the human brain will make an appreciable difference for the development of [Artificial General Intelligence] is another question altogether."
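As a rough sanity check on the 2009 prediction, here is a back-of-envelope sketch in Python using one illustrative 2009-era GPU, the AMD Radeon HD 5870 (roughly 2.72 single-precision TFLOPS at about a $379 launch price). The specific figures, and the choice to equate Kurzweil's "operations" with peak FLOPS, are assumptions for illustration, not a rigorous benchmark:

```python
# Back-of-envelope check of Kurzweil's 2009 prediction:
# can ~$1000 buy ~1 trillion operations per second?

PREDICTED_OPS_PER_SEC = 1e12  # Kurzweil's 2009 target for a $1000 device

# Illustrative 2009-era GPU (assumed approximate launch specs):
# AMD Radeon HD 5870, ~2.72 TFLOPS single precision, ~$379.
gpu_flops = 2.72e12
gpu_price_usd = 379.0

flops_per_dollar = gpu_flops / gpu_price_usd
budget_flops = flops_per_dollar * 1000.0  # what ~$1000 of GPUs would buy

print(f"FLOPS per dollar: {flops_per_dollar:.2e}")
print(f"~$1000 buys:      {budget_flops:.2e} FLOPS")
print(f"Prediction met?   {budget_flops >= PREDICTED_OPS_PER_SEC}")
```

By this loose measure, $1000 of late-2009 GPU hardware buys several trillion single-precision operations per second, which is why the answer above credits GPUs with rescuing the prediction.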
Ankur:
1) Yes, we do have computing systems that fall into the teraFLOPS range.
2) The human brain is a biological system, and saying it has some sort of FLOPS ability is just plain dumb because there is no way to take a human brain and measure its FLOPS. You could say "hey, by looking at the neurons' activity using fMRI we can reach some sort of approximation," but comparing the result of this approach with the way FLOPS are measured in computers would be comparing apples with oranges, which again is dumb.
DJG: Why don't we measure it in energy consumed instead, with some sort of efficiency factor that denotes how much of the heat is generated by useful computation (as opposed to supportive biological processes)?
Ankur: Heat is just another factor to optimise. You want to maximise FLOPS and minimise heat generation (i.e., the energy consumption of the system). People in high-performance computing generally focus first on maximising FLOPS, since they want their algorithms to run fast, and worry about heat later depending on the requirements.
Peteris: It's not a currently useful measure, because the "heat being generated by useful computation" (e.g. the Landauer limit) is many orders of magnitude smaller than the waste heat of even the most efficient computing devices we can build. Both modern electronic computers and biological neurons, despite their enormous efficiency differences, still effectively emit 100% waste heat, spending millions of times more power than is theoretically necessary for the computation.
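Peteris's point is straightforward to quantify. The Landauer limit says erasing one bit of information at temperature T dissipates at least k_B·T·ln 2 of energy. A minimal sketch, using assumed round numbers for a modern accelerator (~300 W for ~10 TFLOPS, not any specific product), shows the size of the gap:

```python
import math

# Landauer limit: minimum energy to erase one bit at temperature T
# is k_B * T * ln(2).
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

landauer_j_per_bit = k_B * T * math.log(2)  # ~2.9e-21 J

# Illustrative modern accelerator (assumed round numbers,
# not a specific product): ~300 W delivering ~10 TFLOPS.
watts = 300.0
flops = 10e12
joules_per_flop = watts / flops  # ~3e-11 J per operation

# Treating one floating-point operation as a single bit erasure is a
# very generous lower bound; real operations erase many bits.
ratio = joules_per_flop / landauer_j_per_bit
print(f"Landauer minimum: {landauer_j_per_bit:.2e} J/bit")
print(f"Energy per FLOP:  {joules_per_flop:.2e} J")
print(f"Gap:              {ratio:.1e}x the theoretical minimum")
```

Even granting the generous assumption that one operation erases only a single bit, such hardware dissipates roughly ten orders of magnitude more than the Landauer minimum, consistent with the "effectively 100% waste heat" claim.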
DJG: Perhaps this gap indicates a flaw in, or an incompleteness of, the theory.
Ankur: Given a list of 100 numbers, who would be faster at computing their sum, a computer or a human? This simple example shows that what the human brain does and what computers do are completely different things. We built computers to perform a given sequence of arithmetic and logic operations as fast as possible, because human brains are very, very slow at this task.
DJG: The average person would be slower than a computer. Certain autistic individuals might be faster.
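For what it's worth, the speed gap in Ankur's example is easy to demonstrate. A minimal timing sketch (in interpreted Python, so if anything this understates what the hardware can do):

```python
import random
import timeit

# How long does a computer take to sum 100 numbers?
numbers = [random.random() for _ in range(100)]

# Time one million repetitions and report the per-call average.
runs = 1_000_000
total = timeit.timeit(lambda: sum(numbers), number=runs)
print(f"Average time to sum 100 floats: {total / runs * 1e9:.0f} ns")
```

Even in an interpreted language the sum takes well under a microsecond, while an unaided human would need tens of seconds, several orders of magnitude slower.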
This piqued my curiosity about the Landauer limit, so I asked about it in another thread:
Is this trade-off inevitable, or did it have to do with how the experiment was performed? Has the trade-off been quantified?