Koomey's Law, which describes the computational efficiency of digital systems over time, is a natural extension of Moore's Law: using smaller transistors tends to decrease the energy per computation of these systems. And while human ingenuity continues to find ways of building ever-smaller transistors, these transistors are becoming less and less predictable, making it increasingly difficult for digital designers to build reliable systems from them. So while no real system can sustain exponential growth forever, Moore's Law appears to still be holding while Koomey's does not.
It is estimated that the human brain is still somewhere on the order of five orders of magnitude more efficient at solving problems than our current best digital systems. And with the slowing of Koomey's Law, we may never reach that efficiency with digital computers alone. Thus, there is a strong motivation to explore alternative types of computation. Analog computers are incredibly efficient when precision requirements are low, while digital computers tend to win when precision requirements are high. The adaptability of neuromorphic and asynchronous digital computers allows them to efficiently utilize unreliable parts.
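To give a sense of scale, a back-of-envelope calculation shows how long a five-order-of-magnitude efficiency gap would take to close under continued exponential improvement. This sketch assumes, purely for illustration, the commonly cited slowed (post-2000) Koomey doubling period of roughly 2.6 years; that figure is not from the text above.

```python
import math

# Illustrative assumptions (not from the source text):
#   - the brain is ~5 orders of magnitude more energy-efficient
#     than current digital systems
#   - efficiency doubles roughly every 2.6 years (slowed Koomey rate)
gap = 1e5                   # efficiency gap: five orders of magnitude
years_per_doubling = 2.6    # assumed slowed doubling period

doublings = math.log2(gap)  # doublings needed to close the gap
years = doublings * years_per_doubling

print(f"{doublings:.1f} doublings -> ~{years:.0f} years at the slowed rate")
# About 16.6 doublings, i.e. on the order of four decades
```

Even under this optimistic extrapolation, closing the gap would take decades, which is part of the motivation for exploring non-digital computation rather than waiting for scaling alone.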
To build the most efficient computers of the future, it is likely one will need to leverage many different computational domains depending on the problem at hand. This presentation covers reconfigurable, hybrid computers and core technologies built to explore this idea.
Researchers should cite this work as follows:
203 Physics, Purdue University, West Lafayette, IN