Rather than the ever-accelerating advancement predicted by Kurzweil, we believe that progress toward this understanding is fundamentally slowed by the complexity brake. Our ability to achieve this understanding, via either the AI or the neuroscience approaches, is itself a human cognitive act, arising from the unpredictable nature of human ingenuity and discovery. Progress here is deeply affected by the ways in which our brains absorb and process new information, and by the creativity of researchers in dreaming up new theories. It is also governed by the ways that we socially organize research work in these fields, and disseminate the knowledge that results. At Vulcan and at the Allen Institute for Brain Science, we are working on advanced tools to help researchers deal with this daunting complexity, and speed them in their research. Gaining a comprehensive scientific understanding of human cognition is one of the hardest problems there is. We continue to make encouraging progress. But by the end of the century, we believe, we will still be wondering if the singularity is near.
The basic issue is how quickly a scientifically adequate account of human intelligence can be developed. We call this issue the complexity brake. As we go deeper and deeper into our understanding of natural systems, we typically find that we require more and more specialized knowledge to characterize them, and we are forced to continuously expand our scientific theories in more and more complex ways. Understanding the detailed mechanisms of human cognition is a task that is subject to this complexity brake.
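The contrast between Kurzweil-style extrapolation and the complexity brake can be made concrete with a toy model. The dynamics below are purely illustrative assumptions of mine, not anything measured: in one curve capability doubles on a fixed schedule; in the other, each further unit of understanding costs a multiple of the effort of the last, so constant yearly effort buys only logarithmic progress.

```python
import math

# Toy model: exponential progress vs. a "complexity brake".
# Both dynamics are illustrative assumptions, not measurements.

def exponential_progress(years, doubling_time=2.0):
    """Capability doubles every `doubling_time` years (Kurzweil-style)."""
    return [2 ** (t / doubling_time) for t in years]

def braked_progress(years, effort_growth=2.0):
    """Each further unit of understanding costs `effort_growth` times
    more effort than the last, so constant effort per year yields
    only logarithmic growth in understanding."""
    return [math.log(1 + t, effort_growth) for t in years]

years = range(0, 21, 5)
for t, e, b in zip(years, exponential_progress(years), braked_progress(years)):
    print(f"year {t:2d}: exponential {e:8.1f}, braked {b:4.1f}")
```

The point of the sketch is only qualitative: under the brake, the gap between the two curves widens without bound, which is why the essay argues that smooth hardware exponentials do not translate into smooth progress in understanding cognition.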
Paul Allen has funded detailed brain-mapping projects through the Allen Institute for Brain Science. He also funded SpaceShipOne, the first privately funded crewed craft to reach suborbital space.
Just as in neuroscience, the AI-based route to achieving singularity-level computer intelligence seems to require many more discoveries, some new Nobel-quality theories, and probably even whole new research approaches that are incommensurate with what we believe now. This kind of basic scientific progress doesn't happen on a reliable exponential growth curve. So although developments in AI might ultimately end up being the route to the singularity, again the complexity brake slows our rate of progress, and pushes the singularity considerably into the future.
If the singularity is going to occur on anything like Kurzweil's timeline, though, then we absolutely require a massive acceleration of our scientific progress in understanding every facet of the human brain.
2045 may be too early for far-greater-than-human-level AGI or whole-brain emulation to trigger a Singularity.
However, other massive technological productivity boosters should kick in and trigger a technological explosion of capability.
Surprising energy breakthroughs seem likely to happen between 2012 and 2030.
Space access could be radically changed by a reusable SpaceX rocket, followed by solar electric sails and nuclear fusion for space propulsion.
Massive computer hardware boosts will come from memristors, optical computing, on-chip photonics, and breakthroughs in quantum computing.
I still think molecular nanotechnology will arrive between 2025 and 2040.
Advanced additive manufacturing and molecular nanotechnology will create very interesting capabilities.
I expect the world to see a few radical changes by 2035.
Significant enhancements to these tools and productivity boosters will increase the rate of scientific and technological improvement and change the situation.
I would lay my Singularity odds at:
10% by 2045
50% by 2065
90% by 2075
99% by 2080
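Treating those four figures as points on a cumulative probability curve, a small sketch can estimate the implied probability of the Singularity landing in any given window. The four (year, probability) pairs are from the list above; linear interpolation between them is my own simplifying assumption, not something stated in the text.

```python
# Cumulative Singularity odds from the list above, as (year, probability).
odds = [(2045, 0.10), (2065, 0.50), (2075, 0.90), (2080, 0.99)]

def cumulative(year):
    """Cumulative probability by `year`, linearly interpolated between
    the stated points and clamped outside them (an assumption)."""
    if year <= odds[0][0]:
        return odds[0][1]
    if year >= odds[-1][0]:
        return odds[-1][1]
    for (y0, p0), (y1, p1) in zip(odds, odds[1:]):
        if y0 <= year <= y1:
            return p0 + (p1 - p0) * (year - y0) / (y1 - y0)

def window_probability(start, end):
    """Probability the Singularity falls between `start` and `end`."""
    return cumulative(end) - cumulative(start)

print(round(window_probability(2045, 2065), 2))  # 0.4
print(round(window_probability(2065, 2075), 2))  # 0.4
```

Read this way, the forecast concentrates most of its probability mass (80%) in the three decades between 2045 and 2075.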
I am expecting significant life-extension improvements to start kicking in by 2035, so I expect to live to see these predictions play out through 2080.