I’ve decided to christen the next decade the teenies. Firstly, I’ve still heard no other suggestions; secondly, it’s phonetically consistent with the noughties and the twenties; and thirdly, the name is so downright awfully bad it’s almost quite good. So the teenies it is.
I’ve been scratching my head about these predictions for the last few days. By and large, I feel like I’m just predicting the obvious — which is a bit of a letdown. However, when I look at the noughties, while the specific details were not predictable, the general trends were pretty obvious already in 2000. So perhaps predicting the seemingly obvious is not such a bad idea. And what seems obvious to me is often anything but obvious to others; indeed, many will flatly disagree with my predictions. So, here goes. Hopefully these predictions are specific enough that I’ll be able to perform a decent analysis come 2020 to see how well I fared.
First up, things generally will become more energy efficient and we will see more solar power. But overall not much will change in energy — we’ll keep on using oil and coal and pumping out lots of CO2.
Chinese GDP on a PPP basis will be roughly comparable to that of the US and the EU (i.e. within 25%). India will be about half their size. The UK and France won’t be in the top 10 countries anymore, though they will still like to think that they are. China will become increasingly associated with luxury designer goods.
Computers will become about 50x faster, though I’m a bit nervous about this prediction. Later in the decade we will have major trouble with silicon chip technology. We might also see computer power overshoot general consumer demand, which would spell serious trouble for the big chip manufacturers. Everything goes very multi-core, even your cell phone. The graphics card market collapses due to overshooting consumer demand* and possibly being subsumed by new CPUs.
All things internet and mobile will continue to grow. Smart “phones” will become fully functional computers. You’ll be able to connect your smart phone to a large monitor, keyboard, mouse, projector etc., just like you’d do with a PC today. It will even become your wallet as you’ll be able to use it to pay for things at the supermarket. The expanding internet will swallow up most of TV and radio. High definition video conferencing will become common, making distance collaboration significantly more natural. High definition matters as it will allow people to have a wider field of view and to more clearly see facial expressions.
Machine learning will grow in importance due to ever increasing quantities of data, computer power, and better algorithms. It mostly won’t be publicly seen, however, much like how it’s heavily used in Google and a few financial and pharmaceutical companies at the moment.
Significant progress will be made in understanding the brain. We will have a rough high-level sketch of how the brain works, and some of its processes we will understand quite well. We probably still won’t understand cortical function very well; that will take longer.
More groups will start AGI projects, particularly from 2015 onwards. These groups will become increasingly mainstream, serious, and well funded. This will be driven by faster computers, better machine learning algorithms, and a better understanding of the brain’s architecture. Some of these groups will produce small AGIs that will learn to do some interesting things, but they will be nowhere near human-level intelligence. They will, however, be preparing the way for this. Concern at the dangers of artificial intelligence will become less fringe, but it won’t go mainstream.
In short, I’m predicting a bigger, brighter, expanded version of the last few years — nothing particularly radical. I think the real significance of the teenies will be to lay the foundations for more important things to come.
* UPDATE 15/1/2010: I’ve thought a bit about the main criticism of my predictions above, namely that the graphics chip business will collapse. As a result I’ve decided to soften my prediction. I’m now thinking that 10 more years probably won’t be enough for it to collapse due to overshooting demand. Going to 3D creates 2x the computational demand, going to higher resolution can create 5x demand, and better quality and more sophisticated graphics techniques can drive another 10x, maybe a bit more. Overall, this approximately 100x might be enough to drive demand through until the end of the teenies. If a collapse does come, I think it will more likely be due to somebody like Intel getting aggressive and building cutting-edge GPUs into their CPU chips, thus making discrete GPUs redundant.