The Teenies

I’ve decided to christen the next decade the teenies.  Firstly, I’ve still heard no other suggestions; secondly, it’s phonetically consistent with the noughties and the twenties; and thirdly, the name is so downright awful it’s almost good.  So the teenies it is.

I’ve been scratching my head about these predictions for the last few days.  By and large, I feel like I’m just predicting the obvious — which is a bit of a letdown.  However, when I look at the noughties, while the specific details were not predictable, the general trends were already pretty obvious in 2000.  So perhaps predicting the seemingly obvious is not such a bad idea.  And what seems obvious to me is often anything but obvious to others; indeed, many will flatly disagree with my predictions.  So, here goes.  Hopefully these predictions are specific enough that I’ll be able to perform a decent analysis come 2020 to see how well I fared.

First up, things generally will become more energy efficient and we will see more solar power.  But overall not much will change in energy — we’ll keep on using oil and coal and pumping out lots of CO2.

Chinese GDP on a PPP basis will be roughly comparable to that of the US and the EU (i.e. within 25%).  India will be about half their size.  The UK and France won’t be in the top 10 economies anymore, though they will still like to think that they are.  China will become increasingly associated with luxury designer goods.

Computers will become about 50x faster, though I’m a bit nervous about this prediction.  Later in the decade we will have major trouble with silicon chip technology.  We might also see computer power overshoot general consumer demand, which would spell serious trouble for the big chip manufacturers.  Everything goes very multi-core, even your cell phone.  The graphics card market collapses due to them overshooting consumer demand* and possibly being subsumed by new CPUs.
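As a rough sanity check on that 50x figure, here’s the growth rate it implies — a back-of-the-envelope sketch using only the 50x guess above, nothing measured:

```python
import math

# What "about 50x faster over ten years" implies per year,
# derived purely from the 50x guess in the post.
rate = 50 ** (1 / 10)                                 # ~1.48x per year
doubling_months = 12 * math.log(2) / math.log(rate)   # ~21 months per doubling
print(f"~{rate:.2f}x per year, doubling every ~{doubling_months:.0f} months")
```

That’s in the same ballpark as the classic Moore’s-law doubling of roughly 18 to 24 months, which is why the prediction seems plausible, but also why trouble with silicon late in the decade would put it at risk.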

All things internet and mobile will continue to grow.  Smart “phones” will become fully functional computers.  You’ll be able to connect your smart phone to a large monitor, keyboard, mouse, projector etc., just like you’d do with a PC today.  It will even become your wallet, as you’ll be able to use it to pay for things at the supermarket.  The expanding internet will swallow up most of TV and radio.  High definition video conferencing will become common, making distance collaboration significantly more natural.  High definition matters as it will allow people to have a wider field of view and to see facial expressions more clearly.

Machine learning will grow in importance due to ever increasing quantities of data, computer power, and better algorithms.  It mostly won’t be publicly visible, however, much like how it’s heavily used in Google and a few financial and pharmaceutical companies at the moment.

Significant progress will be made in understanding the brain.  We will have a rough high-level sketch of how the brain works, and we will understand some of its processes quite well.  We probably still won’t understand cortical function very well; that will take longer.

More groups will start AGI projects, particularly from 2015 onwards.  These groups will become increasingly mainstream, serious and well funded.  This will be driven by faster computers, better machine learning algorithms and a better understanding of the brain’s architecture.  Some of these groups will produce small AGIs that will learn to do some interesting things, but they will be nowhere near human level intelligence.  They will, however, be preparing the way for this.  Concern at the dangers of artificial intelligence will become less fringe but it won’t go mainstream.

In short, I’m predicting a bigger brighter expanded version of the last few years — nothing particularly radical.  I think the real significance of the teenies will be to lay the foundations for more important things to come.

* UPDATE 15/1/2010: I’ve thought a bit about the main criticism of my predictions above, namely that the graphics chip business will collapse.  As a result I’ve decided to soften my prediction.  I’m now thinking that 10 more years probably won’t be enough for it to collapse due to overshooting demand.  Going to 3D creates 2x the computational demand, going to higher resolution can create 5x demand, and better quality and more sophisticated graphics techniques can drive another 10x, maybe a bit more.  Overall, this roughly 100x might be enough to drive demand through until the end of the teenies.  If a collapse does come, I think it will more likely be due to somebody like Intel getting aggressive and building cutting-edge GPUs into their CPU chips, thus making GPUs redundant.
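To make the update’s arithmetic explicit, here’s a quick back-of-the-envelope check — all the multipliers are my own rough guesses from above, including the 50x hardware figure, not measured data:

```python
import math

# All inputs are the rough guesses from the post, not measured figures.
headroom = 2 * 5 * 10      # 3D (2x) * higher resolution (5x) * better techniques (10x)
growth_per_decade = 50     # guessed hardware speedup over the teenies

# How long hardware growing at ~50x per decade takes to use up ~100x of headroom.
years = 10 * math.log(headroom) / math.log(growth_per_decade)
print(f"{headroom}x of demand headroom lasts ~{years:.0f} years")   # ~12 years
```

On these numbers the headroom runs out a little after 2020, which is consistent with softening the collapse prediction rather than abandoning it.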

24 Responses to The Teenies

  1. Kevembuangga says:

    More groups will start AGI projects, particularly from 2015 onwards. These groups will become increasingly mainstream, serious and well funded.

    That sounds more like a wish than a prediction 😉
    In such matters the “public feeling” is of more import than any technical point, and I don’t see any way to forecast public opinion about AGI (the public is dumb even when the public is VCs).

  2. Shane Legg says:

    @Kevembuangga

    That there will be more people trying their hand at AGI in the 2015 to 2020 date range than now shouldn’t be all that surprising, given that AGI is slowly rising in respectability and visibility (from rather low levels of both, I should add).

    Is it a wish on my part? Not really. If there are more groups trying AGI some years from now, most likely in ways that I think won’t work, well, that’s interesting but it won’t have much impact on me.

    More significant to me personally is whether a few colleagues and I can find an appropriate funding source.  We’re poking around looking at options.  Given the nature of the project, VCs don’t seem like the right initial source.

  3. Roko says:

    You’ve missed biotech and personalized genetics, which are probably poised to do interesting things in the teens.  (Teenies? It’ll never catch on!)  Biofuels really do seem like they’ll work, and personalized genetics fulfills a real consumer need: health.

    Also, look out for those military robots, the teens will be their decade.

    Lastly, the big thing in computer gaming this decade will be an improved interface.  We’ll see 3D games played with the gamer wearing 3D glasses and the image projected onto a wall or screen.  We’ll see more games that read your mind, I think, à la http://www.emotiv.com/.

  4. Roko says:

    > The graphics card market collapses due to them overshooting consumer demand and possibly being subsumed by new CPUs.

    Disagree. We still don’t have photorealistic graphics. Fancy a bet that high-end gaming PCs will still have dedicated GPU chips somewhere in them in 2020, and that those chips will cost at least $200 for top-of-the-range and at least $100 for mid range?

  5. Kevembuangga says:

    Fancy a bet that high-end gaming PCs will still have dedicated GPU chips etc…

    Oh! Yessss…
    We should all be very, very grateful to the video game junkies: they provide the market that funds progress in CPU chips.
    Even the DOD wouldn’t be up to the task.
    (I said something like this somewhere else already.)

  6. Tim Tyler says:

    Re: “The graphics card market collapses due to them overshooting consumer demand and possibly being subsumed by new CPUs.” – this seems like the most obvious thing to disagree with.

  7. Tim Tyler says:

    “The period from 1910 to 1919, sometimes referred to as the 1910s”.

    We may well have the 2010s – i.e. the twenty-tens – a term which could well be abbreviated sometimes.

  8. yamahaeleven says:

    As a mere consumer of computers and associated technology, I have an insatiable desire for vastly more computing and graphics processing power.  I agree with Roko: several orders of magnitude of increase should just about cover stereographic complete immersion interfaces, which would provide me with a satisfactory computer experience.

    Pessimistic predictions of the ability of silicon to keep pace with demand have not held up in the past, and I see no reason to doubt its continued progression.  Such pessimism seems like Malthusianism.  I fail to see how humans will suddenly become stupid at some random point in the future.  That form of economic prediction has proven wrong for almost two thousand years after it was invented, and for the roughly 200 years since Malthus famously wrote about it, but normally intelligent people still frequently succumb to its discomforting ideas.

  9. I will add my prediction about AI planning.
    In the 2010s, it will become possible to solve bigger problems.
    One improvement will come from using experience gained on previous planning tasks; pattern databases are a beginning of this.
    The state representations used will also change to support the transfer of learning: AI planning will start using more structured state representations, which will make it possible to see analogies between parts of states.
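    For concreteness, here is a minimal toy sketch of the pattern database idea mentioned above, using the 8-puzzle as an example.  Everything here is illustrative rather than taken from any particular planner: exact distances are precomputed in an abstraction that tracks only a few tiles, and those distances form an admissible heuristic for the full puzzle.

    ```python
    # Toy pattern database (PDB) for the 8-puzzle.
    from collections import deque

    GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)   # 0 is the blank, row-major order
    PATTERN = {1, 2, 3}                  # the tiles the abstraction keeps

    def abstract(state):
        """Project a state onto the pattern: other tiles become indistinct (-1)."""
        return tuple(t if t == 0 or t in PATTERN else -1 for t in state)

    def neighbours(state):
        """All states reachable by sliding one tile into the blank."""
        blank = state.index(0)
        r, c = divmod(blank, 3)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < 3 and 0 <= nc < 3:
                s = list(state)
                j = nr * 3 + nc
                s[blank], s[j] = s[j], s[blank]
                yield tuple(s)

    # Breadth-first search outwards from the abstract goal.  Every real move
    # maps to an abstract move, so these distances never overestimate.
    PDB = {abstract(GOAL): 0}
    queue = deque([abstract(GOAL)])
    while queue:
        s = queue.popleft()
        for n in neighbours(s):
            if n not in PDB:
                PDB[n] = PDB[s] + 1
                queue.append(n)

    def heuristic(state):
        return PDB[abstract(state)]

    print(heuristic((1, 0, 2, 3, 4, 5, 6, 7, 8)))   # 1: one slide restores the goal
    ```

    The transfer-of-learning point above is essentially about reusing structures like this across tasks, rather than rebuilding them from scratch each time.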

  10. Aki_Izayoi says:

    Doesn’t Crysis have photo-realistic graphics on its highest setting?

  11. Kevembuangga says:

    Speaking of predictions, I just stumbled upon this one:
    The Singularity will not be what you think!

  12. roberto says:

    I think machine translation and face recognition can have quite a strong impact over the next decade.

    I also predict a sudden rise of wellness, anti-aging and personalized medicine clinics.

    The biggest short-term life-changing innovation is easy to predict and it’s already happening: Internet via wi-fi or cellular network everywhere, and everyone carrying a super smartphone all the time.  This could bring a sudden rise in collective intelligence.

  13. Shane Legg says:

    @ Roko

    I didn’t predict anything bio because I don’t feel that I know it well enough to predict anything about it (in case that wasn’t clear).

    Biofuels already do work in Brazil.

    Yeah, 3D glasses would be a big deal, and they might happen.  But it’s one of those things that seems obvious, yet years and years go by and they don’t improve much.  If they do take off, then 3D graphics might have some more life.  Otherwise I think demand will show significantly diminishing returns beyond 10x current high-end levels of performance.

    Your bet doesn’t really reflect my prediction: the market can collapse while these graphics cards are still found in high-end gaming PCs.

  14. Shane Legg says:

    @Roberto

    Machine translation and face recognition are difficult things to make predictions about.  Currently computers can do these things quite well, but not nearly perfectly.  To eliminate the remaining few percent of errors would seem to require a major breakthrough.  For example, near-perfect translation would require the computer to understand the nature of the world.  In other words, we may not truly solve these problems until we have something quite close to AGI.  Before this point, for most tasks trained humans are cheaper, more reliable and more accurate.

    My guess is that machine translation and face recognition will improve and increase in usage, but their application will remain limited to certain situations where errors aren’t very important.

  15. david says:

    Realtime photorealism (I’m talking indistinguishable from a photograph) of complex scenes is far, far more than 10x today’s performance.  Until that point I don’t see any collapse, but what I do think is that GPUs and CPUs will have merged before then, so the consideration may not even apply.

  16. Shane Legg says:

    @David: I agree that photorealism requires more than 10x current high end graphics cards. My question is whether people will care enough to pay.

    Consider sound: most people can hear the difference between uncompressed music and the mp3 quality you typically get from downloaded music. Yet the difference is small enough that the vast majority of people don’t care.

  17. Mike says:

    Your predictions seem fairly conservative for the most part and will probably be much more accurate than any Kurzweil will make.

    I think the cost of producing photo-realistic video games would be enormous.  Games are already pretty expensive to develop.  The cycle for introducing new game systems is also getting longer.  We probably won’t see a PlayStation 4 or Xbox 720 for a while, maybe not until 2015 or later.  Even when they do come out, most companies may not take full advantage of the computing capacity (due to the expense).  I don’t necessarily believe there will be a market collapse for GPUs, however there will be diminishing returns to more computing power.

    What about TVs coming to market that can be viewed in three dimensions without the need for glasses?  There are a bunch of companies working on this already.  A lot of this tech could fuel the need for more powerful graphics cards (or computational power) and hard drive capacity.  Perhaps the next generation of video games will take advantage of 3D TVs.  It doesn’t seem improbable that we’ll see these in the next 5 to 10 years, although it is true that 3D TVs have been “just around the corner” for decades.  However, people like James Cameron are pushing the technology (at least on the production side), so it may be more likely to happen this time.

    I think it is probable that we will get a connectome of the human brain within 4 to 7 years.  A connectome is only part of the story, however.  Henry Markram believes a whole brain simulation is possible within 10 years, but 10 to 20 years seems more likely to me.

  18. Kevembuangga says:

    however there will be diminishing returns to more computing power.

    This is the whole point about the Singularity: does the exponentiality of progress beat the decay in marginal returns?
    I think not: energy-wise (second law of thermodynamics) the bottom line is always negative; we squander “gratuitous” energy just for the fun of it.
    But this doesn’t mean we should not enjoy the fun.

  19. Shane Legg says:

    @Mike

    Yes, 3D has always been “just around the corner”. This time I think it’s really coming, albeit in a few more years. One driving force is that cinemas are going 3D in a major way and so people are getting used to the idea. Also you really need HD to do 3D, and of course that’s big now with TVs. Still, having to wear glasses to watch it is a pain, so my guess is that people will have their TV in normal mode and then just switch to 3D when they’re really watching something like a sports match, movie or 3D game.

    As for the computing demands, they don’t change all that much.  The internal model of the game remains the same; the only difference is that the 3D graphics system now has to produce images from two points of view for the stereoscopic effect.  This doubles the demands on the graphics card, but that’s it.
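    As a schematic sketch of that point — not code from any real engine, and update_simulation and draw are placeholder stubs — the frame loop runs the game logic once but renders twice, once per eye:

    ```python
    # Toy illustration: stereoscopic 3D roughly doubles only the rendering work.
    import numpy as np

    IPD = 0.064  # typical eye separation in metres (assumed value)

    def update_simulation(scene):
        pass  # placeholder: game logic, done once per frame regardless of 3D

    def draw(scene, view_matrix):
        pass  # placeholder: the per-eye GPU work

    def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
        """Standard right-handed view matrix for a camera at `eye`."""
        f = target - eye
        f = f / np.linalg.norm(f)
        s = np.cross(f, up)
        s = s / np.linalg.norm(s)
        u = np.cross(s, f)
        m = np.eye(4)
        m[0, :3], m[1, :3], m[2, :3] = s, u, -f
        m[:3, 3] = -m[:3, :3] @ eye
        return m

    def render_frame(scene, head, target):
        update_simulation(scene)              # once: model cost unchanged
        right = np.cross(target - head, np.array([0.0, 1.0, 0.0]))
        right = right / np.linalg.norm(right)
        for sign in (-1.0, 1.0):              # twice: the ~2x graphics cost
            eye = head + sign * (IPD / 2) * right
            draw(scene, look_at(eye, target))

    render_frame({}, np.array([0.0, 1.7, 0.0]), np.array([0.0, 1.7, -5.0]))
    ```

    This is also why 3D leaves CPU-side costs like physics and AI untouched: those run once per frame either way.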

    In 15-25 years I think we will have the hardware to simulate a system the size of the brain, however we won’t be able to scan a brain in enough detail to make it work. We’ll have the overall wiring diagram for a brain, but we won’t know many basic things like the strength of the synapses from that. Of course, 25 years is a long time, and maybe much better scanning methods will come along. But my guess is that we still won’t be able to do it by then.

  20. mitchell porter says:

    “I’m predicting a bigger brighter expanded version of the last few years — nothing particularly radical.”

    That prediction never works. Or, maybe it works for the parts of the culture which aren’t changing much, but there’s always some big new thing. The 00s had the war on terror, the 90s had the Internet, the 80s had the end of Soviet communism, the 70s had the oil crisis, the 60s had their cultural revolution, the 50s had the nuclear arms race, the 40s had a world war…

  21. Shane Legg says:

    @Mitchell: I’m not aiming to predict everything, I’m just trying to make predictions where I think I can.

  22. Pingback: Goodbye 2010 | vetta project
