I like the fact that most people here are sceptical about projecting current trends into the future - a big mistake often made in bad science fiction.
However, I think the OP has a point here. Allow me to explain.
The obvious mistake here is to confuse the rise of tablets and touch with the future. This is the mistake Microsoft has made with Windows 8. The tablet market is still growing, but tablets can never do what desktops can — though that depends on how you define "tablet" and "desktop". The desktop as a concept (a big screen, mouse and keyboard) isn't going to disappear. The tablet as a concept (a small screen with touch input) cannot offer the functionality of a desktop: for many tasks, touch is inconvenient rather than convenient. Touch is not the future per se.
Note that my definition of a desktop did not actually include the desktop machine itself, i.e. the full (or midi) tower PC. I think it's quite likely that these machines will start to disappear, possibly with a few exceptions, but I'm not even sure about that.
Think about it. We can be pretty sure that within 5 years (possibly even less) tablets will be as powerful as desktop PCs are now, at least in CPU performance. Now think about connecting the tablet to a mouse, keyboard and screen. I expect this to be possible over but one connection within 5 years as well (Thunderbolt, though expensive at the moment, is already capable of this). You connect the tablet at home to your "desktop", work on a report or Excel sheet, save your files, take the tablet with you on the train to work and play Angry Birds, check your email &c., then connect the tablet to your desktop environment at work. No synchronisation is necessary; all files are accessible and available everywhere and at all times &c.
There is but one issue here, as noted by most other contributors: tablets aren't as powerful as PCs, and due to heat issues &c. they never will be. That is definitely true, certainly with the technology we currently use. But it need not be a problem.
I'm the "guy who is good with computers" so my family and friends come to me when they need a problem solved or when they need advice on what computer or component to buy. I check hardware sites every day and am generally rather well aware of what is going on.
I'll tell you what is going on, at least in the CPU area: performance has become "sufficient". It has simply become good enough. Even hardware websites now say so. Recently, an article stated (translated here): "A modern PC has (more than) enough computing power for all possible tasks, and since the advent of the SSD there isn't really any bottleneck left that needs improving." That's from a website specialised in computer hardware; they write reviews of new hardware for a living. And even they have to admit that, when it comes to CPU performance, it doesn't really matter anymore what you buy.
They're not alone. Intel, for the first time ever as far as I can tell, has recently decided to use its new production process (22 nm) to reduce power consumption rather than to increase performance: TDP went down from 95 W to 77 W for their top-range CPUs.
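To put a number on that shift, a quick sketch of the arithmetic (the post only gives the wattages; the specific CPU models named in the comments are my own illustrative assumptions):

```python
# Rough arithmetic on the TDP figures above (95 W -> 77 W).
old_tdp = 95.0  # W, top-range 32 nm parts (e.g. the i7-2600K; my assumption)
new_tdp = 77.0  # W, their 22 nm successors (e.g. the i7-3770K; my assumption)

reduction = (old_tdp - new_tdp) / old_tdp
print(f"Power reduction: {reduction:.0%}")  # -> Power reduction: 19%
```

In other words, Intel spent the process shrink on roughly a fifth less power at the top of the range rather than on more speed — which is exactly the trade-off tablets need.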
Similarly, enthusiasts are building more and more micro-ATX or even mini-ITX PCs; full towers are becoming increasingly rare. I bought a cheap $60 AMD CPU (alternatively one might buy an Intel Pentium) and the thing is: I haven't noticed a difference from my old $280 Core i7-860, and I've never felt either CPU fell short. We're talking about a three-year-old CPU and an extremely cheap six-month-old one here.
This means that tablets will be "fast enough" within 5 years. For consumers, that is; some tasks obviously still require a lot of computing power, but we have supercomputers for that. Number crunching in Excel, SPSS or Stata, encryption and converting video formats can all be done sufficiently fast on a tablet in 5 years.
The only exceptions are GPUs and, consequently, GPU-intensive tasks. Even today's GPUs often lack the power to play games at high resolutions with all effects enabled. Whereas upgrading a CPU isn't very useful in most cases, upgrading a GPU is almost always beneficial. The top-range cards also consume 200 W or more. You won't find similar performance in tablets anytime soon, and by the time you do it won't be enough anymore. However, modular (external) GPUs are a technical possibility with Thunderbolt. It's definitely possible for a modular GPU to provide the required performance to a tablet in 5 years. Then it's also possible to do rendering &c. while using a tablet as your main computer.
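For a sense of the gap such a modular GPU link would have to bridge, here is a back-of-envelope comparison between a first-generation Thunderbolt channel and the PCIe 2.0 x16 slot a desktop GPU normally sits in; the bandwidth figures are my own assumptions, not from the post:

```python
# Back-of-envelope bandwidth comparison for an external ("modular") GPU.
thunderbolt_gbit = 10    # Gbit/s per first-gen Thunderbolt channel (assumption)
pcie2_lane_gbit = 4      # usable Gbit/s per PCIe 2.0 lane after 8b/10b coding
pcie2_x16_gbit = pcie2_lane_gbit * 16

factor = pcie2_x16_gbit / thunderbolt_gbit
print(f"PCIe 2.0 x16: {pcie2_x16_gbit} Gbit/s")
print(f"Thunderbolt:  {thunderbolt_gbit} Gbit/s, i.e. {factor:.1f}x less")
```

So an external GPU over Thunderbolt would run on a fraction of the bandwidth of an internal card; whether that matters in practice depends on the workload, since games are often not bus-bandwidth-bound.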
Technically, it's perfectly possible for tablets to replace the desktop TOWER in 5 years. Whether it will be possible for consumers to do so is dubious, though, since it requires companies to make the required components. The problem is that this requires the cooperation and integration of many technologies: the OS (in this case Android) would have to support modular GPUs and offer much more functionality, modular GPUs would have to be produced, and the integration of screen, mouse and keyboard with a tablet would have to be possible (which requires both hardware and software support). At the same time, none of these improvements offers any benefit on its own, so a company has to take charge and get things done. That could be Apple or Google.
(Yes, I expect Microsoft to play no role in this, ironically because it is merging its mobile and desktop OSes based on the ridiculous assumption that touch is the future.)