When you start introducing upgrades into consoles, you add another hardware variant that developers need to take into account - if they decide to support it at all. It somewhat defeats the point of consoles as a platform: the hardware doesn't change, so it's predictable. That predictability is how developers squeeze high-performance games out of seemingly simple hardware - they can optimize for one fixed target instead of worrying about a wide variety of hardware configurations.
There has always been this leapfrogging of consoles over PCs when they first come out. Really it's limited to the mainstream and budget sectors, though - no console has ever competed with a high-end PC in any generation, but that's because you're comparing a $400 launch console to a $2500 PC...
Mainstream PCs have never really been geared towards games, though - sure, they have a GPU, but it's only good for 720p gaming for a year or two. The CPU, on the other hand, will last a fair while.
Old PCs, Pentium 4 era and the like, really do show their age now. For word processing and email they'll last indefinitely... web browsing, though, is another matter entirely. Browse the Internet on an old 2GHz P4 and you'll quickly realise it isn't as quick as it should be - all that JavaScript and memory-hogging behaviour gives it a real workout. Throw Flash into the equation and you'll really notice the performance hit (or more precisely, the inefficiencies of Flash). Browse the modern Web on something like a Pentium 3 and you have to turn Flash off entirely, and even then, if you're running on 256MB of RAM... heh, that's painful.