Anyone remember when Creative skipped the Audigy 3 and went straight to 4? Stellar performance increase, massive and explosive sound, doubled transistor count... and yeah, it turned out to be nothing but a minor revision of the Audigy 2, released as version 4 and later re-badged as the X-Fi Xtreme. The comparison isn't entirely fair, since there are a number of fundamental differences in Fermi's architecture, but the same kind of speculation is being made now.
Nvidia seems to be spending a lot of time and resources on software development, which, as they've pointed out in previous announcements, is their strong point. But I can't help thinking about the hole they're digging. In effect, they're concentrating on creating software-accelerated hardware acceleration, which can be a good thing, but it often leaves some very bad gaps. They run the risk of creating over-specialized software enhancements while neglecting more general hardware performance. As we've seen in many games and applications (but mainly games), hardware performance is OK out of the box, but it doesn't shine until the software/driver enhancements are made, and made specifically for that game. Not every game or app will receive this treatment, and there's a risk of Nvidia (and ATI) being overly selective about which games/apps get driver tweaks, ignoring nearly everything else. The more software that's released, the more they need to optimize, and thus the more selective they become, much like what Intel was going to do with Larrabee. I'm not saying software enhancement is bad, just that over-reliance on software tweaks can be a dangerous long-term path.
Additionally, Nvidia is normally a very vocal company; when they have something stellar in the works, they really let the world know. When they go quiet, as they have with Fermi, I get concerned. People know of Fermi, but it isn't generating the buzz we normally associate with Nvidia, and details on real performance are significantly lacking, which makes me second-guess what the actual performance will be. There's plenty of speculation that the architecture is more GPGPU-centric and that gaming performance may suffer as a result. Mixed with the multiple delays, I'm feeling some déjà vu: ATI with the X1000 and X2000 releases. Guess we won't know until something's actually released.
Sorry, I should point out that this was not meant to be a rant. The ATI X1 and X2 releases weren't failures; they just didn't deliver the level of performance and improvement people were expecting. When ATI released the X3 and X4 series on the new architecture, there were signs of hope, and it wasn't until the X5 that the real potential was unleashed (though it could already be seen in the X4). The same could happen with Nvidia: Fermi will be the trial run, working out the kinks in the architecture and sowing the seeds for later generations built on it, which might also explain the delays as they work out the software element. Of course, this all remains shrouded in speculation, but we may not need to wait much longer to find out.