The Future of NVidia
So, what lies in the future for NVidia? It's cloudy and certain at the same time.
Financially, NVidia is doing OK. They just announced their quarterly results, with a disappointing $105 million loss. They're finding it difficult to shake the losses associated with the recent mobile GPU settlement, which wiped out their $38 million profit for the quarter. Revenue actually came in at $776 million, up 17% from last quarter but down 13% from last year. Once they finally shed the settlements from the mobile GPU lawsuit, they'll be on much better financial footing. And the growing interest in GPGPU is going to be huge for NVidia.
It's a well-known and accepted truth that GPU computing will continue to grow. CPU speed increases have slowed to a meager 5-10% each year, while GPUs offer a one-time improvement of near-mythical proportions (hundreds to thousands of times faster), followed by regular improvements of nearly 50% each year. Compounded over five years, that's roughly a further 1.5^5 ≈ 7.6x for GPUs versus about 1.1^5 ≈ 1.6x for CPUs, on top of the initial jump. It's a no-brainer to see that becoming a driving focus of applications and simulations in the near term, with GPU-accelerated products already popping up like the MachStudio Pro suite, Adobe Premiere, and more.
However, NVidia's place in all this is harder to determine. NVidia has single-handedly come to dominate GPGPU computing with their CUDA suite, currently the #1 GPGPU toolkit in use. It's taught in computer science schools now, used by high-end computing organizations (like the Department of Energy), and extremely easy to set up and get started with. The success of CUDA has been instrumental in the design of its successor, OpenCL, and anyone using the two can see the obvious influences. OpenCL, though, is not vendor-locked and will work with a wide variety of processors, including ATI chips and, most worryingly, the Larrabee from Intel.
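To show just how low that barrier to entry really is, here's a minimal sketch of a complete CUDA program, the classic vector add, using nothing but the standard CUDA runtime API. The kernel name and sizes here are just illustrative:

```cuda
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

// Each thread adds one pair of elements. The GPU runs thousands of these
// threads concurrently, which is where the big speedups come from.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main(void) {
    const int n = 1 << 20;                  // one million elements
    size_t bytes = n * sizeof(float);

    // Host buffers
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);           // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

That's the whole workflow: allocate, copy in, launch, copy out. Compile it with nvcc and it runs on any CUDA-capable GeForce.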
The Intel Larrabee is currently vaporware, with specifications and capabilities changing every time Intel talks about it, so nobody knows whether it will have the power to crush NVidia or simply become the next "Microsoft Bob" of the tech industry. No matter which way it goes, Intel has shown they have the drive and resources to develop GPU-style massively parallel processors, and they will continue to be a thorn in NVidia's side for many years to come.
But until Larrabee arrives, NVidia is going to enjoy the quite sizable lead they have in GPU computing, as evidenced by initiatives like OptiX, PhysX, CUDA, CULAPack, and more. NVidia has actively engaged the software community to build tools around their CUDA systems, providing out-of-the-box value in areas like physics simulation, ray tracing, and high-end mathematics. Combine the drive toward GPGPU with NVidia's offerings in extremely high-end visualization, through the 4-GPU, 16GB QuadroPlex systems, and they simply own the high-end marketplaces.
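As a taste of that out-of-the-box value, here's a hedged sketch of what the high-end mathematics side looks like in practice: a single call into NVidia's CUBLAS library multiplies two matrices entirely on the GPU. This assumes the CUDA-2.x-era legacy CUBLAS interface; the matrix size and fill values are purely illustrative.

```cuda
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>
#include <cublas.h>   // legacy (CUDA 2.x-era) CUBLAS interface

int main(void) {
    const int n = 512;                      // multiply two n x n matrices
    size_t bytes = n * n * sizeof(float);

    // Host matrices, column-major as BLAS expects; A is all 1s, B all 2s
    float *A = (float *)malloc(bytes);
    float *B = (float *)malloc(bytes);
    float *C = (float *)malloc(bytes);
    for (int i = 0; i < n * n; i++) { A[i] = 1.0f; B[i] = 2.0f; }

    cublasInit();

    float *dA, *dB, *dC;
    cudaMalloc((void **)&dA, bytes);
    cudaMalloc((void **)&dB, bytes);
    cudaMalloc((void **)&dC, bytes);
    cudaMemcpy(dA, A, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, B, bytes, cudaMemcpyHostToDevice);

    // C = 1.0 * A * B + 0.0 * C, computed on the GPU in one library call
    cublasSgemm('N', 'N', n, n, n, 1.0f, dA, n, dB, n, 0.0f, dC, n);

    cudaMemcpy(C, dC, bytes, cudaMemcpyDeviceToHost);
    printf("C[0] = %f\n", C[0]);            // expect n * 1 * 2 = 1024

    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    cublasShutdown();
    free(A); free(B); free(C);
    return 0;
}
```

The point is the division of labor: NVidia owns the hand-tuned GPU kernel, and the application just calls a BLAS routine it already knows.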
But NVidia isn't content to sit back and watch; they assure me they've got new things on the horizon. While they, obviously, wouldn't make any official statements about the GT300 chipset, something is definitely coming. My guess: once Larrabee finally comes to market, we'll see new chips from all the major players (AMD/ATI and NVidia) in response, and they will blow our minds.
