Thursday, January 07, 2010

Nvidia - background data

Okay, here we go with a couple of posts from the City of Heroes forums, archiving them here. These were written before Intel killed off the Larrabee project, so there are references to Intel's current GPU plans when these were posted.

Post #1

Re: video cards

Originally Posted by Memphis_Bill View Post
I'm wondering if that's not just because they want to be seen as *more than* a gaming company. The GPU's been turning into just "more processors" for a while now (Folding@Home, for instance, can use some of them for computations, as can some graphics apps - just off the top of my head.) So, really, pushing that would (to me) make a bit more sense than just "You can push 5000 FPS over 9 screens @192000x14500" as far as what they want to highlight.

I was under the impression that was a result of the stink between them and Intel over chip support, patents/licensing, or some such.
(note, this rambles as I try to fit everything together)

These two parts are actually related. Nvidia doesn't just want to be seen as something other than a gaming company... they have to be seen as something other than a gaming company.

The reality is, the stink over the Intel chipset licensing was a bit of a red herring. The short version is that Nvidia claimed that their pre-existing license for the front-side bus used on the Core platform also covered the new bus interface developed for the Nehalem architecture. Intel said no, it didn't. Intel never explicitly told Nvidia that they could not have a license for the bus logic used in the Nehalem architecture. Intel just told Nvidia that their pre-existing contracts didn't cover it.

What this meant is that in order to have chipsets for Core i7 processors, Nvidia would have to come to a new contractual agreement with Intel... and Intel had set that price higher than Nvidia was willing to pay.

The other problem with Nvidia's lines about why they were leaving the chipset market is that they abandoned the AMD platform. Not even 2 years ago the best AMD motherboards were shipping with Nvidia chipsets. Then, when Socket AM3 hit, Nvidia announced several chipset solutions... then never delivered.

If Nvidia had been serious about Intel's licensing being the reason they left the chipset market, they would have been pushing their chipsets for both AMD and Intel platforms at the same time. They weren't. They had already axed development of one branch and, evidently, were looking to axe the other in a convenient manner. Now, I can't fault Nvidia for leaving the chipset market. The other third parties, like Via and SiS, had been winding down chipset support for years, and Nvidia had taken ULi out of the market. I can, and do, fault Nvidia for not being straightforward with their investors about why they left the chipset market.


Now, the reason(s) behind these product moves are partially found in both Intel Larrabee and AMD Fusion. The goal of the Larrabee project wasn't just to develop a competitive performance graphics solution based on the x86 architecture; it was also to merge an x86-based GPU onto an x86-based CPU. Both AMD and Intel will be launching central processors that include integrated graphics... on the same physical chip.

These new CPU/GPU hybrids will virtually eliminate many of the design issues OEMs have with current integrated-graphics motherboards. The GPU will be able to take advantage of the processor's cooling, and the display hardware and memory-access components will be greatly simplified. The design changes will also free up wiring running between the central processor and the system chipset.

In the integrated-graphics world as pictured by Intel and AMD, a third-party graphics solution doesn't actually exist. There's no design space where a third party like Nvidia, SiS, or Via could plop in their own graphics solution. This means that the only place for a third-party graphics vendor, within just a couple of years, is going to be as an add-in card vendor.

From Nvidia's point of view this wouldn't be entirely a bad thing, since AMD / ATi is their only real competitor. Via's S3 Chrome series never gained any traction, PowerVR is only playing in the mobile segment if memory serves correctly, Matrox is only interested in TripleHead2Go, and 3DLabs hasn't been heard from since 2006.

However, with Intel entering the add-in card market, things don't look so rosy for Nvidia. One of the (many) reasons Intel's current graphics processors resemble an Electrolux is that an integrated GPU can only run at limited clockspeeds, and in Intel's case, they don't actually make a full GPU; they make graphics accelerators that still lean on the CPU for some operations. In an add-in card, though, Intel can crank up the clockspeeds and core counts... and Intel has gotten really good at software rendering through x86. As a mid-range part, Larrabee-based cards are (were, still would be) a nightmare for Nvidia.

The fact is, the largest-selling graphics segment... is mid-range and low-end cards. Nvidia and AMD make very little money off of their high-end cards. Most of their operating cash comes from the midrange sets, like the GeforceFX 5200, Radeon 9600, Geforce 6600, RadeonHD 3650, and so on.

With a potentially legitimate, and humongous, competitor coming into what has traditionally been the bread and butter for most graphics vendors, the market Nvidia thought they could sell into was shrinking. Nvidia also had other problems, such as a bad reputation with ODMs like Clevo and Foxconn, and OEMs such as Asus, Acer, HP, and Apple, over various NV4x, G7x, and G8x GPUs having (drastically) different thermal envelopes than Nvidia originally said those parts had. Just look up "exploding Nvidia laptop."


As a company, Nvidia's market strategy needed to change. They needed to become something other than a graphics card vendor. They couldn't depend on Intel's GPU entry being a complete mongrel. Learning that Epic Games, the guys behind one of the engines that built Nvidia as a gaming company (Unreal Engine), was involved in helping Intel optimize Larrabee for gaming had to land like a blow to the solar plexus. While I'm not saying that Epic's involvement is magically going to make an x86 software-rendering solution good... I think Intel is serious about getting Larrabee right as a graphics product... and when Intel gets serious about a product, they generally deliver. E.g., Banias into Core 2, and then the Core 2 architecture into i7.

As a company, I can't really figure out what Nvidia's ultimate strategy is going to be. A couple months ago I would have thought they were trying to get into the x86 processor market to produce a competitor to both Intel's and AMD's CPUs, essentially becoming yet another platform vendor. I think, emphasis on think, those plans have been scrubbed.

It's been more profitable for Nvidia to focus on offering multimedia platforms, such as Tegra. If rumors are to be believed, Nvidia may have even landed the contract for the next Nintendo handheld platform... a potential slap in the face to rival AMD, whose employees have been behind the N64 (SGI), GameCube (ArtX), and Wii (ATi) consoles, and who were at one point contracted to build a platform for the successor to the Game Boy... a project that fell through when the DS started flying off shelves like it was powered by Red Bull.

It's the opinion of some analysts, and myself, that maybe not even Nvidia themselves know what direction they are going to take in the coming months. While it would be disappointing to see Nvidia leave the gaming card market, it is perhaps one of the options on their table.

(I'm poking through these threads since I have a system build coming up. Haven't decided if I want to run AMD or jump to Intel - not least because of money, but partially because of Socket Madness that's traditional to Intel. I can throw the $150-$200 I'd save off of the Intel system towards a better video card, after all. Current system - AMD, 3+ years old, and just barely at the end of its supported CPU upgradeability.)
I know what you mean. I've commented before about the Intel D975XBX motherboard I have: despite being a Socket 775 motherboard, and, if I'm not mistaken, the first CrossFire-enabled motherboard Intel ever did, it only supported Conroe, Allendale, and Kentsfield Core 2 processors, and will not support any Yorkfield or Wolfdale processors... which were just die-shrinks of the processors it did support.

On the other hand, I've got an Asus M2R32-MVP, a Socket AM2 motherboard from 2006 built on the Radeon Xpress 3200 chipset... that's happily chugging away with a Phenom 9600.


Post #2

Originally Posted by Geobaldi View Post
Ok here's my question. I have a 9800gtx right now on a quad core cpu (2.8ghz) with 8 gigs ram.
Would just getting a 2nd 9800 gtx for sli be a better choice than trying to spend alot on a 275?
Yes / no / not really.

Yes: two 9800 GTXs in SLI are pretty powerful, but you are dependent on software support for multi-GPU rendering. It'd be a cheaper way to gain more performance while you wait for Nvidia to go bankrupt or get bought out. (And no, that's actually not a joke; I think Human Being addressed the financial problem Nvidia has trying to move GTX chips right now... or somebody did in the thread.)

No: I'm pretty sure that a single GTX 275 would outrun two 9800s. Granted, I can't actually test this. I don't have a GTX 275 on hand, although I do have GTS 250s... which are pretty much just die-shrunk, rebadged 9800s.

I'll get to the Not Really at the end.
After reading Tom's Hardware's guide on where video cards stand, I'm reluctant to purchase any of the 2xx series at this time.
I mentioned this earlier in the thread, but I'd keep a barrel of salt on hand if you're reading Tom's. The site has been busted multiple times for accepting money to slant reviews one way or another. Now, while I can come up with dozens of reasons of my own not to buy a GTX 2xx card from Nvidia, most of my concerns simply won't matter to the average gamer.


Okay, here's the Not Really part for those interested in a soft analysis. Not really: it's a toss-up. If you're an Nvidia fan, well. To broach the subject again, it's really questionable whether or not Nvidia is actually going to matter in a few months. Fermi's going to be launching against Larrabee and shortly before the product refreshes of the RadeonHD 5x00 series. Intel and AMD are also going to be pushing CPUs with integrated GPUs in 6 to 8 months... and that's going to once again create a three-horse race in the low- to mid-range graphics processor segment. I'm not entirely sure Nvidia can survive as a vendor of gaming graphics cards if its traditional bread-and-butter market, the low end where the likes of the Geforce2 MX, Geforce4 MX, Geforce FX 5200, and Geforce 6600 dominated, goes away.

Nvidia's going to have to launch Fermi with parts in all segments, from the low end to the high end, and deliver them in large quantities. Since they are dependent on TSMC, that's... not really something I'm sure Nvidia can pull off. AMD is having enough problems getting their own completed 40nm parts out of TSMC.

In order for Nvidia to survive, they are going to need a megabucks deal. One of the hot rumors going around now is that Nvidia has won the contract for the next Nintendo handheld. Personally, I doubt it, given Nintendo's long relationship with the people now at AMD. Remember, the N64 was created with help from SGI, and part of that SGI team left to become ArtX. ArtX did the design for the GameCube's graphics chip, but was bought up by ATi in 2000. ArtX turned ATi around, helping to launch the Radeon 9700 series of cards and revamp ATi's... broken... driver program. ATi then did the graphics design for the Wii, something AMD was rather grateful for after the merger, as Nintendo's profits helped shore up crashing processor revenue streams. At one time ATi had a contract to do the GPU for the so-called GameBoy 2, a console that was halted, then dropped, after the DS went from side-toy to main-show.

With this kind of history, Nvidia would have to offer an astoundingly good deal to beat what has been a financially successful string of consoles for Nintendo. So I don't think this kind of guaranteed-income deal is in store for Nvidia.

We also know that Nvidia's president has very kind words for Apple, and has been developing mobile platforms targeted towards the markets serviced by PowerVR / Imagination... a company that Apple has invested in. Some have suggested Nvidia is trying to maneuver themselves into a position to be bought out by Apple.

Now, with this sort of background, and multiple questions surrounding what's going to happen to / with Nvidia, the sensible thing to do is wait it out. Wait for Fermi parts to be delivered.

TL;DR version: Honestly, I'd save your money for next year rather than rushing to upgrade now.