Author Topic: Nvidia DMI chipset development on hold  (Read 3291 times)

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Nvidia DMI chipset development on hold
« on: October 09, 2009, 12:10:14 AM »
Article on Engadget.

tl;dr:
Quote
We will continue to innovate integrated solutions for Intel's FSB architecture. We firmly believe that this market has a long healthy life ahead. But because of Intel's improper claims to customers and the market that we aren't licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we'll postpone further chipset investments for Intel DMI CPUs.

Yeah, alright, I can understand that it's just business... but that's really an asshat move on Intel's part. I'll withhold further judgement until I hear their side of the story.
« Last Edit: October 09, 2009, 04:22:42 AM by kureshii »

Offline Xtras

  • Member
  • Posts: 894
  • ~
Re: Nvidia chipset development on hold
« Reply #1 on: October 09, 2009, 12:26:12 AM »
Lol, for two companies whose products supposedly go very well together, Intel and Nvidia sure do have a lot of quarrels.  ;D

While Intel is very likely to remain dominant, I wouldn't be surprised if Nvidia gets overtaken by ATI. They still haven't released any of the 45nm technology that I have been waiting for... >:(
edit:
Sorry. Kureshii is right, 40nm.
« Last Edit: October 10, 2009, 08:20:11 PM by Xtras »

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Nvidia chipset development on hold
« Reply #2 on: October 09, 2009, 01:00:20 AM »
45nm? Or do you mean 40nm...

Surprisingly (or unsurprisingly), Nvidia's market share has actually grown. Personally, I'm glad they're thinking of the small-footprint PC market as well, as evidenced by their conceptualisation of the ION platform. ION's performance shows how well a weak CPU paired with a GPU can do for HD playback. It's too bad Intel doesn't want to tango with them...

IMO Nvidia has the bigger picture in mind, both in terms of other market segments and in GPGPU computing. ATi makes cards mainly for gaming (good price-to-performance for gaming-specialised needs), while Nvidia seems to be heading in a computing-on-GPU direction. In the GPGPU field, Nvidia has released drivers for CUDA C, CUDA Fortran (released by The Portland Group for license-owners only), OpenCL and DirectCompute. That's quite a list.

ATi, on the other hand, only has BrookGPU so far, and their OpenCL drivers and SDK are released only to selected developers.

[edit] I just noticed that AMD actually has a "GPU'd" version of their ACML (AMD Core Math Library); that's pretty interesting, since accelerating applications doesn't get much easier than that. Just compile your app with their ACML-GPU and it's good to go (theoretically). Of course, I am only speaking about scientific simulations and things that actually use these math libraries, and not the usual generic apps... I might pick up an old 48XX series card to try this out.
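Roughly speaking, the appeal is that the heavy lifting hides behind an ordinary math-library call. As a pure sketch (this is not ACML's actual API, just the kind of dense-linear-algebra routine such "GPU'd" libraries accelerate), here's the naive CPU version of a matrix multiply that a library backend would offload without the calling code changing:

```python
# Hypothetical sketch: the kind of routine a GPU-accelerated BLAS
# replaces. A naive triple-loop matrix multiply in pure Python; a
# library like ACML-GPU exposes the same entry point but runs the
# computation on the GPU instead.

def matmul(a, b):
    """Multiply two matrices given as lists of rows (C = A x B)."""
    rows, inner, cols = len(a), len(b), len(b[0])
    assert len(a[0]) == inner, "inner dimensions must match"
    c = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for k in range(inner):
            aik = a[i][k]
            for j in range(cols):
                c[i][j] += aik * b[k][j]
    return c

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[5.0, 6.0], [7.0, 8.0]]
print(matmul(a, b))  # [[19.0, 22.0], [43.0, 50.0]]
```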
« Last Edit: October 11, 2009, 01:15:35 AM by kureshii »

Offline kostya

  • Member
  • Posts: 181
  • Rar
Re: Nvidia chipset development on hold
« Reply #3 on: October 09, 2009, 04:10:03 AM »
The topic name is way off. The article only talks about nVidia stopping development of i7 chip sets. They will continue developing chip sets for AMD CPUs and other Intel CPUs. I seriously doubt that they will leave the market. I did an internship with the nVidia chip set group last winter. They were developing cool new things to blow the competition away.

Offline relic2279

  • Box Fansubs
  • Member
  • Posts: 4479
  • レーザービーム
Re: Nvidia chipset development on hold
« Reply #4 on: October 09, 2009, 04:12:27 AM »
If the courts rule in Intel's favor, I wonder how much that would hurt Nvidia. If it's significant enough, I could see Intel buying Nvidia, the way AMD bought ATI.

Offline Xtras

  • Member
  • Posts: 894
  • ~
Re: Nvidia DMI chipset development on hold
« Reply #5 on: October 10, 2009, 08:29:27 PM »
I dunno, I don't see the practicality of pursuing development in compensating for lower-end parts. In essence, if someone is going to buy hardware, open up their computer, and install it, why not just replace the CPU instead of the GPU? A lot of the motherboards used even for the lower-end processors of the past few years, since Vista's release, support upgrades up to quad-core.

Also, in general I have seen great improvements in gaming experience when I use higher-end hardware. That experience, however, doesn't translate into viewing videos. Between the minimum needed to run 1080p without problems and even the latest Alienware computers, I haven't seen much difference in video playback (although I will admit that skipping around in videos on lower-end hardware does cause problems).

Nonetheless, I do feel Intel is being a bit of a jerk. It is a pretty clear indicator that they have some kind of strategy up their sleeve for their next big move.

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Nvidia DMI chipset development on hold
« Reply #6 on: October 11, 2009, 01:18:14 AM »
I dunno, I don't see the practicality of pursuing development in compensating for lower-end parts. In essence, if someone is going to buy hardware, open up their computer, and install it, why not just replace the CPU instead of the GPU? A lot of the motherboards used even for the lower-end processors of the past few years, since Vista's release, support upgrades up to quad-core.
Do note that the release of new processors doesn't just affect those of us who like to tinker with our hardware... OEMs use the same processors to build systems for the average Joe who just wants a ready-made Dell or HP.

These people constitute quite a significant portion of the market, and to them price is likely a bigger concern than how much L2 cache a system has or what its memory CAS latency is. So price reduction is always an issue once you decide to target the mainstream market.

Admittedly, the P55 chipset is aimed more at the “performance” crowd (to distinguish them from the “enthusiast” crowd who are likely to go for the X-series chipsets), but do a quick google and you'll see what plans Intel has for the H55/H57 chipsets.

By leaving only the southbridge components off-die, they've kind of put Nvidia in a spot with regards to the mainstream crowd. This allows Intel to use only a DMI bus to link the processor and chipset; the motherboard essentially holds nothing more than the southbridge. And by squeezing Nvidia out using this new DMI license, their monopolistic intentions are made quite clear.

Hopefully this will be sorted out and Nvidia will suffer little more than a short delay in acquiring a license for this new DMI bus. Nevertheless, it's still an asshat move on Intel's part.
« Last Edit: October 11, 2009, 01:34:52 AM by kureshii »

Offline K7IA

  • Member
  • Posts: 884
  • :)
Re: Nvidia DMI chipset development on hold
« Reply #7 on: October 11, 2009, 07:51:25 AM »
I am certainly not up to date with what has been going on in computer technology for the last 10+ years, but shouldn't this chipset thing be a concern of the past?

I mean, they can put a billion+ transistors on a thumb-sized die, but some components vital to the core architecture are still off-die. I can understand that the north and south bridges were shared among different CPU architectures 10+ years ago, but CPUs have their own architectures now, so why are these components still external??

It is really annoying: my 2-year-old Core 2 Duo machine has a bigger motherboard than my 15+-year-old 386DX. OK, it has an embedded NIC and sound, but those are quite irrelevant these days. There is only a graphics card attached to the PCI-E lanes (a non-technical approach :) ), and the rest of the box is filled with air and cables.

I want smaller form factors and SoC solutions. It is really unnecessary to blame Intel if they want to combine more into a single IC and leave out third-party manufacturers, imo. I don't want the nVidians whining about how they can't agree with Intel on chipset production either. Nobody whined when the FPU was embedded into the CPU 15+ years ago.


Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Nvidia DMI chipset development on hold
« Reply #8 on: October 11, 2009, 09:32:45 AM »
[Discussion of x86 architecture will be assumed from this point on. Forgive the long post.]

You can already have your small-form-factor and/or SoC (System-on-Chip) solutions. If you want complete SoCs (i.e. CPU+GPU+northbridge+southbridge in a single chip), those do not exist yet; you'll have to wait until Intel brings them out. But if a GPU+NB+SB SoC is fine with you, those have been around for quite some time, in the form of Nvidia's Geforce series chipsets.

With regards to small-form-factor solutions, Nvidia recently released their ION platform, a mini-ITX form factor platform merging Intel's Atom processor and Nvidia's Geforce 9400M chipset. You should take a look at that article; I'm sure you'll love it. Intel has nothing like that, since graphics chipsets are not their forte.
And let’s not forget that VIA has been in the background quietly churning out their Eden and C7 embedded systems; in mini-ITX, nano-ITX and pico-ITX form factors.

But that does not mean multi-die systems are going to go away; such systems still provide more flexibility in incremental upgrading of a computer system, as well as better customisability of hardware to a consumer’s needs. (If the processor were mated to the controller hub, you’d have to make a separate chip for dual cores, dual cores with embedded graphics and quad cores, even if they are based around the same controller hub.) It is Intel’s anti-competitive practices in this segment that are the issue at hand.

Intel is taking a different approach with the Core i3/5 series of processors and P55/P57/H55/H57 series of chipsets, by merging the GPU and NB onto the CPU instead of the chipset. That is not an approach I abhor. In fact, it’s why I want a Lynnfield desktop, having held off from the Core 2 series and 4-series chipsets all this time.
The issue here is not that Intel wants to lock third-party chipset makers out by integrating more and more chips into a single die. (I would be quite interested to see a complete Intel SoC as well.) It's that they're trying to do so by using anti-competitive practices, not by changing their architecture.

I should start by saying that I do not blame Intel for their business practices; business is business, and I can sort of understand the mindset of a competing organisation. But an organisation’s actions will reflect its attitude towards aspects of business, and such a move (intentionally holding up the licensing of a new bus to a third-party chipset maker) reflects an anti-competitive mindset that is unhealthy for the market.
Such practices have been going on for some time. Again, I will not comment on the “fairness” of such practices, but they definitely reflect an anti-competitive mindset (and perhaps some fear on Intel's part), and that takes them down a few notches IMO.

I’ll end off by giving teaser images of your “small form-factor board”:


That is Zotac's Nvidia 9300 board, compatible with Intel Socket-775 processors ;) If you want the Atom version (Nvidia ION platform) cheaper, tell Intel to stop selling Atom processors to third-party OEMs at such a high price.

(The Nvidia ION systems are available from Zotac pre-built as well. They will be pricier than they would otherwise be, though.)
« Last Edit: October 11, 2009, 11:01:16 AM by kureshii »

Offline K7IA

  • Member
  • Posts: 884
  • :)
Re: Nvidia DMI chipset development on hold
« Reply #9 on: October 11, 2009, 10:27:52 AM »
Very informative post, always nice to read an expert opinion :)

Haven't read anything from Intel's website since that 80-core die, which if I recall correctly dates to October 2006. I am glad to see that Intel has a solid plan and road-map for SoC solutions.

Concerning hw upgrade flexibility, an owner's tendency to replace parts of a system is inversely proportional to its age (both the owner's and the system's) :) For example, why did you buy a totally new setup for CUDA instead of just upgrading?

VIA's C7 and mini-ITX bring back memories of my first attempt, 3 years ago (seriously), to build my very own NAS from a thin client. I ended up using a P3 1000 (it still runs OK) because of the stupid price of the setup. I had actually found a good-looking thin client from Fujitsu-Siemens for ~$350 (if I recall correctly), but when I got to the shop, the guy told me I had to buy 50 of them to get that price (stupid, really).

Thanks again for the detailed info.

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Nvidia DMI chipset development on hold
« Reply #10 on: October 11, 2009, 12:02:30 PM »
You’re welcome :) I’m no industry analyst, just an enthusiast when it comes to small-form-factor systems and consumer processors/chipsets.

[oops, looks like I ended up writing a wall of text again...]

Concerning hw upgrade flexibility, an owner's tendency to replace parts of a system is inversely proportional to its age (both the owner's and the system's) :) For example, why did you buy a totally new setup for CUDA instead of just upgrading?
To go slightly off-topic, I would have upgraded a setup for CUDA... if I had an existing system to upgrade. The family desktop was an old socket-478 Pentium 4 with AGP+PCI. So the CUDA rig is effectively my first self-assembled desktop xD (yeah, I was really late to the self-assembled PC game...)
The family desktop is now an E5300 Pentium dual-core on Intel G41 chipset (socket 478 was at the end of its upgrade life, so I gave it away to someone upgrading from a Pentium 3). They don’t need anything powerful, so they’re happy with it. In the near future I can still upgrade it to a Core 2 Duo/Quad without having to change the chipset, which is really convenient (and wallet-saving).

I am largely vendor-agnostic, and while I still am hankering for an Intel Lynnfield desktop, my CUDA rig is an AMD Phenom II. So while I do not hold any grudge against Intel for their unfair pricing practices, I’m not an Intel-exclusive person either.



At this moment it really irks me that I have to shell out:
$170 for a Zotac board with Nvidia ION+Atom N330
($190 for a variant with integrated DC-DC power supply + AC-DC adapter)
$130 for a Zotac board with Nvidia GF9300 and LGA-775 socket (with rebate).

If you look at the Intel Atom N330+945GC boards you’ll see just how cheap Atom N330 systems can really be.

Without knowing the industry that well, and without insider information, I won't attribute ION's high cost solely to Intel. (Perhaps Nvidia just isn't that good at marketing or cost-reduction; who knows?) But when such news is released, one has to wonder if things really can't be any cheaper than they are. Assuming what Nvidia's CEO said about the cost of Atom processors is true, how much cheaper can we get ION systems? Even a 10% reduction in overall price is sizeable.
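To put a number on that (just my own arithmetic on the prices quoted above; the 10% is hypothetical):

```python
# Rough arithmetic on the quoted board prices; the 10% figure is the
# hypothetical reduction discussed above, not anything Nvidia announced.
ion_board = 170       # Zotac ION + Atom N330 board, USD
gf9300_board = 130    # Zotac GF9300 LGA-775 board (after rebate), USD

cut = 0.10            # a hypothetical 10% overall price reduction
print(ion_board * (1 - cut))  # 153.0 -- i.e. $17 off the ION board alone
```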

A small form-factor platform with decent HD playback performance under $150 (sans casing, peripherals etc) suddenly sounds very real. What would it take to get us there? Maybe Intel can stop pricing Atom so high for third-party developers. Maybe Zotac can think about improving production efficiency. And maybe Nvidia can actively work towards getting things sorted out with Intel. Who really knows how these things go. All I know is,


An HD-capable setup like that could one day be available for less than $400.

Fine, not terribly exciting news for most people >_> especially when Best Buy already has HD-capable dual-core systems for <$400, but for small form-factor enthusiasts, ION is a big step forward, both in terms of price and performance.

----------

And to try to get back on topic now, I have to admit that I’m not really feeling the pinch from Nvidia losing the license for the new DMI bus, despite all my ranting. The PCI-e hub and memory controller are on the Lynnfield die now (Intel calls it the uncore, to separate it from the processor), so the P55 chipset is nothing more than an overpriced, glorified “ICH10.5R” southbridge.

Even so, the ICH10 is an excellent southbridge, and I find it unlikely that Nvidia can make a southbridge that will blow the ICH10 away performance-wise.

Besides, now that the interconnect (link between chips) is reduced to DMI with 1GB/s bidirectional transfer rate (you’ll see this written as 2GB/s, which is just total up+down transfer rate), Nvidia loses a big part of their advantage. Their nForce chipset used HyperTransport for the chipset interconnect, giving their southbridge a maximum transfer rate of 8GB/s, about 4 times more than DMI. That’s an advantage gone if they’re going to start making DMI chipsets for the new Nehalem. (Nvidia has already announced that they will not be making system chipsets for Bloomfield, i.e. socket-1366 Core i7.)
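To make the comparison concrete, here's the back-of-the-envelope arithmetic (figures as quoted above):

```python
# Back-of-the-envelope comparison of the two chipset interconnects
# discussed above, using the figures quoted in this post.
dmi_each_way = 1.0            # DMI: 1 GB/s in each direction
dmi_total = 2 * dmi_each_way  # the "2 GB/s" number you often see quoted

ht_total = 8.0                # nForce HyperTransport link, GB/s total

print(ht_total / dmi_total)   # 4.0 -- the ~4x advantage that vanishes on DMI
```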

Still, it would be great if Nvidia can make an alternative product, even if it’s just to compete with the horribly overpriced P55 chipset ($40 for P55 vs $3 for ICH10R, to serve the same function). Anything to make Lynnfield ownership cheaper ^^
« Last Edit: October 11, 2009, 12:11:22 PM by kureshii »

Offline K7IA

  • Member
  • Posts: 884
  • :)
Re: Nvidia DMI chipset development on hold
« Reply #11 on: October 11, 2009, 01:02:35 PM »
You’re welcome :) I’m no industry analyst, just an enthusiast when it comes to small-form-factor systems and consumer processors/chipsets.
I didn't use "expert" based on this post only; how many people know the similarities between P55 and ICH10? I sure didn't until some time ago :P

Concerning hw upgrade flexibility, an owner's tendency to replace parts of a system is inversely proportional to its age (both the owner's and the system's) :) For example, why did you buy a totally new setup for CUDA instead of just upgrading?
To go slightly off-topic, I would have upgraded a setup for CUDA... if I had an existing system to upgrade. The family desktop was an old socket-478 Pentium 4 with AGP+PCI.
This is exactly my point: by the time you are willing to upgrade your own PC, either the number of CPU pins or the power connector has changed, which severely hinders hw upgrade flexibility.

Assuming what Nvidia’s CEO said about the cost of Atom processors is true, how much cheaper can we get ION systems? Even a 10% reduction in overall price is sizeable.
I recall something similar concerning the Centrino platform: Intel forced OEM manufacturers to buy the CPU+wireless NIC together (I might be mistaken), or they could not put a Centrino emblem on the notebook, while Intel was simply promoting the Centrino brand. Forcing people to buy Atom+945GSE through pricing policy is at least unfair IMO. These are actually all part of a grand business plan: DON'T KILL IT, and DON'T LET IT OVERTAKE.

Quote
A small form-factor platform with decent HD playback performance under $150 (sans casing, peripherals etc)
Well, it is difficult to decide what to do when you have technical knowledge; you just end up buying more, which increases costs.

A simple example: some time ago I wanted to be able to use the internet and watch video on my HDTV. I decided that I needed at least a setup that would fit into a very small case, a wireless keyboard+mouse, wireless internet, a good HDD, an external DVD-ROM and RAM, which eventually spoiled my primary intention of creating a cheap rig for entertainment. Off-the-shelf solutions are cheaper :)

--

Quote
----------

And to try to get back on topic now, I have to admit that I’m not really feeling the pinch from Nvidia losing the license for the new DMI bus, despite all my ranting. The PCI-e hub and memory controller are on the Lynnfield die now (Intel calls it the uncore, to separate it from the processor), so the P55 chipset is nothing more than an overpriced, glorified “ICH10.5R” southbridge.

Even so, the ICH10 is an excellent southbridge, and I find it unlikely that Nvidia can make a southbridge that will blow the ICH10 away performance-wise.

Besides, now that the interconnect (link between chips) is reduced to DMI with 1GB/s bidirectional transfer rate (you’ll see this written as 2GB/s, which is just total up+down transfer rate), Nvidia loses a big part of their advantage. Their nForce chipset used HyperTransport for the chipset interconnect, giving their southbridge a maximum transfer rate of 8GB/s, about 4 times more than DMI. That’s an advantage gone if they’re going to start making DMI chipsets for the new Nehalem. (Nvidia has already announced that they will not be making system chipsets for Bloomfield, i.e. socket-1366 Core i7.)

Still, it would be great if Nvidia can make an alternative product, even if it’s just to compete with the horribly overpriced P55 chipset ($40 for P55 vs $3 for ICH10R, to serve the same function). Anything to make Lynnfield ownership cheaper ^^

I will need to read more articles to properly interpret this part of your post  :D

Offline Xtras

  • Member
  • Posts: 894
  • ~
Re: Nvidia DMI chipset development on hold
« Reply #12 on: October 11, 2009, 02:04:00 PM »
I am still betting that Intel's scheme is to drive Nvidia's stock low enough that they can buy out the company, or do a merger or something like that. Then we might have some awesome stuff rolling off the line.

I don't think Intel just wants to throw such a useful company under the bus for the heck of it.

Offline bloody000

  • Member
  • Posts: 1401
Re: Nvidia DMI chipset development on hold
« Reply #13 on: October 11, 2009, 11:58:44 PM »
I am still betting that Intel's scheme is to get Nvidia's stock to go low enough to where they can buy out the company, or do a merger or something like that. Then we might have some awesome stuff rolling off the line.

I don't think Intel just wants to throw such a useful company under the bus for the heck of it.
LOL what?
All you have to do is study it out. Just study it out.

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Quick guide to next generation processors and chipsets
« Reply #14 on: October 12, 2009, 02:33:49 AM »
Concerning hw upgrade flexibility, an owner's tendency to replace parts of a system is inversely proportional to its age (both the owner's and the system's) :) For example, why did you buy a totally new setup for CUDA instead of just upgrading?
To go slightly off-topic, I would have upgraded a setup for CUDA... if I had an existing system to upgrade. The family desktop was an old socket-478 Pentium 4 with AGP+PCI.
This is exactly my point, by the time you are willing to upgrade your own pc, either the number of cpu pins or the power cord have changed, which severely hinders the hw upgrade flexibility.
I certainly couldn't take the family desktop for my own toying around... but if, say, I already had a P35/AMD770 motherboard I would probably have used that instead of buying a new one. And the family desktop was at the end of its upgrade path (socket 478 stops at the Pentium 4 series).

Even now, when the family eventually upgrades to a better dual-core, I intend to use that E5300 for a home server or perhaps pair it with the Zotac 9300 for a micro-sized HTPC ::) As long as you're not at the end of an upgrade path, there are always options (unless your PC is 10 years old). I believe you can still find socket 478 Pentium 4s on sale at a lot of places for really cheap; useful if you're still on a socket 478 Celeron (such as a certain person named Josh on #bakabt).

You do have a point though, that most cheap PCs are on platforms that are already at or near the end of their upgrade life, so people tend to lose the upgrade capability if they buy a system like that. But additionally I think most people just aren't aware that they can upgrade the individual components of their PC — there is a common misunderstanding that upgrading means moving from a Dell Inspiron to a Dell XPS xD

I agree that simplification of choice would be really good. Over the past 2 years or so, the socket 775 platform has seen 2 generations of Intel chipsets (3-series and 4-series), as well as 2 or 3 generations of processors (LGA775 has hosted the Pentium 4, Pentium D, Core 2 Duo and Core 2 Quad processor families; I'm not even sure how to count Intel processor generations anymore). That's 3 manufacturing processes (90nm -> 65nm -> 45nm) and 3 different core variants (single-core, dual-core, quad-core), which is lots of upgrade room if you're a DIY PC builder who keeps passing down older parts to other people.

But all that choice would probably leave many a newcomer flustered (what is the difference between the E6000 and E8000 series, or between the Q6000, Q8000 and Q9000 series?). I often have to google/wiki lots of this stuff because there are just too many things to keep track of, so I can imagine how anyone new to the commodity hardware market would feel.

----------

[At this point I start digressing into the Core-i_ series processors, and what follows is essentially quick coverage of this generation of processors and chipsets, as well as a mini-rant]

I have tried to write a chipset identification guide before, and the experience was enough to put me off trying to write one for processors (which are undoubtedly far more prolific).

Unfortunately, consumer confusion doesn't seem likely to go away even with further architecture simplification. The Core i7 series is simple enough: Socket 1366, Core i7/i9 processor paired with an X58 chipset and ICH10 southbridge. But if we move to the Socket 1156 Core i3/i5/i7 series, things get really confusing; I think it's possibly even worse than with the Core 2 generation due to the optional integration of GPU on-die. Let's see...
  • We have the Socket 1156 Core i7 (not the same as the socket 1366 variants!), which is 4 cores 8 threads.
  • Then we have the Core i5, which is 4 cores 4 threads. Oh, but that is the quad-core variant, with no on-die GPU. A dual-core version with on-die GPU will follow when Intel moves to a 32nm process with an accompanying die-shrink. And, naturally, there is a mobile variant for laptops and low-power systems. So far so good.
  • Next, the Core i3. 2 cores 4 threads with on-die GPU. Nothing confusing... except existing Core 2 Duo/Quad processors may, at some point in the future, be rebranded Core i3 as well.
  • Finally, Pentium. Yeah, it seems Pentium has become the new cheap processor. I wonder if there'll be a 32nm Celeron. The 32nm Pentium is just an i3 processor with even less cache, and no hyperthreading.
Secret cheatsheet for Core-i_ processors.
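If it helps, here's my own condensation of the processor tiers above into a little table (my summary of this post, nothing official; the loop just checks which tiers get hyperthreading):

```python
# My own condensed cheatsheet of the socket-1156-era processor tiers
# described above. Counts are cores/threads; nothing here is an
# official Intel matrix.
lineup = {
    "Core i7 (LGA1156)": {"cores": 4, "threads": 8, "gpu_on_die": False},
    "Core i5 (quad)":    {"cores": 4, "threads": 4, "gpu_on_die": False},
    "Core i5 (dual)":    {"cores": 2, "threads": 4, "gpu_on_die": True},  # 32nm, still to come
    "Core i3":           {"cores": 2, "threads": 4, "gpu_on_die": True},
}

# Hyperthreading doubles threads per core on every tier except the quad i5:
for name, spec in lineup.items():
    ht = spec["threads"] == 2 * spec["cores"]
    print(f"{name}: {spec['cores']}C/{spec['threads']}T, HT={'yes' if ht else 'no'}")
```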

Still have your wits together? Let's move on to chipsets then.
  • X58 is the enthusiast chip, supports socket 1366 only, which means Core i7-9XX and Core i9 processors. Easy-peasy.
  • Next, P55, which is currently available. No on-board video interfaces (so no output for the on-die GPU of Core i5 and i3 processors). Supports SLI. Will be followed by the P57 chipset, which purportedly adds Braidwood support (think Intel Turbo Memory reborn) and "Quiet System Technology". Oh, by the way, nobody even knows if Braidwood is for real or not. Motherboard makers are playing it safe and including the physical Braidwood connector for now, although it may not necessarily be hooked up and/or supported in the BIOS.
  • Next, H55. Has FDI (Flexible Display Interface, i.e. video out). Does not support SLI (1 x16 PCI-e connector only). Will be followed by the H57 chipset, which adds Braidwood support, and a bunch of lingo nobody really understands (Remote PC Assist Technology, Rapid Storage Technology... wait, you mean H55 doesn't do Remote Connection and can't do read/write quickly? That's not right...).
  • And then Q57, the corporate version that nobody cares about (except IT purchase departments and maybe sysadmins).
Secret cheatsheet for Core-i_ chipsets.
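And the chipset bullets condense the same way into a lookup table (again my own summary of this post, not an official matrix; Q57 omitted, and the `needs` helper is purely for illustration):

```python
# My condensed cheatsheet of the socket-1156 chipsets listed above.
# Features are as described in this post; nothing here is official.
chipsets = {
    "P55": {"video_out": False, "sli": True,  "braidwood": False},
    "P57": {"video_out": False, "sli": True,  "braidwood": True},
    "H55": {"video_out": True,  "sli": False, "braidwood": False},
    "H57": {"video_out": True,  "sli": False, "braidwood": True},
}

def needs(video_out, sli):
    """Return the chipsets matching a buyer's two big questions."""
    return sorted(name for name, f in chipsets.items()
                  if f["video_out"] == video_out and f["sli"] == sli)

print(needs(video_out=True, sli=False))   # ['H55', 'H57']
print(needs(video_out=False, sli=True))   # ['P55', 'P57']
```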

Study them hard, there will be an exam on Wednesday. [j/k]

tl;dr -> You still have to pick between 2 chipsets (with/without onboard video), and 3 classes of processors. And if you find it hard to pick, remember that Core-iX --> X is as large as your epeen.

Are you confused? No? Good, because I have questions for you, questions I don't have the answers to.
  • What is the difference between the 2-core-4-thread Core i3 and the 2-core-4-thread Core i5?
  • Will Clarkdale (32nm Core i3/i5) be backward-compatible with H55?
  • The PCH chipset (generic name for the chip used in P55/P57/H55/H57) has 8 PCI-e lanes. Note that these lanes are on the PCH chip and are not the same as the PCI-e lanes on the processor die. Most motherboard manufacturers will use these for third-party chips (for more SATA ports, HD audio, LAN etc). Their use on mid/high-end P55 motherboards is obvious, but how will they be used on H55/H57 systems?
  • Why does the H55 have only 6 lanes of PCI-e instead of 8? Is it artificially limited, or is it a different chip? Or maybe those lanes are used for something else?
  • If current-generation Core 2 Duo/Quad processors are rebranded as Core i3, how will we differentiate them from the 32nm Core i3s?
  • Will Intel ever price the P55 reasonably?

This is even before we start asking questions about software and OS compatibility (does Braidwood work in Linux? Will [HD video/audio feature] work with the onboard chipset?). Of course, most of these details won't matter to the majority, but if you're one of the few who are trying to make the perfect HTPC or build a very tuned system and like to fuss over the details, then I'm sure you will find these questions intriguing.

To end this post, I'll put another teaser image:

That's an H55/H57 motherboard for Clarkdale (dual-core Core i5 with on-die GPU) with mini-ITX form factor. Yay for small form-factor!
« Last Edit: October 12, 2009, 03:05:44 AM by kureshii »

Offline Xtras

  • Member
  • Posts: 894
  • ~
Re: Nvidia DMI chipset development on hold
« Reply #15 on: October 12, 2009, 02:57:30 AM »
My advice to new people looking to pick out a processor:
Skip Wikipedia (which at times can get too technical) and go to the comparison charts on Intel's homepage. Intel actually has charts that compare similar processors. You will notice little cliques within these charts (for example, where the processors jump from 8MB to 12MB cache or something like that). At each of these cliques you will find a pretty sizeable jump in price. Then decide whether that new feature is something you feel will really pay off in terms of performance. Intel also includes pretty nice "tech explanations for idiots". They detail what FSB and L2 cache and all of those are in terms anybody can understand (that was how I originally figured it out, actually).
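The "find the cliques" trick is easy to picture in code. A toy sketch (the models are real series names from this thread, but the cache and price numbers are made up purely for illustration):

```python
# Toy version of the "price clique" trick described above: scan a
# spec/price chart and flag where the price jumps sharply, i.e. where
# a new feature tier begins. All numbers are illustrative, NOT
# Intel's actual prices.
cpus = [
    ("E7500", 3, 120),   # (model, L2 cache in MB, price in USD)
    ("E8400", 6, 165),
    ("Q8400", 4, 180),
    ("Q9550", 12, 270),
]

def price_jumps(chart, threshold=0.25):
    """Return adjacent model pairs where price rises by more than `threshold`."""
    jumps = []
    for (m1, _, p1), (m2, _, p2) in zip(chart, chart[1:]):
        if (p2 - p1) / p1 > threshold:
            jumps.append((m1, m2))
    return jumps

print(price_jumps(cpus))  # [('E7500', 'E8400'), ('Q8400', 'Q9550')]
```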

Lol, no matter what the reason, you should not be buying a Q6000 series at this point.
Go straight to the Q8000 series processors. Q6000 was more of Intel's trial at quad core technology. A first generation iPhone if you will. The Q8000 series is the resulting iPhone 3G. Q8000 and above feature 45nm cores, better overclocking (though this largely depends on your mobo), and quite a few fine tweaks that most people won't know of but which make a difference nonetheless.

Granted, I am running just fine with a Q6600, but it just has no advantage right now, whether in price or quality. Actually, a lot of dual-core processors beat it in performance now.

Offline K7IA

  • Member
  • Posts: 884
  • :)
Re: Nvidia DMI chipset development on hold
« Reply #16 on: October 12, 2009, 07:49:52 AM »
I lost a few brain cells reading (all of) this  :D

Quote
What is the difference between the 2-core-4-thread Core i3 and the 2-core-4-thread Core i5?
Oh, that is easy: the die package is different (http://www.techpowerup.com/img/09-06-24/47a.jpg).

Core i3 2/4 is an Arrandale with Socket mPGA-989
Core i5 2/4 is a Clarkdale with Socket 1156

Also, from what I can gather, nVidia could not get the chipset licence for the 1156-pin Core i3/i5/i7, which are considered the low-end Nehalems.

Quote

That's an H55/H57 motherboard for Clarkdale (dual-core Core i5 with on-die GPU) with mini-ITX form factor. Yay for small form-factor!

WOW, that is super!
« Last Edit: October 12, 2009, 11:47:02 AM by enginarc »

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Nvidia DMI chipset development on hold
« Reply #17 on: October 12, 2009, 11:55:47 AM »
Quote
What is the difference between the 2-core-4-thread Core i3 and the 2-core-4-thread Core i5?
oh that is easy, the die package is different (http://www.techpowerup.com/img/09-06-24/47a.jpg),

Core i3 2/4 is an Arrandale with Socket mPGA-989
Core i5 2/4 is a Clarkdale with Socket 1156
That's what I thought too... but then I realised that cheatsheet might not show all the processors available at launch. Heck, I'm not even sure if anyone knows all the processors that will be available at launch.

For instance, a 3-month old article from Anandtech (I've read similar from some other sites as well, but since I don't keep track of all my sources I can't list them now) suggests there may be socket-1156 Core i3s. And they supposedly come in 2-core and 4-core variants. That clearly suggests the possibility of a Clarkdale Core i3, which doesn't sound impossible for Intel to produce... but then again I don't even know what the difference between them will be so I can't say for sure.

In all likelihood, i3 just might be i5 with less cache and artificially limited clock speeds (and no turbo feature). I guess we'll find out in 2010.

Quote
Also from what I can gather, nVidia could not get the chipset licence for 1156 pin Core i3/i5i/i7 which is considered to be low-end Nehalems.
Yep, that's what the first post is about :)

Offline K7IA

  • Member
  • Posts: 884
  • :)
Re: Nvidia DMI chipset development on hold
« Reply #18 on: October 12, 2009, 12:59:54 PM »
OK, so now I am speculating, assuming what I have read so far is correct.

ref.1 http://www.techpowerup.com/img/09-06-24/47a.jpg

Core i3 is actually a Core2 Duo (ref.1); it replaces the Core2 Duo brand, is compatible with P45-chipset mobos, and was created to satisfy the likes of me (I have a P45-chipset mobo).

Core i3 2/4 in mPGA-989 was created for LV mobiles only, and has a lower profile due to its packaging.

Core i5 is the midget between i3 and i7, for those who can't afford an i7 CPU but want something upgradable to i7, and who have money for a P55 mobo but ignore the fact that the ACTUAL i7 is a 1366-pin CPU.

As a result, Intel blows away nVidia chipsets just by changing the interconnect between components, without changing the CPU architecture too much (just like Intel's socket/slot CPU switch around 1999?). And I didn't see any low-end Nehalems (Core i3/i5/i7) in the diagrams that don't utilise a DMI link, and no more Core 2s or Quads, which means goodbye to nVidia chipsets.


Offline kyanwan

  • Member
  • Posts: 1880
  • 口寄せ・穢土転生!
Re: Nvidia DMI chipset development on hold
« Reply #19 on: October 13, 2009, 10:27:15 AM »

An HD-capable setup like that could one day be available for less than $400.

That -

I don't think I could ever accept something like that as a PC - but maybe as a "smart" terminal, it might be damn frickin cool. 

Nothing.