Intel Makes 22nm 3-D Tri-Gate Tech for Ivy Bridge


TMRNetShark:

--- Quote from: newy on May 08, 2011, 03:52:21 AM ---You say it's such a simple concept. Do you work in the semiconductor industry? No? Then better shut up? :) I guess even the simplified account of silicon chip manufacturing was still too complicated. Less power is only possible if one knows how to optimise the production, and that is only possible if one knows how production works. What I listed are just rough steps/milestones in the production... and it starts with growing the silicon crystal and ends with separating the individual dies from the wafer to be put in a package. Do you know how long that takes? No? Again, read instead of posting. The steps I mentioned alone already take several days to weeks. Now add the new 3D structure on top of that: how much longer do the existing steps take, and how many steps need to be added? Intel knows, and has only now managed to get them ready for market.

As I worked (or rather, was an apprentice) in that industry, I'm dying to know what the production steps look like, but of course I will never find out, as I will never be employed at Intel (nor do I have any intention of working in that industry again).

--- End quote ---

Why are you talking about production of chips? Not once in this entire thread did I start talking about the production of 3D chips. I'm talking about the chip being used in a consumer context. The less power that CPU needs, the less voltage it requires (you know, that thing called thermodynamics?), which in turn means less wear and tear on those transistors. Learn to read people's posts please. ::)
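For what it's worth, the power-voltage link being argued about here is just the textbook CMOS dynamic-power approximation, P ≈ C·V²·f. A quick back-of-envelope sketch (all capacitance, voltage, and frequency numbers below are invented placeholders, not Intel figures):

```python
# Textbook dynamic (switching) power of CMOS logic: P ~ C * V^2 * f.
# Values are made-up placeholders for illustration only.

def dynamic_power(c_farads, v_volts, f_hertz):
    """Approximate dynamic power from capacitance, supply voltage, frequency."""
    return c_farads * v_volts ** 2 * f_hertz

base = dynamic_power(1e-9, 1.2, 3e9)   # hypothetical part at 1.2 V
lower = dynamic_power(1e-9, 1.0, 3e9)  # same chip dropped to 1.0 V
print(f"power drops by {100 * (1 - lower / base):.0f}%")  # ~31% from V^2 alone
```

The squared voltage term is why even a small drop in supply voltage pays off so much in power (and heat).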

Freedom Kira:

--- Quote from: TMRNetShark on May 08, 2011, 04:38:52 AM ---Why are you talking about production of chips? Not once in this entire thread did I start talking about production of 3D chips.

--- End quote ---

Huh, explain this here:


--- Quote from: TMRNetShark on May 06, 2011, 12:19:41 AM ---ON TOPIC: This will get Intel ahead of the game, but I'm wondering when this technology will hit GPUs and start lowering the amount of heat that is emitted from computers or any other technology that uses transistors. This will definitely increase the usable life of CPUs... my question is, why didn't they think of this before? It seems like a simple enough concept, but I guess the technology to make such small circuits hadn't caught up yet.

--- End quote ---

--


--- Quote from: TMRNetShark on May 08, 2011, 04:38:52 AM ---I'm talking about the chip being used in a consumer context. The less power that CPU needs, the less voltage it requires (you know, that thing called thermodynamics?), which in turn means less wear and tear on those transistors. Learn to read people's posts please. ::)

--- End quote ---

In that case, you are off-topic, yet claiming in the previously quoted post that you were "ON TOPIC." Hmm. It'd be nice if you stopped contradicting yourself.

kureshii:

--- Quote from: TMRNetShark on May 06, 2011, 12:19:41 AM ---ON TOPIC: This will get Intel ahead of the game, but I'm wondering when this technology will hit GPUs and start lowering the amount of heat that is emitted from computers or any other technology that uses transistors. This will definitely increase the usable life of CPUs... my question is, why didn't they think of this before? It seems like a simple enough concept, but I guess the technology to make such small circuits hadn't caught up yet.

--- End quote ---


--- Quote from: kureshii on May 07, 2011, 05:07:51 AM ---TSMC and GF have announced the FinFET process only for the 14nm node, since it seems “FinFETs require changes in circuit design (especially analog), tools and IP throughout the whole ecosystem”, and their customers need time to make those changes in their own workflow.
--- End quote ---

Your question has already been answered anyway. 14nm is in 2014 on GF’s roadmap, and in “the foreseeable future” for TSMC’s. Since those two fabs are where most gaming GPUs are manufactured (neither Nvidia nor AMD has its own manufacturing facilities), that’s roughly when you can expect to hear news about their production. Of course, things aren’t that simple, since the switch to tri-gate transistors will require changes in chip design (as mentioned above).

[edit] Silly me, I overlooked one really obvious thing: we will be seeing 22nm tri-gate transistors in Intel (integrated) GPUs on Ivy Bridge, so expect that as early as 2012 :)



And what do you mean by “why haven’t they thought of this before?”

--- Quote from: Intel Newsblog ---Intel will introduce a revolutionary 3-D transistor design called Tri-Gate, first disclosed by Intel in 2002, into high-volume manufacturing at the 22-nanometer (nm) node in an Intel chip codenamed "Ivy Bridge."
--- End quote ---
That was when they disclosed it, by the way; the research that went into it probably took place years earlier, and likewise the idea that led to said research. Someone already thought of this before you even knew what a processor core was! Newy’s right; his mention of chip production is relevant, even if you don’t see its relevance. Moving a chip to a smaller process node is not something one can do at the snap of one’s fingers.


--- Quote from: TMRNetShark on May 08, 2011, 04:38:52 AM ---Not once in this entire thread did I start talking about production of 3D chips.
--- End quote ---
Nobody even mentioned 3D chips — these are “3D” transistors we’re talking about, being used in a planar chip design. 3D chips are yet to be announced on roadmaps . . .

kitamesume:
If they ever successfully use FinFETs on GPUs, we'd be seeing mainstream cards rated at 20 W while flagships would be rated at 100 W, unlike today's, which are mainly 60 W-200 W.

Well anyway, that's 3-5 years of waiting, so I've got years to save up money. I just wish they'd hurry up and quit clinging to unnecessary greediness.

kureshii:

--- Quote from: kitamesume on May 08, 2011, 12:43:39 PM ---If they ever successfully use FinFETs on GPUs, we'd be seeing mainstream cards rated at 20 W while flagships would be rated at 100 W, unlike today's, which are mainly 60 W-200 W.
--- End quote ---
I wouldn’t count on that. The trend has been for power consumption of high-end GPUs to keep heading upwards (even with limited overclocking), while mainstream GPUs haven’t exactly seen their power consumption numbers drop either.

Something else is going to occupy that 20 W band: integrated graphics and low-end budget cards. IGPs already typically consume ~16 W under load (a very rough figure); Sandy Bridge graphics already consumes much more than that, something like >30 W, from quick calculations based on the graphs below:


* Load figures from running Bulletstorm at 1440x900. Graphs from http://techreport.com/articles.x/20728/6.
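That “quick calculation” is just a load-minus-idle delta on whole-system wall power, attributing the difference to the graphics load. A minimal sketch of the idea; the wattage figures below are invented placeholders, not TechReport’s measurements:

```python
# Rough IGP power estimate from whole-system wall-power readings:
# subtract the desktop-idle draw from the gaming-load draw and
# attribute the difference to graphics. All watt figures are
# invented placeholders for illustration.

idle_watts = 70.0   # hypothetical system sitting at the desktop
load_watts = 105.0  # hypothetical system running a game

gpu_estimate = load_watts - idle_watts
print(f"IGP load estimate: ~{gpu_estimate:.0f} W")  # ~35 W
```

It's crude (the CPU also works harder under a gaming load, inflating the delta), which is why such figures should only ever be quoted as rough ballparks.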

And flagship graphics cards at 100 W? That sure is a nice dream. Since the Nvidia GT200 (my tracking of GPU trends doesn't extend much before that), flagships have never dropped below the 200 W mark; most creep as close as they can to 300 W, the PCIe 2.0 maximum power spec. PCIe 3.0 doesn't lower that spec, so I wouldn't expect the trend to change.

Smaller transistors use less power, but you can cram more of them into the same die area, so power consumption of the chip doesn't always decrease.
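That trade-off can be illustrated with deliberately round, made-up numbers (a sketch of the arithmetic, not real process data):

```python
# Hypothetical illustration: halve per-transistor power at the new node,
# but pack twice as many transistors into the same die area, and total
# chip power stays exactly where it was. All numbers are invented.

per_transistor_power = 1.0       # arbitrary units, old node
transistor_count = 1_000_000

new_per_transistor = per_transistor_power * 0.5  # smaller, more efficient
new_count = transistor_count * 2                 # same die area, denser

old_total = per_transistor_power * transistor_count
new_total = new_per_transistor * new_count
print(old_total == new_total)  # True: the chip's power budget is unchanged
```

In practice designers usually spend the efficiency gain on more performance (more units, higher clocks) rather than a lower power bill, which matches the flat-to-rising TDP trend described above.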
