Intel Makes 22nm 3-D Tri-Gate Tech for Ivy Bridge
kitamesume:
Meh, I'm saying today's mainstream equivalent would be like 20 watts and the flagship equivalent would be 100 watts. Well, it was a nice dream; I'd actually want to own a GTX 580 equal that uses only 100 watts =P
If you look at it that way: the 9500 GT consumed 50 watts while delivering 134.4 GFLOPS of processing power, and back in those olden days it was considered mainstream. The GT 5xx equivalent of that would be the newly released GT 520, at 29 watts with 155.5 GFLOPS. If they keep this up, we'd be seeing those as today's so-called low end consuming no more than 15 watts in two years or so.
The GTX 460 SE had something like a 150-watt rating; the GTX 5xx equivalent would be the GTX 550 Ti, and while it's slightly slower, it still got 34 watts shaved off. Just think of the next GTX 650 having another 30 watts or so shaved off, or at least a GT 640 performing at GTX 550 level while staying under 100 watts =P
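To put a number on the perf-per-watt trend in the post above, here's a quick Python sketch; the GFLOPS and wattage figures are the ones quoted in this thread, not independently verified:

```python
# Rough performance-per-watt comparison.
# (card, GFLOPS, watts) -- figures as posted in this thread, not verified.
cards = [
    ("9500 GT", 134.4, 50),
    ("GT 520", 155.5, 29),
]

for name, gflops, watts in cards:
    # GFLOPS per watt: higher means more work done per unit of power.
    print(f"{name}: {gflops / watts:.2f} GFLOPS/W")
```

With those numbers, the GT 520 comes out around 5.4 GFLOPS/W versus roughly 2.7 for the 9500 GT, about double the efficiency a couple of generations later.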
kureshii:
The anticipation is always sweeter than the object of desire :) Were you this excited when the GTX 550 Ti, on TSMC’s 40nm, came out and displaced the GTX200 series, on TSMC’s 65nm? Wait, did you even remember that the GTX260 once existed when you heard the 560 Ti release announcement?
Even in 3 years, a graphics card quickly becomes history. You wouldn't even remember the 550 Ti by the time FinFETs-on-GPUs come out; you’d be going all “oooh 3D chips on 10nm! Can't wait to see mainstream hit 20W and flagship cards drop to 100W.”
TMRNetShark:
--- Quote from: Freedom Kira on May 08, 2011, 05:46:41 AM ---
--- Quote from: TMRNetShark on May 08, 2011, 04:38:52 AM ---Why are you talking about production of chips? Not once in this entire thread did I start talking about production of 3D chips.
--- End quote ---
Huh, explain this here:
--- Quote from: TMRNetShark on May 06, 2011, 12:19:41 AM ---ON TOPIC: This will get Intel ahead of the game, but I'm wondering when this technology will hit GPUs and start lowering the amount of heat that is emitted from computers or any other technology that uses transistors. This will definitely increase the usable life of CPUs... my question is, why didn't they think of this before? It seems like a simple enough concept, but I guess the technology to make such small circuits hadn't caught up yet.
--- End quote ---
--
--- Quote from: TMRNetShark on May 08, 2011, 04:38:52 AM ---I'm talking about the chip being used in a consumer fashion. Less power that the CPU needs will mean less voltage (you know, that thing called thermodynamics?), which in turn will mean less wear and tear on those transistors. Learn to read people's posts, please. ::)
--- End quote ---
In that case, you are off-topic, yet claiming in the previously quoted post that you were "ON TOPIC." Hmm. It'd be nice if you stopped contradicting yourself.
--- End quote ---
*facepalm*
Way to make yourself look so, so stupid.
--- Quote ---ON TOPIC: This will get Intel ahead of the game, but I'm wondering when this technology will hit GPUs and start lowering the amount of heat that is emitted from computers or any other technology that uses transistors.
--- End quote ---
GPUs are used in consumer settings for gaming or graphic arts... no production talk there...
--- Quote ---This will definitely increase the usable life of CPUs... my question is, why didn't they think of this before?
--- End quote ---
Once again, no production talk... only talking about the lifespan. The article even stated that lower voltages will be used, which is ON topic.
--- Quote ---It seems like a simple enough concept, but I guess the technology to make such small circuits hadn't caught up yet.
--- End quote ---
The means of making these small 3D transistors just hadn't caught up to the actual technology of 3D transistors... which is in accordance with the article saying Intel is only just starting mass production even though the concept is 10-20 years old. Yet again, I only talked about the mass production of it, not the developmental stages.
Now please, before you embarrass yourself even more, stay on topic? I find it paradoxical that someone tries to insult me and tell me I'm off topic when clearly I was on topic, while they themselves did not mention a single thing about the topic.
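For what it's worth, the voltage-vs-power point being argued over in the quotes above follows the standard CMOS dynamic-power approximation, P ≈ C·V²·f: power scales with the square of voltage. A minimal sketch, using illustrative capacitance/clock numbers that are not from the article:

```python
# Standard CMOS dynamic-power approximation: P ~ C * V^2 * f.
# The capacitance and clock below are illustrative placeholders,
# not figures from Intel's announcement.
def dynamic_power(c_farads, v_volts, f_hz):
    return c_farads * v_volts**2 * f_hz

base = dynamic_power(1e-9, 1.0, 3e9)   # 3 W at 1.0 V
lower = dynamic_power(1e-9, 0.9, 3e9)  # ~2.43 W at 0.9 V
print(f"power saved: {1 - lower / base:.0%}")  # -> 19%
```

The square term is why even a modest 10% voltage reduction cuts dynamic power by roughly 19%, which is what makes lower operating voltage such a big deal for heat and transistor wear.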
NaRu:
When Intel made its announcement about 3D transistors, the tech was already ready to be used in its new set of CPUs. The development stages of future chips aren't even mentioned in Intel's post. Intel is well known for not talking about anything until they are ready to sell it in the consumer market.
kitamesume:
Well yeah xD, or what about "can't wait for my electricity bill to drop by 90%"? Now that's a whopper. Or maybe they should start standardizing 90%-efficiency PSUs. By then, I wouldn't care even if my GPU hogged more electricity than my air conditioner.
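The PSU-efficiency point above can be put in numbers too. A quick sketch, assuming a hypothetical fixed 300 W DC load (the load figure is made up for illustration, not from the thread):

```python
# Wall power drawn for a given DC load at different PSU efficiencies.
# The 300 W load is an assumed, illustrative figure.
def wall_power(dc_load_w, efficiency):
    # The PSU must draw more from the wall than it delivers to the parts.
    return dc_load_w / efficiency

load = 300  # watts of DC load (assumed)
for eff in (0.70, 0.80, 0.90):
    print(f"{eff:.0%} efficient PSU: {wall_power(load, eff):.0f} W at the wall")
```

Going from a 70% to a 90% efficient unit at that load drops the wall draw from about 429 W to about 333 W, so better PSUs alone shave a meaningful chunk off the bill even before the GPU itself gets more efficient.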