Author Topic: nVidia's GTX 600 Series  (Read 4037 times)

Offline Tatsujin

  • Box Fansubs
  • Member
  • Posts: 15632
    • Otakixus
nVidia's GTX 600 Series
« on: March 23, 2012, 03:04:46 AM »
GTX 680

I want ...
« Last Edit: May 03, 2012, 06:21:17 PM by Tatsujin »


¸¸,.-~*'¨¨¨™¤¦ Otakixus ¦¤™¨¨¨'*~-.,¸¸

Online kitamesume

  • Member
  • Posts: 7223
  • Death is pleasure, Living is torment.
Re: nVidia's GTX 680
« Reply #1 on: March 23, 2012, 05:08:47 AM »
idk, based on reviews from sites i think nvidia failed in the performance department, though their performance/watt ratio improved drastically.

well, maybe i was just used to seeing nvidia's performance improvements going >20% in most of their newer models.

though i'm curious to know what the HD7770's equivalent will be, how much it'll cost and what TDP it'll have.
i'd like an HD7770 equal to be <$130 while hovering under a 70W TDP.
« Last Edit: March 23, 2012, 05:10:39 AM by kitamesume »

Haruhi Dance | EMO | OLD SETs | ^ I know how u feel | Click sig to Enlarge

Offline Tatsujin

  • Box Fansubs
  • Member
  • Posts: 15632
    • Otakixus
Re: nVidia's GTX 680
« Reply #2 on: March 23, 2012, 05:30:59 AM »
idk, based on reviews from sites i think nvidia failed in the performance department, though their performance/watt ratio improved drastically.

well, maybe i was just used to seeing nvidia's performance improvements going >20% in most of their newer models.

though i'm curious to know what the HD7770's equivalent will be, how much it'll cost and what TDP it'll have.
i'd like an HD7770 equal to be <$130 while hovering under a 70W TDP.
I seriously feel like going with that Aquarium + Baby Oil setup and dumping my computer inside it. I don't like heat; I want this whole thing at 10°C or 15°C - no more than that.

I'd probably want a GTX 690 for the dual-GPU - just one card would be enough.


¸¸,.-~*'¨¨¨™¤¦ Otakixus ¦¤™¨¨¨'*~-.,¸¸

Offline vuzedome

  • Member
  • Posts: 6374
  • Reppuzan~!
  • Awards Winner of the BakaBT Mahjong tournament 2010
    • GoGreenToday
Re: nVidia's GTX 680
« Reply #3 on: March 23, 2012, 07:12:32 AM »
idk, based on reviews from sites i think nvidia failed in the performance department, though their performance/watt ratio improved drastically.

well, maybe i was just used to seeing nvidia's performance improvements going >20% in most of their newer models.

though i'm curious to know what the HD7770's equivalent will be, how much it'll cost and what TDP it'll have.
i'd like an HD7770 equal to be <$130 while hovering under a 70W TDP.
The price itself is more than appropriate.
And well, it's not like the jump from 2xx to 4xx; it's just 5xx to 6xx, so I doubt it will be that big of an improvement.
What I can't wait for is TXAA - man, delivering 8x MSAA quality at half the performance cost, DAMN. And even the 400s and 500s will benefit.
BBT Ika Musume Fan Club Member #000044   
Misaka Mikoto Fan Club Member #000044
BBT Duke Nukem Fan Club Member #0000002

Offline GoGeTa006

  • Member
  • Posts: 6863
  • The fate of destruction is also the joy of Rebirth
    • Anime Planet listing
Re: nVidia's GTX 680
« Reply #4 on: March 23, 2012, 08:32:45 AM »

Online kitamesume

  • Member
  • Posts: 7223
  • Death is pleasure, Living is torment.
Re: nVidia's GTX 680
« Reply #5 on: March 23, 2012, 08:51:04 AM »
@vuze

no, i didn't mean that. their gaming performance did improve a little bit, but their compute performance dropped like a rock... which is just too ridiculous. even if their target was to reduce power consumption, they shouldn't have crippled performance that much - or at least they should've named the damn thing GTX670[Ti?] instead.

what i think would've been a better move is to simply die-shrink fermi to 28nm, add some more ROPs and cores, and adjust the clocks accordingly. the die-shrink alone should cut a favorable amount of wattage while improving or maintaining the performance level.
« Last Edit: March 23, 2012, 08:55:09 AM by kitamesume »

Haruhi Dance | EMO | OLD SETs | ^ I know how u feel | Click sig to Enlarge

Offline Tiffanys

  • Member
  • Posts: 7738
  • real female girl ojō-sama
Re: nVidia's GTX 680
« Reply #6 on: March 23, 2012, 09:15:34 AM »
I'm excited about the hair. I'd love to see video games be able to do that instead of just painting it on a 2D canvas.

Offline vuzedome

  • Member
  • Posts: 6374
  • Reppuzan~!
  • Awards Winner of the BakaBT Mahjong tournament 2010
    • GoGreenToday
Re: nVidia's GTX 680
« Reply #7 on: March 23, 2012, 10:26:48 AM »
I forgot to mention that it's using the GK104 core, and you know what that means: this isn't the big daddy yet. But that's just speculation.
But do enlighten me, what is texture decompression and why do you use it to judge computational performance?
I'm really in the dark on that; google didn't help much.
BBT Ika Musume Fan Club Member #000044   
Misaka Mikoto Fan Club Member #000044
BBT Duke Nukem Fan Club Member #0000002

Offline Tatsujin

  • Box Fansubs
  • Member
  • Posts: 15632
    • Otakixus
Re: nVidia's GTX 680
« Reply #8 on: March 23, 2012, 11:46:42 AM »
I'm excited about the hair. I'd love to see video games be able to do that instead of just painting it on a 2D canvas.
Agree. I want that.

@vuze

no, i didn't mean that. their gaming performance did improve a little bit, but their compute performance dropped like a rock... which is just too ridiculous. even if their target was to reduce power consumption, they shouldn't have crippled performance that much - or at least they should've named the damn thing GTX670[Ti?] instead.

what i think would've been a better move is to simply die-shrink fermi to 28nm, add some more ROPs and cores, and adjust the clocks accordingly. the die-shrink alone should cut a favorable amount of wattage while improving or maintaining the performance level.
Wow, that does suck, though that's on a single game specifically. Maybe it needs an update soon? Most websites say nVidia decided to use a different architecture for the 600 series. Meh, I want to see more benchmarks. I'll lose hope if this series fails hardcore.

I'm getting these results across several games: Radeon HD 7970 vs GTX 680 Performance on AMD Bulldozer FX-8150 CPU Review - Linus Tech Tips.


¸¸,.-~*'¨¨¨™¤¦ Otakixus ¦¤™¨¨¨'*~-.,¸¸

Online kitamesume

  • Member
  • Posts: 7223
  • Death is pleasure, Living is torment.
Re: nVidia's GTX 680
« Reply #9 on: March 23, 2012, 01:08:34 PM »
texture decompression wasn't my point. my point was that the GTX680 is hardly any faster than the GTX580, and sometimes even slower.

the fact that it's sometimes slower than the GTX580 means they failed somewhere in there.

some other stuff i found

but hey, they did improve their performance/watt by a huge margin. though they could've done better if they hadn't changed anything and just transitioned the whole thing to 28nm.

though i've got my eye on the GTX 650 Ti. if it exists it should be on par with the HD7770, and if it's cheaper and needs fewer watts to power up then it'll be a deal.

so i've got two guesses for when the time comes, and either way it's a win-win scenario.
1) GTX 650 Ti is on par with the HD7770, costs less and uses fewer watts.
2) GTX 650 Ti fails my expectations, BUT the HD7770 drops in price.
« Last Edit: March 23, 2012, 01:13:35 PM by kitamesume »

Haruhi Dance | EMO | OLD SETs | ^ I know how u feel | Click sig to Enlarge

Offline Lupin

  • Member
  • Posts: 2169
Re: nVidia's GTX 680
« Reply #10 on: March 23, 2012, 01:29:16 PM »
But do enlighten me, what is texture decompression and why do you use it to judge computational performance?
I'm really in the dark on that; google didn't help much.
Since this wasn't answered: texture compression lets you cram more textures (or higher-detail ones) into your video RAM. Most games compress their textures to save space and make them fit in VRAM. Imagine playing a game at Eyefinity resolutions with uncompressed textures - all those textures won't fit in the 2GB (or more) of VRAM on your high-end video card, so the game has to read from your drive to load more textures, and it stutters while it waits for them. Texture decompression is computation-heavy because it's just like decompressing archives.
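A toy back-of-the-envelope (purely illustrative numbers, assuming 32-bit RGBA versus a 4:1 block-compressed format such as DXT5 at 1 byte per pixel) shows why that VRAM math matters:

```python
def texture_mem_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM footprint of one texture, in MB.
    A full mipmap chain adds roughly one third on top of the base level."""
    size = width * height * bytes_per_pixel
    if mipmaps:
        size = size * 4 // 3
    return size / (1024 * 1024)

# Hypothetical game with 1000 textures at 2048x2048:
uncompressed = 1000 * texture_mem_mb(2048, 2048)                    # 32-bit RGBA
compressed = 1000 * texture_mem_mb(2048, 2048, bytes_per_pixel=1)   # 4:1 block compression

print(f"uncompressed: {uncompressed:.0f} MB")  # ~21 GB - far beyond a 2 GB card
print(f"compressed:   {compressed:.0f} MB")    # ~5.3 GB - still big, but 4x smaller
```

Even compressed, the working set can exceed VRAM, which is why games stream textures from disk; uncompressed, it isn't even close.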

Offline vuzedome

  • Member
  • Posts: 6374
  • Reppuzan~!
  • Awards Winner of the BakaBT Mahjong tournament 2010
    • GoGreenToday
Re: nVidia's GTX 680
« Reply #11 on: March 23, 2012, 04:36:34 PM »
Thanks, I pretty much understood what it was after reading about it on wikipedia.
kitamesume, regardless of those benchmark tests, most of us buy graphics cards for their main function: playing games.
And most of the gaming results I see show the GTX 680 performing better than its predecessor.
The GTX 680 is a new card, so it'll take one or more driver updates, plus mainstream software picking it up, before we start seeing its true potential.
BBT Ika Musume Fan Club Member #000044   
Misaka Mikoto Fan Club Member #000044
BBT Duke Nukem Fan Club Member #0000002

Online kitamesume

  • Member
  • Posts: 7223
  • Death is pleasure, Living is torment.
Re: nVidia's GTX 680
« Reply #12 on: March 23, 2012, 05:13:12 PM »
i think you missed my point. i'm saying they could've done better on performance if they'd left the fermi design alone and just gone for the 28nm die-shrink. it could've taken them fewer months to develop since the design was already there, aside from a few tweaks so it could benefit from the 28nm process.

the switch to 28nm should already have a lot of bonus effects even without changing the core design: the die-shrink means fewer watts per unit of performance, plus larger headroom on core clocks, and since the die is smaller they could fit more cores.
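A toy first-order estimate of that shrink (idealized scaling - real 40nm-to-28nm processes don't scale this cleanly, so treat the numbers as illustrative only):

```python
# First-order (idealized) die-shrink estimate: transistor density scales
# with the inverse square of the feature size, so the same design takes
# roughly half the area - or fits ~2x the transistors in the same area.
old_node, new_node = 40.0, 28.0  # nm: Fermi-era process vs the new node

area_scale = (new_node / old_node) ** 2  # ~0.49: same chip, about half the area
density_gain = 1 / area_scale            # ~2.04x transistors per unit area

print(f"area of a straight shrink: {area_scale:.2f}x")
print(f"transistor budget in the same area: {density_gain:.2f}x")
```

That extra transistor budget is the headroom the poster is talking about: more cores and ROPs at the same die size, before any architectural changes.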

then they could've started the real overhaul with the GT? 7## series. going that path, they could've still been first to the 28nm fab, had time to take a price hike and bring it down later when competition rises, and ultimately had the time to develop the GT? 7## series properly.

as of now i see AMD as the better pick since they have the overall performance edge, unless you're craving PhysX. the only chance nvidia has to take the top-dog crown now is to fine-tune their drivers, though AMD are fine-tuning theirs as well. also, if they can get their lower-tier cards to perform well it shouldn't hurt them too much.

also, if AMD can think fast, and if it's possible, they could drop the prices of the HD7970 and HD7950 by $50 or so, and that would hurt nvidia a lot.
the reason it would hurt: nvidia just released their card, so if AMD applies pressure now, nvidia's profit from it would be severely crippled. and since AMD had a head start of about 3 months and already profited from it, the move wouldn't hurt AMD much.
« Last Edit: March 23, 2012, 05:46:04 PM by kitamesume »

Haruhi Dance | EMO | OLD SETs | ^ I know how u feel | Click sig to Enlarge

Offline TMRNetShark

  • Member
  • Posts: 4134
  • I thumps up my own youtube comments.
Re: nVidia's GTX 680
« Reply #13 on: March 23, 2012, 05:36:39 PM »
They have a water-cooled one? What what?

So theoretically now (I'm not up to date on all the high-end cards), you can run a monster rig in relative silence (except for the hard drive and PSU)?

*Now patiently awaits a water-cooled PSU*

On topic: I think this card isn't as good as it could have been. I remember the old GeForce 6800s. Those were definitely a more significant leap forward in graphics than the GTX 680 (regression in naming? XD)... At least Nvidia knows how to get their drivers right *cough ATI cough*

Offline mgz

  • Box Fansubs
  • Member
  • Posts: 10561
Re: nVidia's GTX 680
« Reply #14 on: March 23, 2012, 10:30:41 PM »
idk, based on reviews from sites i think nvidia failed in the performance department, though their performance/watt ratio improved drastically.

well, maybe i was just used to seeing nvidia's performance improvements going >20% in most of their newer models.

though i'm curious to know what the HD7770's equivalent will be, how much it'll cost and what TDP it'll have.
i'd like an HD7770 equal to be <$130 while hovering under a 70W TDP.
I seriously feel like going with that Aquarium + Baby Oil setup and dumping my computer inside it. I don't like heat; I want this whole thing at 10°C or 15°C - no more than that.

I'd probably want a GTX 690 for the dual-GPU - just one card would be enough.
Running your computer submerged in oil doesn't keep your temps really low. What it does do is make everything incredibly temperature-stable: it sits at an average temp and fluctuates very little under load. Radiators like those used in water cooling aren't as effective at dissipating heat from the oil.
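A rough thermal-mass comparison (textbook ballpark figures for mineral oil and air; the tank size and GPU wattage are made-up illustrative numbers) shows why the oil damps fluctuations instead of lowering temps:

```python
# Energy needed to raise a cooling medium by 1 degree C: E = mass * specific heat.
# Mineral oil: density ~0.85 kg/L, specific heat ~1.67 kJ/(kg*K)
# Air:         density ~0.0012 kg/L, specific heat ~1.0 kJ/(kg*K)
oil_litres = 40.0
oil_heat_capacity_kj = oil_litres * 0.85 * 1.67       # kJ per degree, whole tank
case_air_litres = 40.0
air_heat_capacity_kj = case_air_litres * 0.0012 * 1.0

gpu_watts = 200.0
# Seconds for a 200 W card to warm each medium by 1 degree (no heat removed):
oil_secs = oil_heat_capacity_kj * 1000 / gpu_watts
air_secs = air_heat_capacity_kj * 1000 / gpu_watts
print(f"oil tank: {oil_secs:.0f} s per degree")   # minutes - load spikes barely register
print(f"case air: {air_secs:.2f} s per degree")   # fractions of a second
```

The huge thermal mass smooths out load spikes, but steady-state temperature still depends entirely on how fast the radiator sheds heat, which is mgz's point.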

Offline costi

  • Member
  • Posts: 1125
  • [tada.wav]
Re: nVidia's GTX 680
« Reply #15 on: March 25, 2012, 10:58:43 AM »
idk, based on reviews from sites i think nvidia failed in the performance department, though their performance/watt ratio improved drastically.

I wouldn't call it a fail: http://www.purepc.pl/karty_graficzne/test_geforce_gtx_680_vs_radeon_hd_7970_pojedynek_kart_dla_graczy

Online kitamesume

  • Member
  • Posts: 7223
  • Death is pleasure, Living is torment.
Re: nVidia's GTX 680
« Reply #16 on: March 25, 2012, 01:40:21 PM »
can't you get my point? they revamped the fermi design, shrunk it to 28nm, and priced it at $500.
what do we get? a GPU with inconsistent performance, caused by who knows what - maybe bad drivers, or whatever.

if they'd stuck to the original fermi design, tweaked it for the 28nm fab process, and added some more cores since the shrink leaves space for them, they could've gotten a huge performance margin while maintaining power consumption or even lowering it.
the thing i've been saying is that they did something wrong in there. whether it's in the revamp or the drivers, who knows.

Haruhi Dance | EMO | OLD SETs | ^ I know how u feel | Click sig to Enlarge

Offline vuzedome

  • Member
  • Posts: 6374
  • Reppuzan~!
  • Awards Winner of the BakaBT Mahjong tournament 2010
    • GoGreenToday
Re: nVidia's GTX 680
« Reply #17 on: March 25, 2012, 01:57:14 PM »
idk, based on reviews from sites i think nvidia failed in the performance department, though their performance/watt ratio improved drastically.

I wouldn't call it a fail: http://www.purepc.pl/karty_graficzne/test_geforce_gtx_680_vs_radeon_hd_7970_pojedynek_kart_dla_graczy
Haven't you noticed I've given up trying to argue with kitamesume?
Better not to waste energy on an argument that will lead nowhere.
BBT Ika Musume Fan Club Member #000044   
Misaka Mikoto Fan Club Member #000044
BBT Duke Nukem Fan Club Member #0000002

Offline costi

  • Member
  • Posts: 1125
  • [tada.wav]
Re: nVidia's GTX 680
« Reply #18 on: March 25, 2012, 03:43:09 PM »
Yeah, I guess I'd better quit as well... if a card that loses only to CF/SLI configurations in practically all games is a fail, then I don't know what isn't. ::)

Offline Tatsujin

  • Box Fansubs
  • Member
  • Posts: 15632
    • Otakixus
Re: nVidia's GTX 680
« Reply #19 on: March 25, 2012, 04:12:39 PM »
idk, based on reviews from sites i think nvidia failed in the performance department, though their performance/watt ratio improved drastically.

I wouldn't call it a fail: http://www.purepc.pl/karty_graficzne/test_geforce_gtx_680_vs_radeon_hd_7970_pojedynek_kart_dla_graczy
Haven't you noticed I've given up trying to argue with kitamesume?
Better not to waste energy on an argument that will lead nowhere.
Yup, save your breath. I've already posted a legit video comparing numbers across a lot of games between the two cards.


¸¸,.-~*'¨¨¨™¤¦ Otakixus ¦¤™¨¨¨'*~-.,¸¸