Author Topic: Intel wants to charge 50 bucks to unlock preexisting features on your processor  (Read 2768 times)

Offline AceHigh

  • Member
  • Posts: 12840
You would think it's impossible to crosslink more than 2 cards in SLI, but that is what EVGA has done. Their motherboard comes with a 4-way bridge and has a controller for 4 graphics cards.

http://www.evga.com/PRODUCTS/IMAGES/GALLERY/170-BL-E762-A1_LG_3.jpg

As you can see in the picture, you get a 3-way and a 4-way SLI bridge. The 3-way SLI is good if you plan to use the 4th graphics card as a dedicated PhysX card.

As for the question of what to use it for, do you know the biggest innovation in DirectX 11? It's called "tessellation", and the technology is explained here.

And here is a guy who runs the Heaven benchmark with the tessellation effect at maximum, and his FPS is still impressive. My card is one of the good ones, and it choked on any tessellation setting greater than medium.
http://www.youtube.com/watch?v=vY8uppkjwL0
For one thing, Tiff is not on any level what I would call a typical American.  She's not what I would consider a typical person.  I don't know any other genius geneticist anime-fan martial artist marksman model-level beauties, do you?

Online vuzedome

  • Member
  • Posts: 6376
  • Reppuzan~!
  • Awards Winner of the BakaBT Mahjong tournament 2010
    • GoGreenToday
4 cards, water cooled!!!
There's no way someone with 4-way SLI wouldn't go with water cooling.
BBT Ika Musume Fan Club Member #000044   
Misaka Mikoto Fan Club Member #000044
BBT Duke Nukem Fan Club Member #0000002

Online Pentium100

  • Member
  • Posts: 528
The cards alone will draw 600W of power when idle, and if you're very very lucky, you'll get a good couple of hours gaming in wearing nothing but your birthday suit and a towel to pat down the sweat in your custom built sauna before the top card begins to warp and/or melt.

Well, you can cool them with big fans designed for good airflow rather than quiet operation. While I don't have 4-way SLI, I do have more than one computer, with a combined power draw of about 650W when not gaming (the monitor uses an additional ~100W, and when gaming the total goes up to around 1kW). I cool my room with a 25cm, ~1400m^3/h fan mounted inside a tube whose other end sits near an open window (near it, not out of it, since I have a grating that keeps bugs from flying or being sucked in). Another window is also open with a tube near it but no fan; that one is for the warm air to get out. I can keep my room at about 5 degrees (C) above the outside temperature, which still sucks in summer when it's 30 degrees outside and 35 in my room. The fan uses about 60W of power, much less than an AC unit would.

So it would be possible to cool 1kW worth of video cards.
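The extraction-fan setup above can be sanity-checked with the standard sensible-heat formula for air, ΔT ≈ P / (ρ · cp · V̇). A minimal sketch, assuming ideal mixing and typical air properties (density ~1.2 kg/m³, cp ~1005 J/(kg·K)); the wattage and airflow figures are the ones from this post:

```python
# Rough estimate of the steady-state room temperature rise when a duct fan
# moves the PC's heat load outside. Assumes ideal mixing and standard air
# properties; a real room runs hotter (leaks, sunlight, imperfect airflow).

def temp_rise_c(heat_w: float, airflow_m3_per_h: float,
                rho: float = 1.2, cp: float = 1005.0) -> float:
    """Temperature rise in degrees C: P / (rho * cp * volumetric flow)."""
    flow_m3_per_s = airflow_m3_per_h / 3600.0
    return heat_w / (rho * cp * flow_m3_per_s)

# ~1 kW of computers exhausted by a 1400 m^3/h fan:
print(round(temp_rise_c(1000, 1400), 1))  # ~2.1 degrees C above outside
```

Ideal mixing gives only about 2 °C of rise for 1 kW at 1400 m³/h, so the ~5 °C observed in practice is plausible once imperfect airflow and other heat sources are counted.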
BBT Ika Musume Fanclub Member #080586
Misaka Mikoto Fanclub Member:080586

Offline rostheferret

  • Member
  • Posts: 1584
Eh, I'm fairly impressed they found a way to link them, but it still seems a bit pointless to me. Getting a $5000 machine and specially rigging up a room to deal with the heat (according to some benchmarks, you'd be looking at ~2.4 kW for the cards alone under full load, though since I expect they won't be running at max capacity, I reckon it might only be half that), and for what? A marginal improvement in DX11 games using a card that was slapped together at the last minute? (ATI came up with the idea, and when their line was released, nVidia had 6 months to match it.) I still can't imagine you'd be able to cool them properly; they'd start to get hot and soon drop off in performance. Surely it'd make more sense to go dual HD5970? That way you can actually cool the damn thing, spend half the money, use half the power, generate half the heat, and still have quad GPUs. I still can't quite see how getting this says anything more than "I don't have a penis."

Offline AceHigh

  • Member
  • Posts: 12840
I wasn't arguing about how pointless it is or isn't, just that 4-way SLI is actually possible.

I've been an ATI fanboy myself, and the last time I used nVidia was when its main competitor was the Voodoo cards from 3dfx. Right now I have a Radeon 5750 card.

If I had invested money in a better rig, I would have bought two 5900-series cards and been happy with that.

Offline captiosus

  • Member
  • Posts: 434
    • The Wings Elite
Down, boys, down. This was a thread about how much Intel is diving back into the shiathole AMD scared them out of with the Athlon series. If you want to wage a video card war, http://boards.4chan.org/g/ - they have them daily.

Offline Meomix

  • Member
  • Posts: 4993
  • For our glorious order
    • MAL
I started a hard drive system upgrade thread and it turned into "the best way to protect your hard drive from espionage." I'd just say that this forum is naturally serious.
Did you know Satan was supposedly gods RIGHT HAND MAN, not his left. Blows your theory out of the water now doesn't it.

Offline rostheferret

  • Member
  • Posts: 1584
I wasn't arguing about how pointless it is or isn't, just that 4-way SLI is actually possible.

I've been an ATI fanboy myself, and the last time I used nVidia was when its main competitor was the Voodoo cards from 3dfx. Right now I have a Radeon 5750 card.

If I had invested money in a better rig, I would have bought two 5900-series cards and been happy with that.

Fair enough. 5770 here; great minds think alike apparently :P

As for the matter at hand, this is why I will never buy AMD. If Intel wants to disable a portion of their chip, then they accept the risks of doing so (with the likes of pirates unlocking it), and I see little wrong with it.

Offline sdedalus83

  • Member
  • Posts: 2867
Oh, and yeah, AMD CPUs are just not competitive on the high end... Actually, it pisses me off that they will be dropping the ATI brand from future graphics cards, when ATI has been much more successful lately. It's a bad joke that a company that does poorly assimilates a company that makes the superior product on the market.

/end rant

At the time AMD assimilated ATI, ATI was on the verge of releasing a product which would have killed the company had it not been assimilated - it nearly killed AMD, for fuck's sake. AMD's process tweaking and ATI's design teams saved both companies when they produced the RV670.

Offline AceHigh

  • Member
  • Posts: 12840
At the time AMD assimilated ATI, ATI was on the verge of releasing a product which would have killed the company had it not been assimilated - it nearly killed AMD, for fuck's sake. AMD's process tweaking and ATI's design teams saved both companies when they produced the RV670.

Do you mean they were on the verge of killing their own company? I'm sure their sales were really good at the time of the acquisition, and I don't remember them being on the verge of bankruptcy back then.

And if they had killed AMD, good riddance; it would give new manufacturers a chance to bring their products to market.


Besides, killing off a brand like that is still very stupid in my opinion. There is a reason why Toyota doesn't kill Lexus.

Offline tomoya-kun

  • Member
  • Posts: 6374
  • Reporting for duty.
Do you mean they were on the verge of killing their own company? I'm sure their sales were really good at the time of the acquisition, and I don't remember them being on the verge of bankruptcy back then.

And if they had killed AMD, good riddance; it would give new manufacturers a chance to bring their products to market.


Besides, killing off a brand like that is still very stupid in my opinion. There is a reason why Toyota doesn't kill Lexus.

AMD/ATI has a small market share, but it creates competition that drives Intel's sales. Intel pretty much has a monopoly at the moment anyway. If they actually had one, it would be illegal.


BBT Team Riko Suminoe #000002

Offline fohfoh

  • Member
  • Posts: 12031
  • Mod AznV~ We don't call it "Live Action"
AMD/ATI has a small market share, but it creates competition that drives Intel's sales. Intel pretty much has a monopoly at the moment anyway. If they actually had one, it would be illegal.

Sorta... probably they'd get forced to split and be made to compete with themselves.
This is your home now. So take advantage of everything here, except me.

Offline tomoya-kun

  • Member
  • Posts: 6374
  • Reporting for duty.
Sorta... probably they'd get forced to split and be made to compete with themselves.

Which is like holding a monopoly but pretending that you're not. That exact situation happens in Canada with the electronics stores Best Buy and Future Shop.



Offline AceHigh

  • Member
  • Posts: 12840
AMD/ATI has a small market share, but it creates competition that drives Intel's sales. Intel pretty much has a monopoly at the moment anyway. If they actually had one, it would be illegal.

Yeah, but ATI alone always held close to 40-50% of the video card market, splitting it with Nvidia. So I can't see why they had to merge.

Offline tomoya-kun

  • Member
  • Posts: 6374
  • Reporting for duty.
AMD/ATI has a small market share, but it creates competition that drives Intel's sales. Intel pretty much has a monopoly at the moment anyway. If they actually had one, it would be illegal.

Yeah, but ATI alone always held close to 40-50% of the video card market, splitting it with Nvidia. So I can't see why they had to merge.

ATI is doing well, but AMD processors are getting owned by Intel's. For Q1 2010, Intel holds 87% of the mobile processor market and 71% of the desktop processor market. And of those AMD sales, most were low-end processors.



Offline AceHigh

  • Member
  • Posts: 12840
So then again, don't you think it's weird that a company that is doing badly assimilates one that holds almost half the market for the products they make?

Offline tomoya-kun

  • Member
  • Posts: 6374
  • Reporting for duty.
So then again, don't you think it's weird that a company that is doing badly assimilates one that holds almost half the market for the products they make?

I think AMD had more valuable stock; who knows.



Offline Lupin

  • Member
  • Posts: 2169
So then again, don't you think it's weird that a company that is doing badly assimilates one that holds almost half the market for the products they make?
I think the reason AMD phased out the ATI brand is simple: when you talk about ATI/AMD's video cards, do you refer to them as, e.g., the ATI 58xx? No. The cards are better known as the Radeon 58xx. The name Radeon has more brand recognition than ATI.

When ATI was bought, it was no longer a company, it became a brand.

Offline tomoya-kun

  • Member
  • Posts: 6374
  • Reporting for duty.
So then again, don't you think it's weird that a company that is doing badly assimilates one that holds almost half the market for the products they make?
I think the reason AMD phased out the ATI brand is simple: when you talk about ATI/AMD's video cards, do you refer to them as, e.g., the ATI 58xx? No. The cards are better known as the Radeon 58xx. The name Radeon has more brand recognition than ATI.

When ATI was bought, it was no longer a company, it became a brand.

I still refer to the ATI cards I purchase as ATI cards; I almost never say "Radeon 5850," for example.



Offline sdedalus83

  • Member
  • Posts: 2867
So then again, don't you think it's weird that a company that is doing badly assimilates one that holds almost half the market for the products they make?
AMD bought ATI almost 5 years ago, when Intel's best chip was the P4 and AMD dominated the HPC, large-server, and desktop markets. ATI was losing market share in both integrated and discrete graphics. Things were a lot different in mid-2006.