Discussion Forums > Technology

Your view on AMD's Bulldozer


kureshii:

--- Quote from: TMRNetShark on October 14, 2011, 01:08:53 PM ---
--- Quote from: kureshii on October 14, 2011, 07:38:17 AM ---tl;dr AMD needs to get a better marketing team, and get their top-level management team sorted out. And they (and Nvidia) should stop being complacent with graphics, especially when it comes to GPU features.

--- End quote ---

Meaning what? I think graphics cards have been going forward... although you might mean something different when you say GPU features.

--- End quote ---
The last leap they made was Eyefinity, in the 5000-series. Since then, with the 6000-series we've only seen (feature-wise) full hardware video decode support, Displayport 1.2, HDMI 1.4a (adding 3D stereoscopic support over HDMI 1.4) and colour correction in postprocessing. Sandy Bridge has caught up with hardware video decode support (and overtaken it actually), as well as added colour correction. Ivy Bridge will add HDMI 1.4a, but will remain at Displayport 1.1 support only.

This all sounds like nothing much; Intel hasn't caught up yet, have they? What worries me is that all the talk about the Radeon 7000-series (Southern Islands) has only been about its compute and gaming capability. It's all well and good that they're finally getting serious about the programmability of their GPUs; it would be disingenuous of me to complain about that. But couple this with the lack of any commitment to fixing the HTPC-feature issues in Llano pointed out by Anand, and it means we have no assurance that these graphics features will get any attention when Trinity and Southern Islands are released next year.

“Wait, so what exactly has Intel done features-wise that AMD hasn’t?” you may ask. Hardware transcode, wireless display (WiDi), IGP triple-display support. Only the first has appeared in retail products so far, but the other two have been confirmed and are just awaiting release.

“But what kind of consumer needs these anyway?” This is exactly what I’m worried about. If AMD continues to neglect IGP features like that, thinking it just needs to filter down enough features from its discrete-GPU product line to keep the consumer uncomplaining, we may see Intel come up with the IGP of choice for non-gamers. It’s a laughable thought at this moment, but when you look at how far Intel has come since Clarkdale (just two generations ago), I think there’s good reason to sit upright and pay attention.

Let’s put that in a different frame. Late 2008. X4500 was Intel’s latest IGP. It was doubtlessly shitty. Blu-ray acceleration finally worked, and CPU utilisation during playback was still shitty. Still not worth looking at.

Fast-forward to Jan 2010. Clarkdale has just come out; it still has absolutely shitty graphics, but at least DTS-HD MA finally works! More importantly, it suddenly caught up with the 790GX in some benchmarks. So Intel finally caught up. What’s the worry? The most that happened is Anand finally considered Intel HD Graphics “HTPC-worthy”.

Fast-forward a year later. Sandy Bridge is released. Just about all its HTPC issues have been resolved, except for that niggling 23.976fps playback which wouldn’t work without disabling UAC. More importantly, it has now almost left the 890GX (rebranded 790GX) behind and is catching up with the HD5450, even overtaking it in some games. Half a year later AMD reclaims its IGP crown, but it had to pull in its newest silicon to do that. The HD6450 has been relegated to worse-than-IGP class. More importantly, AMD had to pull out all the stops on the IGP for the first time. It’s a far cry from the speed their dedicated cards are pulling, but keep in mind that the IGP has to share TDP with the CPU, and I don’t doubt that what went into Llano was AMD’s best effort in IGP engineering.

It looks like a string of losses for Intel, doesn’t it? Each time it improves, AMD just filters some slightly newer Radeons into their IGPs and they’re still in the lead. But Intel isn’t just getting better in raw performance, it’s getting better—much, much better, scarily so—in CPU performance as well as power efficiency. Perhaps as a gamer this doesn’t matter much, but keep in mind that with on-die IGPs both the CPU and GPU have to share the TDP budget.

We were hoping that Bulldozer would save Trinity, but looking at its performance and power consumption I’m suddenly not so sure. How much better AMD can make their IGP perform will depend on how much they can shave off the CPU TDP without nerfing it too much (so as to leave more for the GPU), and how much they can reduce power consumption in the VLIW4 Radeon 7000Ds (which are the ones going into Trinity, not the FSA architecture). In the meantime, Intel has already promised up to 60% better performance from the Ivy Bridge GT2 units. AMD promises up to 50% better Gflops in Trinity, but no mention of whether this comes from CPU or GPU or both.

Are you scared for AMD yet? I know in a discrete card shootout AMD would doubtless own Intel many times over, but the IGP game is quite a different one, and that is where Intel and AMD both bring inherent advantages: Intel in power management and power efficiency, AMD in GPU architecture/performance. And Intel’s still catching up on AMD in GPU features. (And still no triple-display support announced for Trinity by the way). Are you scared for AMD yet?

Again, don’t take this as an Intel-pwns-AMD post. I am deathly scared for AMD, not because Intel is suddenly going to become godly at graphics. It’ll take them a few more years to catch up in performance at least, and many more to acquire the kind of experience AMD/Nvidia have, but they are taking IGPs seriously and it shows in their PR. Meanwhile, AMD doesn’t seem scared at all, and continues to spin their PR yarn, keeping mum about details. Time to wake up from your FX-fueled dreams, AMD; FX is no more. Listen to those bug reports, fix them quick, don’t aim to compete only on raw performance; that’s not how the consumer market works.


--- Quote from: kitamesume on October 14, 2011, 02:37:54 PM ---nao, USB3 videocards! lol, nevermind power consumption, wonder if the USB have enough bandwidth to support it.

--- End quote ---
http://www.sunix.com.tw/product/vga2788.html


--- Quote from: kitamesume on October 14, 2011, 01:42:02 PM ---lol APU Ivy...

--- End quote ---
For now, APU is just AMD’s marketing-speak for “CPU and GPU on the same die”. They’re going to have to bring GCN to fruition before they earn the right to talk about a true “Accelerated Processing Unit”, because right now what they’re doing is no different from Intel: merely putting a CPU and GPU in the same chip, with shared power management. In fact, Intel has gone one step further and gotten the CPU and GPU to share L3 cache; Llano/Trinity will only share RAM.

kitamesume:
nao i meant USB videocards where you could offload video decoding to it or something. in a simple way... a real GTX580 connected via USB O.o ok i'm kidding.

kureshii:
Here’s an interesting mental exercise: guesstimating Trinity CPU performance.

Some background:

Highest stock clock on Thuban/Deneb:
3.7GHz (X4 980 BE w/o Turbo)
3.3GHz (X6 1100T w/ Turbo)

FX-8150 clock speed:
3.6GHz stock, 3.9GHz 8-core Turbo, 4.2GHz max Turbo

The reference:

A8-3850 – 2.9GHz, 4MB L2 → performs similarly to X4 645 – 3.1GHz, 2MB L2

We’ll keep things simple and ignore lightly threaded applications. The matchup:

http://www.anandtech.com/bench/Product/188?vs=434&i=25.26.28.31.34.35.41.42.46.54.55

Keeping in mind that AMD promises a 10% performance improvement going from Bulldozer to Piledriver (a promise we hesitantly assume for now), and knowing:

1) Bulldozer’s performance and power efficiency
2) Trinity’s TDP will likely be kept at 100W max
3) Deneb and Thuban (X4 and X6) parts had 125W TDP as well (though you'll have to google to see how their power consumption compares with Bulldozer)

give a rough prediction of how Trinity is going to perform vs Llano (in terms of % performance increase/decrease) in the matchup.

(Remember that Trinity is a 2-module/4-core part.)
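If you want to make your guesstimate concrete, here's a minimal Python sketch of the kind of back-of-envelope maths I mean. Only the Llano clock and the +10% Piledriver promise come from the post above; the Bulldozer-vs-K10 per-clock factor and the Trinity clock are placeholder guesses you should substitute with your own:

```python
# Back-of-envelope Trinity-vs-Llano guesstimate.
# Model: multithreaded performance ~ cores * clock * per-clock-throughput.
# That linear-scaling assumption (and every *_GUESS value) is a guess, not data.

K10_IPC = 1.00               # K10 (Llano/Deneb cores) as the baseline
BD_IPC_GUESS = 0.95          # guess: Bulldozer per-core throughput vs K10
PD_IPC_GUESS = BD_IPC_GUESS * 1.10  # AMD's promised +10% for Piledriver

llano_clock = 2.9            # A8-3850 stock clock, GHz (from the post)
trinity_clock_guess = 3.4    # guess: what a 100W 2-module Trinity might clock at

llano_score = 4 * llano_clock * K10_IPC            # 4 K10 cores
trinity_score = 4 * trinity_clock_guess * PD_IPC_GUESS  # 2 modules / 4 cores

change_pct = 100 * (trinity_score / llano_score - 1)
print(f"Guesstimated Trinity vs Llano (MT): {change_pct:+.1f}%")
# → Guesstimated Trinity vs Llano (MT): +22.5%
```

Plug in different clock and per-clock guesses (informed by the Anandtech matchup linked above) and see how sensitive the answer is; that sensitivity is really the point of the exercise.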

kitamesume:
^ we'll need a comparison between BD and Llano first. i mean a 2-module BD vs 4-core Llano.

TMRNetShark:
So Kureshii... you're worried that dedicated graphics cards are gonna be replaced by IGPs? That's a GOOD thing. Will IGPs affect gaming/enthusiast dedicated graphics cards? I haven't seen a single IGP that can rate up to an ATI 5770 or 6850 or a GTX 460/560. Having dedicated graphics is great, but that doesn't mean IGPs are aimed to replace them.

IGPs are there to help laptops and netbooks use less power while still having amazing graphical performance (for a laptop). Yes, laptops already have "dedicated" video cards... but those laptops run out of juice fast compared to AMD's APUs.
