Author Topic: Your view on AMD's Bulldozer  (Read 8508 times)

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Your view on AMD's Bulldozer
« Reply #100 on: October 14, 2011, 01:40:13 AM »
Anyone wanna do some builds to see who comes out on top in price/performance?
I don't really see any price/performance advantage on Bulldozer (system cost), at least not until we're talking about dual-SLI builds. The high price of the 990FX boards makes them worth it only if you're fully using those PCIe lanes. The cheaper FX8150 may perform like an i7-2600K at their respective stock speeds, but the Intel chipsets start at a lower price (with fewer PCIe lanes).

I've been looking at some of the reviews/benchmarks... it seems the only way to really get the most out of Bulldozer is to overclock it. I guess AMD needs to keep it at a "safe" power level so that it works with most mobo/PSU combos.
Unfortunately the FX-8150 doesn't seem to overclock very well; none of the reviewers (in the few reviews I read) were able to get it to 5GHz in time for their reviews, at least not on air cooling. In contrast, Sandy Bridge already hits 5GHz with multiplier overclocking on air. The quad-core FX might do better, or perhaps disabling some cores on the 8150 might bring it above 5GHz; let's see what the overclockers can do.

Offline TMRNetShark

  • Member
  • Posts: 4134
  • I thumps up my own youtube comments.
Re: Your view on AMD's Bulldozer
« Reply #101 on: October 14, 2011, 02:00:27 AM »
Unfortunately the FX-8150 doesn't seem to overclock very well; none of the reviewers (in the few reviews I read) were able to get it to 5GHz in time for their reviews, at least not on air cooling. In contrast, Sandy Bridge already hits 5GHz with multiplier overclocking on air. The quad-core FX might do better, or perhaps disabling some cores on the 8150 might bring it above 5GHz; let's see what the overclockers can do.

I only read the reviews of the FX-8120... but I found this to be funny:

Quote
According to the slide above, first AMD FX processors will be short to exists. Orders for eight-core FX-8150 and FX-8100, six-core FX-6100 and quad-core FX-4100 will be available for only half a year with some time after that to fill all obligations. Four month later after debut of mentioned models, AMD will update the line-up with more powerful FX-8170, FX-8120, FX-6120 and FX-6120 CPUs that are to form the basis of AMD FX processors.

What's the difference between those CPUs? :P

Anyways, if the FX-8170s are anywhere near as good as, if not better than, the i7-2600s... then I don't know what's wrong with that CPU. The FX-8170 has twice as many cores at a 0.5GHz higher clock.

Now that I look at where the FX-8150 sits on PassMark, it's pretty damn close to the i7-2600 as is. I wonder where the FX-8170 will land? Especially for under $300.

(Obviously this is only one website's test scores. We'll see more when actual benchmarks have been run, which I'm excited for... but I'll probably be underwhelmed by the scores.)

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Your view on AMD's Bulldozer
« Reply #102 on: October 14, 2011, 03:46:53 AM »
Now that two days have passed since Bulldozer’s release and I’ve had some time to think about what AMD’s trying to achieve, I think I’m ready to put my opinion into paragraphs.

Bulldozer is a disappointment for various reasons, of which only a couple relate to its actual results from reviews. Most of them seem to indicate some kind of management issue over at AMD (not to say Intel doesn’t have any, but they seem to do a better job of hiding it). To list:

Performance not meeting resources
The undeniable conclusion is that while Bulldozer may be great for server workloads (I await Interlagos final-silicon benchmarks before concluding this), it clearly is not as great for most enthusiast applications. If we look at multithreaded application performance (because looking at lightly threaded performance makes me feel sorry for Bulldozer), going from Thuban to Zambezi we see an approximately 15% increase in load power consumption and double the transistor count, for performance increases that vary from 36.6% for par2 recovery to 14% for 7zip MIPS. Accompanying this is a 9% reduction in die size. Contrast this with the 27% decrease in die size, 29% increase in transistor count, 8% increase in load power consumption, and corresponding change in performance over the same benchmarks* that we observe going from Lynnfield to Sandy Bridge. Both are observed over a single node shrink from 45nm to 32nm. In other words, while Bulldozer shows varying degrees of improvement over Thuban, the improvements are nowhere near matching what was accomplished with Sandy Bridge, a microarchitecture that is clearly a better fit for enthusiast application performance. To put it bluntly, Bulldozer is not a good fit for the enthusiast market.
* Yes, this is but a subset of the benchmarks available. I picked them not to favour Sandy Bridge, but to avoid distraction from synthetic benchmarks, focusing instead on more pertinent real-world multithreaded benchmarks. One might even note that I picked benchmarks where Bulldozer performance was not downright painful to look at. 7zip is the only anomaly in the suite of real-world performance tests, since no real-world compression test was done for the FX-8150.
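For anyone who wants to sanity-check the scaling comparison, here’s a quick sketch. The die sizes (mm²) and transistor counts (millions) are ballpark figures from launch-day reviews, so treat the outputs as illustrative rather than authoritative:

```python
# Rough node-shrink comparison using ballpark launch-review figures
# (die sizes in mm^2, transistor counts in millions; illustrative only).

def pct_change(old, new):
    """Percentage change from old to new, rounded to 1 decimal place."""
    return round((new - old) / old * 100, 1)

# Thuban (45nm) -> Zambezi (32nm)
print(pct_change(346, 315))   # die size: ~ -9%
print(pct_change(904, 1800))  # transistors: ~ +99% ("double", per AMD's launch figure)

# Lynnfield (45nm) -> Sandy Bridge (32nm)
print(pct_change(296, 216))   # die size: ~ -27%
print(pct_change(774, 995))   # transistors: ~ +29%
```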

Some may argue that, regardless of the number of transistors used, the die size, the TDP of the chip and its power consumption, as long as it performs better with a corresponding scaling in price, it should be considered a better product. Unfortunately, this argument also falls flat. On Newegg, the 1100T is $189; the FX-8150, $279 (though $249 MSRP). Ignoring the pricier 9-series chipset you’ll need for AM3+, that’s a 30% increase in price, for performance increases that don’t scale as much. The Sandy Bridge flagship, i7-2600K, at $319, was 7% pricier than its i7-860 predecessor ($299) at launch, but the performance increase justifies the price increase.
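The price-scaling arithmetic is easy enough to check yourself; a quick sketch using the Newegg/MSRP figures quoted above:

```python
def premium(old_price, new_price):
    """Price premium of the successor over its predecessor, as a rounded percentage."""
    return round((new_price - old_price) / old_price * 100)

# AMD: 1100T ($189 on Newegg) -> FX-8150 ($249 MSRP, $279 street at launch)
print(premium(189, 249))  # ~32% at MSRP
print(premium(189, 279))  # ~48% at launch street price

# Intel: i7-860 ($299 at launch) -> i7-2600K ($319 at launch)
print(premium(299, 319))  # ~7%
```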

Another popular argument is that the FX-8150 compares favourably to an i7-2600K in x264 and similar heavily threaded applications for a lower price and is thus obviously the superior choice. This is putting the cart before the horse. The FX-8150 is not a better product because it is cheaper; it is cheaper because it is not a better product. You don’t think AMD would have loved to price the FX-8150 at $299? Then you quite sorely misunderstand AMD’s position. They did not pick the value-for-money position willingly, but were forced into it. A larger chip with more transistors would be costlier to manufacture, and pricing it lower than a competitor that has fewer transistors and a smaller die size is simply not desirable. Low-margin pricing is a risky game that chip designers try to avoid with good reason.

Market misfit
Since the first Bulldozer preview we’ve seen and understood that AMD is aiming for a different architecture, one that is geared for greater overall throughput. RWT surmised it is geared for heavily multithreaded server workloads, and none of the hardware previews revealed enough numbers for a conclusive guesstimate of single-thread performance. The unspoken thought was that AMD would be able to make a comeback, somehow, and without knowing enough about Bulldozer that was all we could hope for.

I’m not about to accuse AMD of withholding information. Intel has been much more forthcoming with microarchitecture information since Conroe; that is a privilege they enjoy from being in the lead and having picked a successful path in Nehalem. AMD is in no such position. Rather, what I’m questioning is this: AMD execs could not have been unaware of Bulldozer’s real-world performance in enthusiast benchmarks. Surely they must have known how it would perform against the X6 and Sandy Bridge at least a year before release.

Naturally, they could not back out of the enthusiast market at that point; I’m not sure if the enthusiast market is really that profitable, but they seem to need revenue from it quite badly. Instead of playing down Bulldozer as an enthusiast chip, they drummed up marketing (“World’s first 8-core desktop processor!” etc), and convinced OEMs to come up with 31* AM3+ SKUs for Bulldozer alone. Rebranding the 8-series chipsets to include AM3+ support is inevitable, but wherefore the need to overprice them? Think about it: 31 SKUs over 3 chipsets, on micro-ATX and ATX alone! Even FM1 only got 33 SKUs over 2 chipsets on mini-ITX, micro-ATX and ATX. I know the inevitable comparison here is to Intel’s >160 SKUs for H/P/Z 6-series, but let’s not get too distracted: the point I’m trying to make is that AMD sold [the idea of] Bulldozer convincingly enough that they managed to drum up almost as many AM3+ SKUs as FM1 SKUs. Did they intend to sell Bulldozer at the same volume as their mainstream Llano offerings, despite knowing its performance in enthusiast workloads?
* Numbers adjusted to discount open-box products

Yes, certainly, Intel might have done the same thing in those shoes. But we’d have hated them for it. Even cutting AMD some slack, only a fanboy could believe such marketing again. They’ve disappointed with Phenom I (“World’s first true quad-core!” ... sound familiar? Remember how it did against C2Q?), and now they’ve disappointed with Bulldozer. It’s fine if Bulldozer, being geared for server workloads, wouldn’t do well in the enthusiast space, but they should have marketed it likewise then. Drumming up so much support for it knowing it would let enthusiasts down is a bad, bad PR move, and no doubt the bad taste is going to stay.

Mismanagement of the FX brand
I don’t mean to say the FX brand is some sort of godly name that has been sullied, but it would not be an overstatement to say that the FX brand name has had a really good reputation owing to Athlon 64’s success in its heyday. It has good PR vibes associated with it, and is a valuable brand name that could be used to sell new processor lines.

No company would willingly throw such a brand name away on a product that isn’t up to expectations. Dell reserves its Ultrasharp line of monitors only for non-TNs, HP Elitebooks are enterprise-level notebooks with build quality that matches the name, and I don’t think I need to say anything about Lenovo Thinkpads. FX processors enjoy the same reputation, and one would definitely expect it to at least best the X6 conclusively, even if not quite matching Sandy Bridge across the board.

They could certainly have used a different model name to market Bulldozer, but again they seemed to be desperate to drum up support for a product that was ill-fitted to enthusiast needs (overclocking, encoding, rendering, etc). The result is that they have wasted a lot of consumer goodwill, and the FX brand is no longer as reputed or profitable as it was before Bulldozer’s release.

Mistrust of AMD management
I’ll say it: trust in AMD has been eroding. Without knowing what’s going on in Sunnyvale I won’t make any guesses about what AMD management is thinking, but looking at AMD-related news lately I don’t think it’s unreasonable to say that things look less than sunny. A number of high-level execs have left, execs who’ve brought about the success of earlier products. But this is just one aspect of it.

The other thing is that AMD simply needs a better PR/marketing team. Intel, the success of their products notwithstanding, has a more low-key PR team that doesn’t overpromise and doesn’t set itself up for blame as easily. AMD’s PR team, on the other hand, puts forth the argument that their architecture is “forward-looking” and “aimed at tomorrow’s workloads” (Intel said that about the Pentium 4 too). It’s fine to say that in the server space, where workloads are more readily characterisable (virtualisation, HPC, database, hosting, etc) and hardware more readily tunable for specific needs, but it won’t fly in the enthusiast market; there, it simply looks like blame-pushing (if only devs would compile their apps for optimum performance on our obviously superior platform, we would get better benchmark numbers!)*. If one is going to sell a “new”, “special” microarchitecture in the consumer/enthusiast space, one had better do one’s best to ensure software support is present, or at least promised in the near future, before launching one’s product. It’s great if it works for tomorrow’s workloads, but if it doesn’t fit today’s workloads well, people are only going to buy it tomorrow. [See market misfit above.]
* No doubt x86 is partially to blame for this, but again let’s not get distracted: AMD’s disadvantage in Intel-influenced instruction sets aside, the architecture nonetheless demonstrates its unsuitability to lightly-threaded and enthusiast workloads.

I’ll avoid pushing too much blame to them; Bulldozer is not an easy product to sell to enthusiasts, not with that kind of performance. (It was a mistake to push Bulldozer in the enthusiast space in the first place IMO.) I am of the opinion that the amount and level of marketing for a product should match its performance, and if it doesn’t perform then don’t make it out to be the best thing ever. AMD’s selling of Bulldozer leaves me with the impression that the Bulldozer project hasn’t been very well-managed from the start. This is a different accusation from saying that Bulldozer is a poor product; don’t mix the two up.

So what now?
AMD has promised 50% better performance-per-watt for the Bulldozer architecture by 2014. That is a pretty vague promise, as no mention is made of which workload it applies to, nor is it a firm promise of better performance, since they could just lower power consumption by 33% while keeping performance equivalent. All we know for now is that “AMD is committed to improving Bulldozer”. That’s hardly newsworthy.
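To be concrete about why the promise is vague: “50% better performance-per-watt” pins down only the ratio, not which side of it moves. A quick sketch of the two extreme readings (the 100-unit/125W baseline is made up purely for illustration):

```python
# "50% better perf-per-watt" fixes only the ratio, not its components.
base_perf, base_power = 100.0, 125.0   # arbitrary illustrative baseline
target_ppw = 1.5 * (base_perf / base_power)

# Reading 1: power held flat -> performance must rise 50%
perf_needed = target_ppw * base_power
print(perf_needed / base_perf)          # 1.5

# Reading 2: performance held flat -> power may drop by a third
power_allowed = base_perf / target_ppw
print(power_allowed / base_power)       # ~0.667, i.e. ~33% lower power
```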

On x86 monopoly
I have no reason to hate AMD, since I haven’t preordered Bulldozer. But we’ve all been eagerly awaiting updates/upgrades from AMD that will “keep Intel on its toes”. I’m not naive enough to think that we have to keep AMD alive at all costs because their exit from the x86 market is going to bring back $500 mainstream Intel processors. Intel does have competitors, and they’re not in the x86 market. The end of PCs isn’t going to come anytime soon, but Intel can easily bring it about with suicidal pricing policies. Let’s not even mention the antitrust suits that various groups are waiting to spring on Intel should AMD make its exit ... Rather, what I worry about is that Intel will have less motivation to improve in the areas where it has to catch up, namely IGP performance. Product stagnation is not a desirable outcome for consumers either.

On integrated graphics
I have similar expectations of AMD; smaller companies don’t get any more slack in the innovation or product improvement department. The lack of Eyefinity/triple-display support in APUs, lower hardware post-processing quality compared to the higher-end chips, and other little bugs (broken chroma upsampling in non-BD playback, for one) were a big disappointment for me, even if its raw performance is way ahead of HD Graphics. What this tells me is that AMD’s IGPs play second fiddle to its dedicated GPUs. I don’t mind if raw performance falls behind the PCIe cards; that is only to be expected. But they have made no mention of getting these issues fixed in future iterations, and I have no idea if future developments in GPU features will make their way into the APU.

On the other hand, Intel only has one focus for graphics: IGP. Whatever they are working on in graphics, you know it’s going to work its way into their IGP. They’ve promised fixes for the 23.976-fps-without-UAC issue, better hardware AA, as well as HDMI 1.4, DX11, OpenCL 1.1, OpenGL 3.1 in Ivy Bridge; that pretty much brings Ivy Bridge IGP’s feature set up to date and completely matching AMD’s and Nvidia’s (if they fulfill their promises). Go ahead and poke fun at their IGP performance; it is entirely justified. But one cannot deny that for non-gamers, HD Graphics is becoming more and more viable as a fully featured GPU. We could not even say this of their IGP two generations ago (in Clarkdale), but they have improved greatly with each generation. Once Ivy Bridge rolls around, dedicated HTPC cards may no longer be needed on Intel-based platforms.

Anxiety
It may sound like I’m selling Intel here, but what I mean to say is that AMD and Nvidia have a lot to be worried about even in the graphics arena, their stronghold: the market does not consist of gamers alone. Right now videophiles/consumers are still buying dedicated graphics cards for HTPCs (some with valid reasons, others based on outdated impressions of IGPs), but if Intel catches up in features, that is one market segment decimated for AMD and Nvidia.

Meanwhile, all we hear from AMD and Nvidia regarding new products is gaming, gaming, gaming. There is some mention of compute, but hardly any fire following the smoke; we haven’t seen much use of it apart from CUDA encoding, Folding@Home and MD5 brute-forcing. CUDA is popular, but largely in the HPC space, not so much in consumer applications. DirectCompute seems to be taking off, but very slowly. They should push for it much more aggressively in applications if they’re going to keep any advantage for a dedicated graphics card in anything other than gaming.

Looking forward
While I still can’t say Intel is my favourite company, I can say right now that it is the company that excites me the most. They’re promising things other than raw gaming performance, and delivering them. To date: triple-display on IGP, greater hardware video encode/decode capability, Thunderbolt support (on limited Panther Point motherboards, presumably more widespread in Lynx Point), new low-profile board options (thin ITX). AMD and Nvidia do have things on the roadmap, but they all seem to be geared for gaming and rendering needs. No doubt they would excite gamers and 3D modellers, but they promise nothing to non-gamers, typical consumers and non-gamer enthusiasts (yes, they do exist; look on AVForums and SPCR forums) other than better raw performance. Case in point: Nvidia doesn’t seem at all interested in OOTB single-card support for DisplayPort or triple-display.

There is one exception, however: AMD Graphics Core Next (GCN; yeah, I did say they need a better PR/marketing team). This is supposedly what AMD’s future APUs will be, and it promises orders-of-magnitude increase in multithreaded integer performance without any developer intervention (or minimal intervention, at most). This, not Bulldozer, is what could turn AMD’s fortunes. And this is what I’d love to see more of.

Unfortunately, it’s not slated to be released in Trinity (which will be using VLIW4), so the earliest we’ll be seeing this is 2013. What we’ll be seeing on Radeon 7000-series instead is Fusion System architecture (FSA), basically AMD catching up with compute-oriented architectural developments that Nvidia had put in place in the Fermi architecture, released last year.

Mini-update insert: VR-Zone with a bit more IDF coverage on Haswell overclocking. More fine-grained clocking options sounds yummy, even if I’m not an overclocker (if anything I’m more likely to undervolt instead). To quote the article: “A year later, in early 2013, the pinnacle of Intel’s 22 nm process show off, the initial Haswell processor, is expected to go another step further, where CPU core, GPU, memory, PCI and DMI ratios are all set independently here, on top of fine grain BCLK base clock available within the Lynx Point chipset.”

Sounds like Haswell could be even easier to overclock than Sandy/Ivy Bridge.
« Last Edit: October 14, 2011, 12:50:04 PM by kureshii »

Online kitamesume

  • Member
  • Posts: 7224
  • Death is pleasure, Living is torment.
Re: Your view on AMD's Bulldozer
« Reply #103 on: October 14, 2011, 07:23:46 AM »
^ TL;DR "AMD isn't giving up but still isn't much of a threat to Intel"?

Haruhi Dance | EMO | OLD SETs | ^ I know how u feel | Click sig to Enlarge

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Your view on AMD's Bulldozer
« Reply #104 on: October 14, 2011, 07:38:17 AM »
tl;dr AMD needs to get a better marketing team, and get their top-level management team sorted out. And they (and Nvidia) should stop being complacent with graphics, especially when it comes to GPU features.

Offline TMRNetShark

  • Member
  • Posts: 4134
  • I thumps up my own youtube comments.
Re: Your view on AMD's Bulldozer
« Reply #105 on: October 14, 2011, 01:08:53 PM »
tl;dr AMD needs to get a better marketing team, and get their top-level management team sorted out. And they (and Nvidia) should stop being complacent with graphics, especially when it comes to GPU features.

Meaning what? I think graphics cards have been going forward... although you might mean something different when you say GPU features.

Offline Lupin

  • Member
  • Posts: 2169
Re: Your view on AMD's Bulldozer
« Reply #106 on: October 14, 2011, 01:15:22 PM »
tl;dr AMD needs to get a better marketing team, and get their top-level management team sorted out. And they (and Nvidia) should stop being complacent with graphics, especially when it comes to GPU features.
++
once Intel puts its R&D prowess and money into graphics, all the lead both AMD & Nvidia have built up all this time will be gone in an instant. Intel can afford to fail a couple of times since it has the money, which neither of them does.

Online kitamesume

  • Member
  • Posts: 7224
  • Death is pleasure, Living is torment.
Re: Your view on AMD's Bulldozer
« Reply #107 on: October 14, 2011, 01:42:02 PM »
lol APU Ivy...

Haruhi Dance | EMO | OLD SETs | ^ I know how u feel | Click sig to Enlarge

Offline TMRNetShark

  • Member
  • Posts: 4134
  • I thumps up my own youtube comments.
Re: Your view on AMD's Bulldozer
« Reply #108 on: October 14, 2011, 02:04:09 PM »
tl;dr AMD needs to get a better marketing team, and get their top-level management team sorted out. And they (and Nvidia) should stop being complacent with graphics, especially when it comes to GPU features.
++
once Intel puts its R&D prowess and money into graphics, all the lead both AMD & Nvidia have built up all this time will be gone in an instant. Intel can afford to fail a couple of times since it has the money, which neither of them does.

Wait, that would mean Intel would release $200 entry-level cards along with $500 high-end cards (like a GTX 580). Oh wait, the GTX 580 is already ~$450. So if Intel made a dedicated video card... I swear it's just going to be 2 Core 2 Duos on that fucking card (I know nothing about the differences between CPUs and GPUs. I know CPUs have "threads" and GPUs have "pipelines"... be it shader or pixel.)... Quad-core graphical gaming. The tessellation in that card would make perfect circles in games. :P

Online kitamesume

  • Member
  • Posts: 7224
  • Death is pleasure, Living is torment.
Re: Your view on AMD's Bulldozer
« Reply #109 on: October 14, 2011, 02:37:54 PM »
nao, USB3 videocards! lol, nevermind power consumption, wonder if USB has enough bandwidth to support it.

Haruhi Dance | EMO | OLD SETs | ^ I know how u feel | Click sig to Enlarge

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Your view on AMD's Bulldozer
« Reply #110 on: October 14, 2011, 03:22:45 PM »
tl;dr AMD needs to get a better marketing team, and get their top-level management team sorted out. And they (and Nvidia) should stop being complacent with graphics, especially when it comes to GPU features.

Meaning what? I think graphics cards have been going forward... although you might mean something different when you say GPU features.
The last leap they made was Eyefinity, in the 5000-series. Since then, with the 6000-series we've only seen (feature-wise) full hardware video decode support, Displayport 1.2, HDMI 1.4a (adding 3D stereoscopic support over HDMI 1.4) and colour correction in postprocessing. Sandy Bridge has caught up with hardware video decode support (and overtaken it actually), as well as added colour correction. Ivy Bridge will add HDMI 1.4a, but will remain at Displayport 1.1 support only.

This all sounds like nothing much; Intel hasn’t caught up yet, have they? What worries me is that all the talk about the Radeon 7000-series (Southern Islands) has only been about its compute and gaming capability. It’s all well and good that they’re finally getting serious about the programmability of their GPUs; it would be disingenuous of me to complain about that. But again, couple this with the lack of any commitment to fixing the HTPC-feature issues in Llano pointed out by Anand, and we have no assurance that these graphics features will get any attention when Trinity and Southern Islands are released next year.

“Wait, so what exactly has Intel done features-wise that AMD hasn’t?” you may ask. Hardware transcode, wireless display (WiDi), and IGP triple-display support. Only the first has appeared in retail products so far, but the other two have been confirmed and are just awaiting release.

“But what kind of consumer needs these anyway?” This is exactly what I’m worried about. If AMD continues to neglect IGP features like that, thinking it just needs to filter down enough features from its discrete-GPU product line to keep consumers uncomplaining, we may see Intel come up with the IGP of choice for non-gamers. It’s a laughable thought at this moment, but when you look at how far Intel has come since Clarkdale (just two generations ago), I think there’s good reason to sit up and pay attention.

Let’s put that in a different frame. Late 2008: the X4500 was Intel’s latest IGP. It was doubtlessly shitty. Blu-ray acceleration finally worked, but CPU utilisation during playback was still shitty. Still not worth looking at.

It is now Jan 2010. Clarkdale just came out; it still had absolutely shitty graphics, but at least they had DTS-HD MA finally working! More importantly, it suddenly caught up with the 790GX in some benchmarks. So Intel finally caught up. What’s the worry? The most that happened is that Anand finally considered Intel HD Graphics “HTPC-worthy”.

Fast-forward a year. Sandy Bridge is released. Just about all its HTPC issues have been resolved, except for that niggling 23.976fps playback issue, which wouldn’t work without disabling UAC. More importantly, it has now almost left the 890GX (rebranded 790GX) behind and is catching up with the HD5450, even overtaking it in some games. Half a year later AMD reclaims its IGP crown, but it had to pull in its newest GPU to do that. The HD6450 has been relegated to worse-than-IGP class. More importantly, AMD had to pull out all the stops on the IGP for the first time. It’s a far cry from the speed their dedicated cards are pulling, but keep in mind that the IGP has to share TDP with the CPU, and I don’t doubt that what went into Llano was AMD’s best effort in IGP engineering.

It looks like a string of losses for Intel, doesn’t it? Each time it improves, AMD just filters some slightly newer Radeons into their IGPs and they’re still in the lead. But Intel isn’t just getting better in raw performance, it’s getting better—much, much better, scarily so—in CPU performance as well as power efficiency. Perhaps as a gamer this doesn’t matter much, but keep in mind that with on-die IGPs both the CPU and GPU have to share the TDP budget.

We were hoping that Bulldozer would save Trinity, but looking at its performance and power consumption I’m suddenly not so sure. How much better AMD can make their IGP perform will depend on how much they can shave off the CPU TDP without nerfing it too much (so as to leave more for the GPU), and how much they can reduce power consumption in the VLIW4 Radeon 7000Ds (which are the ones going into Trinity, not the FSA architecture). In the meantime, Intel has already promised up to 60% better performance from the Ivy Bridge GT2 units. AMD promises up to 50% better Gflops in Trinity, but no mention of whether this comes from CPU or GPU or both.

Are you scared for AMD yet? I know in a discrete card shootout AMD would doubtless own Intel many times over, but the IGP game is quite a different one, and that is where Intel and AMD both bring inherent advantages: Intel in power management and power efficiency, AMD in GPU architecture/performance. And Intel’s still catching up on AMD in GPU features. (And still no triple-display support announced for Trinity by the way). Are you scared for AMD yet?

Again, don’t take this as an Intel-pwns-AMD post. I am deathly scared for AMD, not because Intel is suddenly going to become godly at graphics. It’ll take them a few more years to catch up in performance at least, and many more to acquire the kind of experience AMD/Nvidia have, but they are taking IGPs seriously and it shows in their PR. Meanwhile, AMD doesn’t seem scared at all, and continues to spin their PR yarn, keeping mum about details. Time to wake up from your FX-fueled dreams, AMD; FX is no more. Listen to those bug reports, fix them quick, don’t aim to compete only on raw performance; that’s not how the consumer market works.

nao, USB3 videocards! lol, nevermind power consumption, wonder if the USB have enough bandwidth to support it.
http://www.sunix.com.tw/product/vga2788.html

lol APU Ivy...
For now, APU is just AMD’s marketing-speak for “CPU and GPU on the same die”. They’re going to have to bring GCN to fruition before they earn the right to talk about a true “Accelerated Processing Unit”, because right now what they’re doing is no different from Intel: merely putting a CPU and GPU in the same chip, with shared power management. In fact, Intel has gone one step further and gotten the CPU and GPU to share L3 cache; Llano/Trinity will only share RAM.
« Last Edit: October 14, 2011, 03:43:20 PM by kureshii »

Online kitamesume

  • Member
  • Posts: 7224
  • Death is pleasure, Living is torment.
Re: Your view on AMD's Bulldozer
« Reply #111 on: October 14, 2011, 03:40:45 PM »
nao i meant USB videocards where you could offload video decoding to it or something. in a simple way... a real GTX580 connected via USB O.o ok i`m kidding.

Haruhi Dance | EMO | OLD SETs | ^ I know how u feel | Click sig to Enlarge

Offline kureshii

  • Former Staff
  • Member
  • Posts: 4485
  • May typeset edited light novels if asked nicely.
Re: Your view on AMD's Bulldozer
« Reply #112 on: October 14, 2011, 03:53:35 PM »
Here’s an interesting mental exercise: guesstimating Trinity CPU performance.

Some background:

Highest stock clock on Thuban/Deneb:
3.7GHz (X4 980 BE w/o Turbo)
3.3GHz (X6 1100T w/ Turbo)

FX-8150 clock speed:
3.6GHz stock, 3.9GHz 8-core Turbo, 4.2GHz max Turbo

The reference:

A8-3850 – 2.9GHz 4MB L2 → performs similarly to X4 645 – 3.1GHz, 2MB L2

We’ll keep things simple and ignore lightly threaded applications. The matchup:

http://www.anandtech.com/bench/Product/188?vs=434&i=25.26.28.31.34.35.41.42.46.54.55

Keeping in mind that AMD promises a 10% performance improvement going from Bulldozer to Piledriver (a promise we hesitantly assume for now), and knowing:

1) Bulldozer’s performance and power efficiency
2) Trinity’s TDP will likely be kept at 100W max
3) Deneb and Thuban (X4 and X6) parts had 125W TDP as well (though you'll have to google to see how their power consumption compares with Bulldozer)

give a rough prediction of how Trinity is going to perform vs Llano (in terms of % performance increase/decrease) in the matchup.

(Remember that Trinity is a 2-module/4-core part.)
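If you want to put actual numbers on the guess, here’s one way to run the back-of-envelope in Python. The per-core IPC factor and the Trinity clock are pure assumptions on my part (a Bulldozer core roughly matching a Stars core in multithreaded loads, and enough 32nm headroom for a high-3GHz clock within 100W), so take the output with a large grain of salt:

```python
# Back-of-envelope Trinity-vs-Llano projection using the figures above.
# bd_per_core_ipc and trinity_clock_guess are ASSUMPTIONS, not data.

llano_clock = 2.9          # A8-3850 stock clock (GHz)
bd_per_core_ipc = 1.0      # assumption: Bulldozer core ~ Stars core in MT loads
piledriver_uplift = 1.10   # AMD's claimed +10% over Bulldozer
trinity_clock_guess = 3.8  # assumption: achievable within a 100W TDP (GHz)

trinity_vs_llano = (trinity_clock_guess / llano_clock) * bd_per_core_ipc * piledriver_uplift
print(f"~{(trinity_vs_llano - 1) * 100:.0f}% over the A8-3850 (multithreaded, very rough)")
```

Swap in your own clock and IPC guesses; the point of the exercise is seeing how sensitive the result is to each assumption.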
« Last Edit: October 14, 2011, 04:13:04 PM by kureshii »

Online kitamesume

  • Member
  • Posts: 7224
  • Death is pleasure, Living is torment.
Re: Your view on AMD's Bulldozer
« Reply #113 on: October 14, 2011, 05:03:11 PM »
^ We'll need a comparison between BD and Llano first; I mean a 2-module BD vs a 4-core Llano.


Offline TMRNetShark

  • Member
  • Posts: 4134
  • I thumps up my own youtube comments.
Re: Your view on AMD's Bulldozer
« Reply #114 on: October 14, 2011, 05:10:41 PM »
So Kureshii... you're worried that built-in graphics cards are gonna be replaced by IGPs? That's a GOOD thing. Will IGPs affect gaming/enthusiasts' dedicated graphics cards? I haven't seen a single IGP that can rate up to an ATI 5770 or 6850 or a GTX 460/560. Having dedicated graphics is great, but that doesn't mean IGPs are aimed at replacing them.

IGPs are there to help laptops and netbooks use less power while still having amazing graphical performance (for a laptop). Yes, laptops already have "dedicated" video cards... but those laptops run out of juice fast compared to an APU like AMD has.

Online kitamesume

  • Member
  • Posts: 7224
  • Death is pleasure, Living is torment.
Re: Your view on AMD's Bulldozer
« Reply #115 on: October 14, 2011, 05:41:04 PM »
^ Actually, HD5670 / 9600GT-level performance is already plenty for an IGP; those cards can at least play modern games on reasonable settings.
« Last Edit: October 14, 2011, 05:56:40 PM by kitamesume »


Offline Lupin

  • Member
  • Posts: 2169
Re: Your view on AMD's Bulldozer
« Reply #116 on: October 14, 2011, 06:27:46 PM »
So Kureshii... you're worried that built-in graphics cards are gonna be replaced by IGPs? That's a GOOD thing. Will IGPs affect gaming/enthusiasts' dedicated graphics cards? I haven't seen a single IGP that can rate up to an ATI 5770 or 6850 or a GTX 460/560. Having dedicated graphics is great, but that doesn't mean IGPs are aimed at replacing them.
Intel and AMD want to eventually combine the CPU and the GPU, sharing all resources with very little latency between them. There are lots of performance gains in doing that, similar to the improvements when the memory controller got integrated onto the CPU die. If they (or at least Intel) succeed in doing that, discrete cards will eventually become a niche product. You know what happens to prices in a market like that. And once discrete cards become a niche product, how many SKUs do you think will be released?
« Last Edit: October 14, 2011, 06:39:05 PM by Lupin »

Offline TMRNetShark

  • Member
  • Posts: 4134
  • I thumps up my own youtube comments.
Re: Your view on AMD's Bulldozer
« Reply #117 on: October 14, 2011, 07:35:30 PM »
So Kureshii... you're worried that built-in graphics cards are gonna be replaced by IGPs? That's a GOOD thing. Will IGPs affect gaming/enthusiasts' dedicated graphics cards? I haven't seen a single IGP that can rate up to an ATI 5770 or 6850 or a GTX 460/560. Having dedicated graphics is great, but that doesn't mean IGPs are aimed at replacing them.
Intel and AMD want to eventually combine the CPU and the GPU, sharing all resources with very little latency between them. There are lots of performance gains in doing that, similar to the improvements when the memory controller got integrated onto the CPU die. If they (or at least Intel) succeed in doing that, discrete cards will eventually become a niche product. You know what happens to prices in a market like that. And once discrete cards become a niche product, how many SKUs do you think will be released?

Yeah, I can see phones getting like a 2005 computer's power in 5-10 years... but then we need advances in the battery market (cause you know that phone will die in 2 seconds if it tried to play BF1942). Combining the two on the same die does lead to advances in speed, but then are there gonna be top-of-the-line CPU-GPU combos like a 965 BE with a 6850, while another bundle is like an FX-8170 with dual 7980s (or whatever the comparable 6970s of the next generation are)? That would be cool... but wildly expensive, and you'd DEFINITELY need water cooling to even think about overclocking.

Offline ColdFission

  • Member
  • Posts: 77
Re: Your view on AMD's Bulldozer
« Reply #118 on: October 15, 2011, 12:37:34 AM »

Offline mgz

  • Box Fansubs
  • Member
  • Posts: 10562
Re: Your view on AMD's Bulldozer
« Reply #119 on: October 15, 2011, 12:48:22 AM »
So Kureshii... you're worried that built-in graphics cards are gonna be replaced by IGPs? That's a GOOD thing. Will IGPs affect gaming/enthusiasts' dedicated graphics cards? I haven't seen a single IGP that can rate up to an ATI 5770 or 6850 or a GTX 460/560. Having dedicated graphics is great, but that doesn't mean IGPs are aimed at replacing them.
Intel and AMD want to eventually combine the CPU and the GPU, sharing all resources with very little latency between them. There are lots of performance gains in doing that, similar to the improvements when the memory controller got integrated onto the CPU die. If they (or at least Intel) succeed in doing that, discrete cards will eventually become a niche product. You know what happens to prices in a market like that. And once discrete cards become a niche product, how many SKUs do you think will be released?

Yeah, I can see phones getting like a 2005 computer's power in 5-10 years... but then we need advances in the battery market (cause you know that phone will die in 2 seconds if it tried to play BF1942). Combining the two on the same die does lead to advances in speed, but then are there gonna be top-of-the-line CPU-GPU combos like a 965 BE with a 6850, while another bundle is like an FX-8170 with dual 7980s (or whatever the comparable 6970s of the next generation are)? That would be cool... but wildly expensive, and you'd DEFINITELY need water cooling to even think about overclocking.
That's only because you're thinking in current terms and physical items.

As processors shrink, heat output and power consumption generally go down even as the chips get more powerful; as that scaling continues, what seems unthinkable today becomes feasible in a very short period of time.
Just read some of the stuff Ray Kurzweil writes; more or less, he's a futurist and inventor.
He just expects our technological advances to follow the same growth curve they have for some time, which means: look at how far we went in 50 years, from sofa-sized computers with calculator-level power to now. Our components are tens of thousands of times faster and more efficient, and so much fucking smaller.

So just slide the scale in your mind, think about that, and then apply it to a concept like integrated graphics, and realize that graphics can only get so good with the type of viewing we currently use.