What did you think of the AMD Hawaii Tech Showcase today? I really loved how they showed all the games that are optimized for AMD cards; I didn't think there were that many of them, since nVidia has held the performance crown for so long. It made me realize that now that all the next-gen consoles have AMD chips in them, game developers will get a lot more used to designing games for AMD chips. I also loved how they partnered with Raptr to make an app similar to nVidia's GeForce Experience, called the AMD Gaming Evolved App: http://raptr.com/amd It takes performance data from Raptr users running the same hardware, settings, etc., and finds the most optimal settings based on those benchmark results, instead of picking settings just based on what hardware your system has.

Some of the AMD-optimized games they showed that caught my attention were:
1) Thief
2) Star Citizen
3) Crytek
4) Tomb Raider
5) Battlefield 4

Some of the other things I liked: their graphics card with Titan-level performance is going to cost $600 instead of $1,000, and it will have more GDDR5 memory than the GTX 780.
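Roughly how I understand the idea behind the app, as a toy sketch (the data, field names, and logic here are hypothetical illustrations, not Raptr's actual API):

```python
# Toy sketch of crowd-sourced settings picking (hypothetical data, not Raptr's API).
# Idea: instead of guessing from your hardware specs alone, look at the frame rates
# other users with the SAME card reported at each quality preset, and pick the
# highest preset that still hits a target fps.

crowd_reports = [
    # (gpu, preset, average fps reported by users with that combo) -- made-up numbers
    ("R9 290X", "Ultra", 48),
    ("R9 290X", "High", 62),
    ("R9 290X", "Medium", 85),
    ("GTX 780", "Ultra", 45),
    ("GTX 780", "High", 59),
]

PRESET_RANK = {"Ultra": 3, "High": 2, "Medium": 1, "Low": 0}

def recommend(gpu, target_fps=60):
    candidates = [(PRESET_RANK[p], p, fps)
                  for g, p, fps in crowd_reports
                  if g == gpu and fps >= target_fps]
    if not candidates:
        return "Low"
    return max(candidates)[1]  # highest-quality preset that still meets the target

print(recommend("R9 290X"))  # -> "High"
```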
I've been reading opinions about the new Hawaii chips; here are some interesting articles I came across that are good reads: http://semiaccurate.com/2013/08/07/amd-to-launch-hawaii-in-hawaii/ http://www.anandtech.com/show/7368/amd-gpu-product-showcase-live-blog
Well, duh. They wouldn't be competitive if they couldn't beat the current $600 card at the $600 price point.
Well, AMD hasn't been competitive vs Intel for a long time now, and their video cards were really only a good option if you were on a limited budget; they couldn't compete with nVidia's flagship cards. It's good to see that they're finally competitive again. I miss the days when it was ATI vs nVidia, because they really were toe to toe back then.
Final specs:

Radeon R9 290X
- 4 independent tessellation units
- ~3,000 stream processors
- 512-bit memory interface
- 4GB video memory
- DirectX 11.2

The Titan has 6GB of GDDR5 on a 384-bit bus. IMHO, AMD will lose no matter what people say. As soon as the AMD card comes out, Nvidia will drop prices and come out with an updated GTX 780. Plus, a lot of people have already upgraded to the current generation of graphics cards.
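For some perspective on what 512-bit vs 384-bit actually means, peak memory bandwidth is just bus width times data rate. The memory clocks below are assumptions, since the final 290X clocks weren't confirmed at the showcase:

```python
# Back-of-envelope peak memory bandwidth: bus_width_bits / 8 * data_rate (GT/s) = GB/s.
# The 290X memory clock is an assumption here; the Titan figure uses its known 6 GT/s GDDR5.
def bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

print(bandwidth_gbs(512, 5.0))  # assumed 290X: 512-bit @ 5 GT/s -> 320.0 GB/s
print(bandwidth_gbs(384, 6.0))  # GTX Titan:    384-bit @ 6 GT/s -> 288.0 GB/s
```

So even if AMD runs the memory slower, the wider bus can still give it more raw bandwidth.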
AMD can compete plenty well in the CPU space because people often don't want to fork out an extra $100 for 5 frames per second better performance, and in video rendering, which is a perfect example of a highly multi-threaded workload, AMD beats Intel in nearly every situation when comparing the two mainstream platforms. I can honestly say that, owning both an FX-6300 and an i5 3570K, I have only encountered a single situation where I could tell the difference between the two systems in any way, and that situation is so far out there that no one should ever really see it.

On graphics cards, AMD has done very well for a long time, continually trading the crown with Nvidia. Many of the 7000 series cards are actually far better than their Nvidia equivalents; you just don't see it in a lot of the really old reviews, because those don't use the drivers that gave the 15% performance jump to the 7k series. The only situation where Nvidia has clearly been the better choice in every generation so far is multi-graphics-card systems. AMD is working hard on fixing that, but it is a long way from beating Nvidia there.

AMD graphics cards had 1-2 bad years when the cards that were already in development when AMD bought out ATI did not do very well because of merging complications. Those would be the 2k and 3k series. Starting with the first AMD-made generation, the 4k series, the graphics cards have done very well. "The" card to have back at that point was the 4870 and then the 4890. Nvidia had no answer whatsoever and was flailing helplessly for almost 2 generations before finally coming back hard with Fermi-based cards. Do you even remember the GeForce 100 series? A few remember the terrible GeForce 200 series cards, the power-sucking, heat-dumping, buggy behemoths. Then came the stopgap GeForce 300 series to keep OEMs happy and pretend they had a product. Finally, 4 generations later, Nvidia got its feet back under it with Fermi and the GeForce 400 series: the GTX 460, 470 and 480.

The whole argument about AMD not being competitive is nonsense. Many new games are AMD-optimized because AMD graphics chips power the Wii U, XBox 360, XBox One, and PS4, and AMD CPUs power the XBox One and PS4. Developers are forced to massively optimize for highly parallel architectures now because AMD is only providing ~1.6GHz cores, but 8 of them; the only way to get good performance is to use all the cores you have available. The graphics chip in the new consoles is crap compared to our dedicated cards, but it is still something like 15x more powerful in real-world usage than the old generation of console graphics chips.

On the CPU side again, if you compare strictly by flops between generations we see a big problem: the new CPUs only provide something like 22 gigaflops, and that is from all 8 cores. The last-gen XBox, for example, theoretically provides about 115 gigaflops. The catch, though, is that the way the old CPU is designed and coded for gives real-world performance that is pitiful to even look at, something closer to 15 gigaflops in actual usage. So the new CPU should be just slightly more powerful, but only when the game is properly coded for multi-threading. Compared to the Cell processor in the PS3, though, it is a HUGE step down, but much easier to code for.
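If anyone wonders where a figure like "about 115 gigaflops" comes from, theoretical peak is just cores x clock x flops per cycle. Quick sketch below; the flops-per-cycle value is an assumption about the vector units, which is exactly why quoted peak numbers vary so much and why real-world results are so different:

```python
# Theoretical peak GFLOPS = cores * clock_GHz * flops_per_cycle_per_core.
# flops_per_cycle is an assumption about the SIMD/vector units, so treat this as rough.
def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

# Xbox 360's Xenon: 3 cores @ 3.2 GHz, usually credited with 12 flops/cycle via its VMX units.
print(peak_gflops(3, 3.2, 12))  # -> 115.2, the "about 115 gigaflops" figure above
```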
So you think that Nvidia will respond with a GTX 790 (essentially two GTX 780s)? I doubt they would match the $600 price point if they did that, though, and the new Nvidia cards aren't scheduled for release until January 2014, which means AMD will have the crown for 3-4 months. What did you guys think about the idea of AMD Mantle? http://www.techspot.com/news/54134-...y-mantle-api-to-optimize-gpu-performance.html http://seekingalpha.com/article/1714022-mantle-is-the-most-interesting-announcement-at-advanced-micro-devices-gpu14-event?source=yahoo http://www.tomshardware.com/news/amd-mantle-api-gcn-battlefield-4,24418.html
Nvidia's response that will beat these top-end cards with a single GPU at the same price with better performance will be the Maxwell cards, which are late and will be out in the 1st quarter of 2014. A GTX 790 would be a stopgap to maintain superiority, but unless Nvidia has already been working on such a card "just in case," it won't happen in a timeframe that makes any sense, since Maxwell cards would be out before such a product could be designed, tested, and manufactured. It would also make Titan owners very mad, because the card would sell for probably about $1,100, which is Titan range but would be nearly equivalent to running dual Titans. Pulling a stunt like that would bring them more hate than any benefit from keeping the performance crown a couple of months longer, when Nvidia will just take it back anyway in short order. But then again, Nvidia loves to piss off anyone and everyone possible in the name of money, so it wouldn't be unrealistic to think them capable of it.
I don't understand the mindset of "You screwed me by releasing a new thing 6 months later." It's 6 months later; computer shit is always replaced in a short time frame by something with better performance/$. If you buy something near the end of its "new" cycle, you should expect a price drop or a better product to come soon.

As for AMD's news: yay, new faster cards, but nothing particularly interesting about the cards themselves. A refresh of GCN, more efficient (thank goodness). The sound processing is neat. What has the potential to be really interesting is the Mantle API. I didn't get the big deal they made about 4K displays... it's just a new resolution; why is that somehow more difficult than it's ever been?
True, but people seem to get quite mad if their new $1000+ toy is replaced by something much better for the same price or barely anything more if it is within a year of release.
GPU announcements are nice, but the market hasn't been all that interesting for a number of years due to similar performance between the companies. I am eagerly awaiting the next AMD CPU announcements, though.
Well, I just hope AMD comes back really competitive. I still have 5850 cards in my 2 other PCs; only the new one has a GTX 660 Ti. I haven't used an AMD CPU in aeons, though; there is still quite a difference vs Intel if you take CPU-heavy games like Total War or PS2. If they do their shit right, I might build an AMD-based system next year; that'd be fun. Combined with ASUS ROG it'd be full black & red.
AMD has all but given up on keeping pace with Intel in the higher-performance ranges. The next-gen FX CPUs don't seem to exist, and Steamroller is only on 28nm... Intel is going to be selling 14nm chips next year! They'll be a full 2 nodes ahead. Of course, powerful CPUs may matter less, or rather AMD wants to make them matter less; that's the reason for Mantle. But being so far behind on the process is really bad for efficiency. My only knocks against AMD GPUs are their efficiency and their drivers, which I've never been very happy with. They're nice and competitive in general.
This article covers everything I was thinking last night about Mantle when I heard about it: http://www.anandtech.com/show/7371/understanding-amds-mantle-a-lowlevel-graphics-api-for-gcn
The design dimensions are important, but CPU architecture is a much larger factor. People get caught up on things that don't matter. The relative gains between process jumps haven't been that huge (compare Sandy Bridge to Ivy Bridge). The power reduction is nice, but heat is becoming a big issue in Intel chips. In my personal opinion, AMD has a much stronger architecture once it is sorted out.
Unfortunately, AMD is in a bit of a bind: the SOI process advantage is quickly running out, which is one of the reasons for switching to bulk. Intel is already used to bulk and has its designs down well; AMD will be starting fresh. Also, due to international decisions, AMD must change its entire designs from gate-first over to gate-last, another large change where Intel has already been doing it for years and AMD must play catch-up. Sorting out the architecture and manufacturing process may be more complicated than most people estimate.
Interesting. I just did some reading inspired by your post. It seems Q1 2014 will be make or break for AMD on the CPU side.
Power is heat. Intel maybe has a "temperature problem" because they used a crappier thermal interface material; to call it a heat issue with the architecture or process is really wrong. And it really just doesn't matter that much except for the teensy tiny fraction of people pushing the CPUs past spec anyway (and even then, Intel is far more amenable to that, IMO, because of much lower heat output to begin with).

It's not simply design dimensions/die area that are affected by the shrink (though it is a great way to reduce material costs). You're altering the properties of the transistors such that they need lower voltage, mostly. As P = f*C*V^2, you can see why that's a really big deal.

Whether AMD's architecture is better or not is up in the air to some extent. The reality is that most workloads do not respond linearly to parallelization. Graphics stuff generally does: lots of things need to get done, and you can do them all at once, so do more at once. But it doesn't matter if you can do all the steps at once if they have to be done in a sequence; then you have to go faster.
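Quick illustration of why the V^2 term is such a big deal. The capacitance, voltage, and clock numbers below are made up; only the ratio matters:

```python
# Dynamic switching power: P = C * V^2 * f (capacitance * voltage squared * frequency).
# Illustrative values only; the point is the quadratic dependence on voltage.
def dynamic_power(c, v, f):
    return c * v**2 * f

base = dynamic_power(1.0, 1.25, 3.5e9)
lower = dynamic_power(1.0, 1.00, 3.5e9)  # same clock, 20% lower voltage
print(lower / base)                      # -> 0.64, i.e. ~36% less switching power
```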
Power is not exactly heat, although the two are usually linked. The formula you posted has to do with actual power consumption, and power is dissipated as heat, but there are other factors that affect heat output, such as the thermal density of the design. We can see from the Sandy to Ivy switch that they have similar power envelopes, yet Ivy runs hotter. This is partly because of moving from solder to TIM, but it is also that Intel is at such a small process node that their thermal density is really high, and this is compounded by the FinFET 3D design in their processors allowing more power through a smaller area.

At stock speeds and voltage, you are right that Intel does fine, and you are right that it helps a lot with the power consumption of their processors. But even small changes in voltage drastically increase heat output, because everything is so dense and heatsinks have a very hard time moving the heat away. All objects are easier to cool if the heat is spread out more, because you can make more efficient use of the larger contact patch. As things get smaller, your method of cooling must also change: you have to find a way to cool a single smaller area more efficiently. Unfortunately, standard heatsinks are not very good at that, because the design concept of modern heatsinks is the opposite of what needs to be done. We have been right at the tipping point these last couple of years: smaller process nodes give better speed for less voltage, which should improve OC potential, but we are getting so small that the shrinks are causing OC headroom to decrease faster than it increases, due to thermal density and the lack of cooling efficiency.

Intel is at its limits for how much performance it can squeeze out of the Pentium 3-derived architecture, and it will need to do a complete redesign from the ground up to get significantly more performance from its processors. Or Intel can instead do what it has been focusing on, which is improving power consumption to become more relevant in the mobile space. Still, no matter how much anyone loves or hates Intel, it is very impressive that they could take an architecture so far and tweak this much performance out of it for this many years.

EDIT: Intel could also hope that new materials mature enough to use, as some of the up-and-coming materials can be switched faster for less power, and this would also bring better performance to their processors even without really changing the architecture.
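To put rough numbers on the thermal density point (die sizes and TDPs below are approximate, package-level figures and ignore hotspots, so treat this as showing the trend, not exact data):

```python
# Rough thermal density comparison: TDP divided by die area.
# Die areas and TDPs are approximate public figures, used here only to illustrate the trend.
def watts_per_mm2(tdp_watts, die_area_mm2):
    return tdp_watts / die_area_mm2

print(watts_per_mm2(95, 216))  # Sandy Bridge quad (~216 mm^2, 95 W) -> ~0.44 W/mm^2
print(watts_per_mm2(77, 160))  # Ivy Bridge quad   (~160 mm^2, 77 W) -> ~0.48 W/mm^2
# Lower total power, but the heat is packed into a smaller area, and the actual
# hotspots (the cores themselves) are denser still, which is why it is harder to cool.
```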
Fewer volts isn't the problem (it's beneficial, even); the issue is thermal density.

Edit: Enigma just posted before me and covered it like a boss.