Probably going to buy a new rig soon. Have a $1.3k budget. So far, the build looks like: Intel Core i5-3570K (3.4/3.8 GHz) ($219), GA-Z77X-D3H ($135), Team Dark Series 8GB (2 x 4GB) 240-Pin DDR3 ($60).
Why not go with the Haswell 4670K? It's 20 bucks more (and actually $20 off on Newegg if you order it by tomorrow). Also, wow, DDR3 prices are still way up. I thought Hynix was back up and running. I've never heard of that particular brand, though.
This is how I would break down your budget:
CPU - 20-30%
Video Card - 20-30%
Memory - 10-15%
Motherboard - 15-20%
Try to reuse what you already have, like a case, power supply, and hard drives. If not, then use the lower range of my percentages. Wait till December at least before buying anything.
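To put rough numbers on those percentages, here's a quick sketch for a $1,300 budget (the dollar ranges are just the percentages above applied to the total, so treat them as ballpark figures, not recommendations for specific parts):

```python
# Rough budget split for a $1,300 build, using the percentage
# ranges above. Reusing a case/PSU/drives lets you aim low.
budget = 1300

ranges = {
    "CPU":         (0.20, 0.30),
    "Video Card":  (0.20, 0.30),
    "Memory":      (0.10, 0.15),
    "Motherboard": (0.15, 0.20),
}

for part, (lo, hi) in ranges.items():
    print(f"{part:<12} ${budget * lo:>4.0f} - ${budget * hi:>4.0f}")
```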
The Hynix fire had nothing to do with DDR3 prices, since that facility manufactures the VRAM used in Nvidia graphics cards. Prices are up just because companies slowed down their production, because they were not making very much money. If you can do what FriendlyFire said, then you could get away with a 4670K, a nice heatsink, a GTX 760, a $150 motherboard, 16GB of RAM, and a 250GB Samsung EVO SSD within your budget. TeamXtreme makes some pretty good RAM, not very popular though. I have some Team Dark sticks I like.
I'll try to get an entire build together after work today based on what is currently out and current prices. If you have any money available during Black Friday and Cyber Monday, then you should keep an eye out for any parts you want that will be on sale at better prices.
I was going to wait for the release of the next-gen hardware before buying. I figured that with the next-gen stuff coming out, prices on older-gen stuff would be massively depressed for a bit.
AMD's new cards just came out, and Nvidia is delayed six months or so. Intel has nothing good coming out soon. AMD's next processor is due out mid-January to February, but it is likely to be a quad core at most. Still, it's best to wait and see what AMD has up its sleeve, since we will know within two weeks.
Think it's worth getting the FX-9370 or FX-8350? I've been reading that octa-core processors aren't fully utilized in games. (I'd also be using this for engineering simulation and modeling.)
Depends on whether your engineering and modeling software uses a lot of processor threads or not. That is something you definitely need to check on, as well as finding out whether it uses CUDA or OpenCL to let the GPU do a lot of the work. That info will decide what parts you need to go with. If your software doesn't use the GPU but does use 8+ threads, then an FX-8350 may not be a bad way to go. If it only uses 1 or 2 threads, then staying with Intel would be a better choice.
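If you're not sure how your software scales, a toy test like this shows the general idea (this is only an illustrative Python sketch with a made-up workload; the real answer has to come from benchmarking your actual simulation package):

```python
# Toy check of how a CPU-bound task scales with worker count.
# If the time barely improves past 2 workers, extra cores won't help.
import time
from multiprocessing import Pool

def crunch(n):
    # Stand-in for a CPU-heavy chunk of work.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    work = [2_000_000] * 8
    for workers in (1, 2, 4, 8):
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(crunch, work)
        print(f"{workers} workers: {time.perf_counter() - start:.2f}s")
```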
Previous-generation Intel CPU prices don't tend to tank. The 2xxx and 3xxx stuff is now on an outdated socket, but prices haven't dropped precipitously (as you can see, the 3570K is only $20 cheaper than what's "replaced" it in the lineup). I don't think it's ever a particularly good idea to buy previous-gen hardware unless it's some kind of screaming deal on the highest-end stuff. For calculations that are easily parallelized, the GPU (or something like a Xeon Phi) will be better than a single CPU, and if it's not very parallel, Intel is going to be faster than AMD. I'd stick with Intel just because it won't get nearly as hot in a 100%-pegged situation. What exactly are you modeling? Most of the "real" stuff is either pretty easy, so you don't really need to worry about it since good gaming hardware is sufficient, or it's really quite hard and you need to be looking at something more specialized than a gaming box.
Intel processors get far hotter than AMD. Not that heat has ever actually mattered to me. Just saying.
Intel puts out less heat, to the tune of 140 fewer watts compared to an FX-9xxx. Thus things get less hot. Nitpick all you want about die temperature.
Temperature ≠ power draw. It is hardly nitpicking when, in order to compare temperature, you actually have to look at the temperature. The FX-9xxx series runs at about the same actual temperature as most Intel CPUs, slightly hotter when comparing stock to stock. Anyone who is considering that processor doesn't care at all about heat or wattage, though, and it is in a completely different category than the CPUs you are trying to compare it to.

These are the parts I came up with in your budget, Trakaas. It still depends on what your software is capable of using, so you need to find that out before you buy anything. But as kind of a rough-draft build, here ya go:
http://www.newegg.com/Product/Product.aspx?Item=N82E16811146078
http://www.newegg.com/Product/Product.aspx?Item=N82E16827135204
http://www.newegg.com/Product/Product.aspx?Item=N82E16817151118
http://www.newegg.com/Product/Product.aspx?Item=N82E16813128592
http://www.newegg.com/Product/Product.aspx?Item=N82E16819116899
http://www.newegg.com/Product/Product.aspx?Item=N82E16835103099
http://www.newegg.com/Product/Product.aspx?Item=N82E16820313372
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131499
http://www.newegg.com/Product/Product.aspx?Item=N82E16820147248
http://www.newegg.com/Product/Product.aspx?Item=N82E16822236339
Subtotal: $1212.81
After tax and shipping it should end up around $1300.
It'd be a solid build. But on temperature: why does this consistently come up? It's like nobody understands what watts or joules or specific heat capacity are, let alone the fact that energy has to be conserved. The CPU can be 30 degrees cooler and it doesn't matter: 140 W more means "stuff" has to get hotter even if the die stays cool. Something else has to get warmed up instead; the energy doesn't just disappear. So the whole universe ends up hotter with a 220 W CPU than with an 84 W CPU, and the cooler on the higher-power CPU has to work harder per unit of surface area. If you put it in a medium-size bedroom you will absolutely notice the difference in room temperature between the two, regardless of what the actual CPU temperature is; that just doesn't matter much. It's absolutely something to consider, especially if it's going to do prolonged number crunching. He's said he's looking at them already, and he hasn't said a thing about heat or noise, so who knows what he cares about? They're faster; it's a tradeoff to look at. Maybe he lives in a nice 70°F year-round cave and it doesn't matter. I don't know.
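Just to put a rough number on that (a back-of-the-envelope sketch; the room size and run time are made-up assumptions, and a real room leaks heat through walls and doors, so the actual rise would be much smaller, but the scale is the point):

```python
# How much does an extra 140 W warm the air in a sealed,
# perfectly insulated bedroom? (Unrealistic, but shows the scale.)
extra_power = 140.0    # W, difference between a ~220 W and ~84 W CPU
hours = 2.0            # assumed number-crunching session
room_volume = 30.0     # m^3, roughly a 12 m^2 bedroom (assumed)
air_density = 1.2      # kg/m^3
c_p = 1005.0           # J/(kg*K), specific heat of air

air_mass = room_volume * air_density        # ~36 kg of air
energy = extra_power * hours * 3600         # joules dumped into the room
delta_t = energy / (air_mass * c_p)         # resulting temperature rise
print(f"{energy / 1000:.0f} kJ -> about {delta_t:.1f} K warmer")
```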
There doesn't necessarily have to be a build-up, though. Yeah, more power is consumed, but it could be more efficient, i.e., release less waste heat.
Almost all of it comes out as heat. The work being done is moving electrons, spinning things, and orienting magnets, all of which turn into heat in the end. Your computer isn't storing energy or doing much work on its surroundings (it does move some air), but it is consuming power. By a simple energy balance we know it has to be coming out as heat.
I see what you're saying, but something there sounds wrong to me. The energy that goes into a system is essentially divided into two parts: output work and output heat. The work is the computation (no temperature change); the heat is all the energy wasted to achieve the desired output (increased temperature). I could be completely off on my thermodynamics, but I seem to recall that's how things work. Then again, I've never studied thermodynamics in relation to nano-scale microcircuitry, so I might be wrong.
I would think Trakaas is right on this. The work produced would in this case be the billions of transistors switching, the gates opening and closing, etc., inside the CPU. That energy can be transferred to the "mechanical" part of what goes on inside the CPU instead of 100% of the input energy coming out as heat. If all processors simply had all their power come out as heat, then how does an FX-9590 run at the same temperature as an i7-4820K when that is comparing a 220 W TDP to a 130 W TDP?
Mechanical switches might actually be a decent analogy: you use power to flip switches to do computation. When a mechanical switch flips, it has to gain some kinetic energy to move. It loses some of that to friction as it is moving (heat). Then, when it reaches the end of the flip, it has to dissipate that kinetic energy into something to stop moving: deformation of the mechanical components, heat, or sound/vibrations. You can make a more efficient switch by eliminating more and more friction and mass from the moving parts, but you can't get rid of it entirely, and it always ends in heat. You can add energy recovery to the switch somehow, but thermo says that always has to be somewhat inefficient, and the losses have to become heat.

But that's technical stuff you don't need to worry about to see it has to be heat. Do the energy balance on a computer system: In - Out = Accumulation. Is it accumulating energy? That'd mostly be silly, right? It'll warm up a bit but eventually reach a steady-state temperature, so it doesn't accumulate heat over long times (if it did, it would melt). Sure, there's some wear on the mechanical parts, or even electromigration in some things, but that's not a ton of power; otherwise it'd turn into some kind of spectacularly dangerous store of energy after being on nearly 24/7 for years.

You can measure what it takes at the wall for "In". So for "Out": what work is the computer doing on the outside environment, or what perpetually gains potential energy inside the box, to offset its intake of power? It moves some air, but certainly not hundreds of watts' worth. It makes a bit of noise, again not hundreds of watts (and actually both of those things end up as heat in the end). The rest has to be heat.
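To make that balance concrete, here's the tally with some illustrative guesses for the "Out" terms (the wattages below are assumptions for the sake of the example, not measurements):

```python
# Steady-state energy balance for a PC under load:
# In - Out = Accumulation, and at steady state Accumulation = 0,
# so everything going in must come out somehow.
power_in = 400.0        # W at the wall under load (assumed figure)

work_out = {
    "air moved by fans": 3.0,   # W, a generous estimate
    "sound/vibration":   0.1,   # W, audible noise is tiny in watts
    "light from LEDs":   0.5,   # W
}

heat_out = power_in - sum(work_out.values())
print(f"Heat rejected to the room: {heat_out:.1f} W "
      f"({100 * heat_out / power_in:.1f}% of input)")
```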