I mean in campaign mode, 1440p, full settings. I'm at 20fps with my GTX 780 and i5 3570k, that sucks... I think it's CPU related, tho. When I end my turn I'm down to 4fps and that's quite the hit. I wonder if you need an IBM Z10 or something to run this game.
I'm not even sure it's resolution related; I think it's because of the bazillion AI calculations. Battles aren't much better, tho.
You're at 60FPS on Attila, Erock? I took a (quick) look at reddit and stuff, but folks keep saying it's because TCA messed up by reusing the same old engine since Shogun 2, and it uses only one core to run the whole game. Guys with dual Titan Zs can't even run it at that fps lol. So I guess it's probably your i7 at 4.4 GHz; I should try to OC my i5 to 4 GHz, that should help, but I'll have to find the right settings. Because, yeah, it pretty much destroys any config... http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/12 http://www.reddit.com/r/totalwar/comments/2vqx92/total_war_atilla_benchmarks/ Still, I think it's silly that they only publish GPU benchmarks when it's a CPU-heavy game... I'm still looking at that IBM mainframe tho, might go host the game on one from work and stream from there. Those gifs made my day too.
Found a CPU benchmark too, but they don't say what GPU is in it http://forums.overclockers.co.uk/showthread.php?t=18655871 no X99 either...
TW Attila is shit in terms of optimization. To see even a slight difference in fps, you might need to OC your 3570k to 4.4/4.5 GHz. Also, 1440p settings need way more GPU power in Attila than in any other game out there (again, shit optimization). Also, Attila eats GTX 780s for breakfast. OC your card as close to 780 Ti speeds as you can to gain performance. Bump down to 1080p, and completely kill shadows and AA.
Seems like it hits both CPU and GPU pretty hard. Probably best to run on lower settings at 1440p, or somewhere between min and max at 1080p. The game has to use more than 1-2 cores though, look at the max performance benchmark graph: the 4-core, 4-thread processor at 4.5GHz gets 7 fps lower than the same speed with 8 threads. The extra 2MB of L3 cache cannot make that big of a difference on its own. The game definitely needs them jiggahertz, though.
Someone should teach TCA how to code a damn game then. Also, GeForce Experience is kinda not working, as the "optimized" settings are too high. My 780 is about the same power as a stock 780 Ti since it's a factory-OC'd one. I'm around 40fps now with mixed settings, but the gain isn't great even at the lowest settings, as the calculations kill the CPU in any case.