Would you still play new console games if they were developed only using the console hardware? What I mean is: instead of using PCs and other computers, developers would use the consoles themselves to develop the games. Do you think the games would be worth buying? Would PC games be preferred (better graphics, options, etc.)? Would the overall quality of console games improve (no porting problems, no optimization problems)?
Aren't most PC games nowadays ports of their console siblings? I understand what you mean by using the console hardware, but other than a few tweaks and whatnot, isn't that what a console-to-PC port is? Despite what folks say about consoles, I still play my 360, maybe not as much as I used to. And despite the hardware in the current-gen consoles, they put out some decent-looking games! Some look and play really well; I don't know why they get harshed on so much. Wait, what were we talkin' about again?
I grew up on console games. The big difference I found between console and PC is that computers were connected to the internet far before a console ever was. This drove most people to the PC for online games and games you play with friends while not together. As of this past generation, though, consoles are connected to the internet too. Even so, it seems to me there are quite a lot of people who still play on PC because of the mouse, and consoles are starting to break into that territory as well with the Wii and PlayStation Move. I'm not sure if PC gamers would like that or not, but it's the same concept. I personally think they wouldn't like the Wii, since in my experience the controls suck; that style of mouse-like control usually gets flak from many FPS gamers. I haven't used the above "contraption" since it just came out recently, but it's something to look at. From a price standpoint, with a console you get better performance for the same money. However, most PC gamers are used to a $40 price tag, whereas console games have been $60 for quite a long time. Recently, though, I've seen quite a few PC games pretty much say screw that and charge $60 anyway.
Console developers already use dev kits to make games optimized for the console... I don't really understand what you're asking. Do you mean for PC titles like Dragon Age and Mass Effect? If so, yeah, sure, why not make them using the console kit? They get more sales in that area, so why not optimize for it? If they can find ways to minimize system requirements, all the better!
Your question is very confusing. I hope you didn't mean actually making the 3D models and animations on a console. I've felt lately that consoles have been holding back the next generation of graphics. Because games have to be ported to PS3/Xbox/PC, they have to fit on each one's hardware, which means you get overall worse quality because the Xbox 360 is so old.
Simply: if a game was developed on a console, with only that console's hardware utilized in development, would you still play it? I'm sorry if that seems confusing. Whenever I play my consoles I feel like, "Wow, the devs must have had some great ideas, but since they had to limit/optimize/tone down the game, it really isn't as good as it could be." Or possibly the game ended up poorly implemented because the hardware is "bad." Maybe the "polish" and quality of console games is just very low, or are my standards that high?
But the tech requirements to make a game are much higher than the requirements to play it. You couldn't run 3ds Max or Maya very well on Xbox 360 hardware, or, God forbid, a Wii.
It isn't that companies want to make a game on PC and then have to dumb it down for consoles. 90% of games are made on an engine optimized for consoles (sadly) and get ported to PC, so they are already optimized for a console. What makes image quality suffer is that a console's hardware is far inferior to a PC's. The Xbox 360 has 256MB of vram, which means all texture data must fit into 256MB. That limits you a lot. In addition, a console has to be low power so it isn't loud and doesn't overheat with a low-output fan; that limits performance, since high performance means lots of heat and loud cooling for the time it's made. It isn't the PC limiting anything; a PC frees developers to make amazing things. It's the console itself that limits everything. And to maximize profits, games are made for the lowest-end major console, the Xbox 360. Thus PC and PS3 graphics suffer because of how old the 360 is.
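Just to show how fast a fixed texture budget fills up, here's a back-of-the-envelope sketch (taking the 256MB figure above as given for the sake of argument; the texture sizes are illustrative, not from any real game):

    #include <cstdio>

    int main() {
        const double budgetMB = 256.0;  // the texture budget assumed above

        // One uncompressed 2048x2048 RGBA texture: 4 bytes/pixel,
        // plus roughly 1/3 extra for the full mipmap chain.
        const double baseMB     = 2048.0 * 2048.0 * 4.0 / (1024.0 * 1024.0); // 16 MB
        const double withMipsMB = baseMB * 4.0 / 3.0;                        // ~21.3 MB
        printf("Uncompressed 2048^2 textures that fit: %.0f\n", budgetMB / withMipsMB); // ~12

        // DXT1 compression is 8:1 versus uncompressed RGBA, which is why
        // console games lean so heavily on compressed formats.
        printf("DXT1-compressed textures that fit:     %.0f\n", budgetMB / (withMipsMB / 8.0)); // ~96
        return 0;
    }

A dozen uncompressed hero textures and you're done, which is exactly why everything gets compressed and downsized on console.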
There are a ton of other, more important reasons why "next gen" games are held back, and it hasn't got much to do with the Xbox. I suggest exploring the topic some more on your own.
What you say is exactly what I feel about PC games, not the console ones. PC games lack the polish of console games, and few are actually smooth enough to make you forget their PC heritage. Most feel clunky, rough around the edges, and inconvenient. You are still not being precise enough, though. What I describe concerns the playability and ergonomics of the game. Are you limiting yourself to a purely resolution & AA point of view?

Okay, what an amazing bunch of crap this is. I can't believe how ignorant you guys are. How about looking up, say, Wikipedia before opening your mouth. For starters, the Xbox 360 has 512MB of unified memory. If you don't even know how much memory it has, just don't talk; you'll save yourself embarrassment. (The PS3 is the one that has only 256MB of vram on one hand, and 256MB of system memory on the other.) Second, you don't even actually know what quantity of vram your PC game uses on your graphics card. If you think your game is programmed to always use all the vram it can find, you are wrong again. I would really like to find which current games actually use the 2GB of vram on my AMD 6970. (Hint: surely not TERA.) Third, vram requirements are directly related to resolution and AA. Consoles have no use for 5040*1050 and up to 8xAA; my AMD 6970 does. Fourth, you simply have no idea how much having a fixed, single hardware target (360 or PS3) boosts performance over having to program for two different architectures (ATI & nVidia), themselves with a lot of different subcases (the entire range of both ATI & nVidia cards) and different capabilities (shader number and type, vram amount, ...). This comment belongs on a fanboy site, not here. It is false and incorrect on so many levels. I suggest educating yourself on the architecture of both consoles and the PC, learning about games that fully exploit either of the two consoles, and then starting to program yourself.
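For what it's worth, the resolution/AA point is easy to put numbers on. A rough sketch of color-buffer size alone (ignoring the depth buffer and any compression the hardware applies, so real usage differs):

    #include <cstdio>

    // Rough color-buffer footprint: width x height x 4 bytes, times AA samples.
    double framebufferMB(int w, int h, int aaSamples) {
        return double(w) * h * 4.0 * aaSamples / (1024.0 * 1024.0);
    }

    int main() {
        printf("720p, no AA:      %6.1f MB\n", framebufferMB(1280, 720, 1));   // ~3.5
        printf("720p, 4xAA:       %6.1f MB\n", framebufferMB(1280, 720, 4));   // ~14.1
        printf("5040x1050, 8xAA:  %6.1f MB\n", framebufferMB(5040, 1050, 8));  // ~161.5
        return 0;
    }

A console rendering at 720p simply never needs the buffer sizes a multi-monitor PC setup does.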
The PS3 has 256MB of system RAM and 256MB of vram. The difference between the 360 and the PS3 is the 360's memory being unified: the 360 has more potential RAM available to games, though that depends on how much RAM the console's OS uses itself. Some of this info is outdated (the 360 now has built-in wifi, the PS3 now has rumble, etc.), but the specs are pretty much the same.
The actual specs are irrelevant. The Xbox 360 was released in 2005. That is ancient. PCs have far surpassed the 360 in terms of hardware. This means you can release higher-quality games for the PC because you are not limited by the hardware as much as you are on the 360. For example, DX11 has been around for a while now and is actually affordable on PC these days, yet the consoles do not support it. This is one reason I say consoles hold back the growth of game development. I dislike that you call these comments fanboyism. I love consoles and play a variety of games on them, but I just miss games like Crysis 1, where they pushed the envelope of graphics. Then they dumbed down the settings for Warhead so they could sell it on the 360 as well. It's not a terrible thing; the game still looked good, but not as good as the original, and certainly not better.
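To make the DX11 point concrete: on PC a game can ask the driver at runtime which feature level the hardware supports and scale up accordingly, something a fixed console can't offer. A minimal sketch using the standard D3D11CreateDevice call (Windows-only; this is just the feature-level query, not a renderer):

    #include <d3d11.h>
    #include <cstdio>
    #pragma comment(lib, "d3d11.lib")

    int main() {
        // Ask for the highest feature level the hardware supports, in
        // descending order. DX11-class cards report 11_0; older parts
        // fall back to 10_0 or 9_3 (roughly console-era capability).
        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3
        };
        D3D_FEATURE_LEVEL got;
        ID3D11Device* dev = nullptr;
        ID3D11DeviceContext* ctx = nullptr;
        HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                       wanted, 3, D3D11_SDK_VERSION, &dev, &got, &ctx);
        if (SUCCEEDED(hr)) {
            printf("Max supported feature level: %#x\n", (unsigned)got);
            ctx->Release();
            dev->Release();
        }
        return 0;
    }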
Developing console games on a console instead of a PC (or equivalent) wouldn't improve anything. It might make them worse, since the required tools don't exist on a console (is there an IDE and C compiler for the PS3? 3D modelling tools? Sound and video editing applications?), so you'd have to roll your own. Generally the way something like this works is: the game is programmed on a PC, but it uses a cross-compiler to target the platform it's being built to run on, and the end result can be tested directly on the console (or on some kind of similar development kit). What they don't do is first develop the game to run on a PC for testing and then port it to the target console. This means there are no porting/optimization issues unless you're targeting multiple platforms (no quick fix for that one).
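As a rough sketch of that "one source tree, many targets" workflow (the TARGET_* macros here are invented for illustration; real console SDKs define their own, and the swap calls are stubbed out so this compiles anywhere):

    #include <cstdio>

    // Build once per target with a cross-compiler, e.g.:
    //   cc -DTARGET_XBOX360 game.cpp
    // Nothing is "ported" after the fact; each build targets its platform.
    void presentFrame() {
    #if defined(TARGET_XBOX360)
        // Console path: fixed hardware, a single code path to tune.
        printf("present via console swap call\n");
    #elif defined(TARGET_PS3)
        printf("present via PS3 swap call\n");
    #else
        // PC path: may need runtime checks for vendor/driver differences.
        printf("present via PC swap chain\n");
    #endif
    }

    int main() { presentFrame(); return 0; }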
Who cares what games look like anyway? The most important thing is how fun they are. PS3 games look great, but most of them are boring as hell and usually involve some kind of gun or sword or gunsword or swordgun (sometimes a chaingun or a chainsaw or a chainsawgun!). A lot of DS games are really fun and innovative, and they look really basic. The Wii also has way more good games than the 360 and the PS3 combined, and it has very basic hardware (many of them are Virtual Console games from even older hardware!). Minecraft just plain looks like shit, and it's a great game. Hardware is 95% irrelevant to how good a game is, so I guess it's kind of a worthless point anyway. Even if you're talking graphics, the quality of the art design goes a much longer way toward actually making the game look good than the resolution or the poly count.
I very much care how games look. You make a good point that the game should first and foremost be fun, but I like my games to look good as well.
Currently on PCs the CPU is the bottleneck. Sony somehow saw the future and knew that would happen, and because of it the PS3 has quite a bit more processing power than even the most powerful PC processor. A 6-core i7 maxes out at 107.55 gigaflops, while the PS3 is rated for 2 teraflops. That is a huge gap: the PS3 is capable of almost 20 times the processing power of the most powerful of PC processors.

I think he's referring to developing them FOR consoles, not ON consoles.

So, I went to play Arc Rise Fantasia on a Wii after reading this. I started up the game and was greeted with this amazing cinematic; I was thinking, hey, this isn't so bad. Five minutes later: gameplay. I'm sorry, I cannot play this. It's one thing when an indie game looks like crap, but a commercial game that looks like crap is a whole different story. Minecraft was successful because it was something new. However, most developers are scared to make something new, because there's a potential loss if people don't like it, so that is often left to indie devs who have nothing to lose. I'm sorry, but I would much rather play this: [screenshot] Over this: [screenshot]
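Taking the two quoted figures at face value, the arithmetic does check out; worth noting, though, that the 2 TFLOPS number was Sony's whole-system figure (CPU plus GPU), not the Cell alone, so this only verifies the poster's math, not the comparison itself:

    #include <cstdio>

    int main() {
        const double i7Gflops  = 107.55;   // 6-core i7, as quoted above
        const double ps3Gflops = 2000.0;   // "2 teraflops", as quoted above
        printf("Ratio: %.1fx\n", ps3Gflops / i7Gflops);  // ~18.6, i.e. "almost 20x"
        return 0;
    }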
This is why I'm citing the Xbox 360's age as the problem, not the PS3's. When the next-gen Xbox comes out, I'm sure we will see a jump in graphics across many multi-platform games. Also, this is what I have found regarding the PS3 hardware: the "RSX" GPU in the PS3 is literally a 7900GT at 550MHz with 8 ROPs disabled, and only 256MB of GDDR3 at 1400MHz on a 128-bit memory bus. The Cell processor is a custom IBM PowerPC CPU, and it's completely unlike anything we've ever seen, or likely will ever see, in a desktop PC. It's not necessarily faster or slower than what you could put in your desktop, just different. It uses the PowerPC instruction set, is an in-order processor, and is asymmetric. The main core, known as the PPU, is basically the same as one of the cores in the Xbox 360's Xenon CPU; the Cell in the PS3 also has 7 SPEs (originally 8, but 1 was disabled to improve yields), and 1 of the SPEs is reserved for the PS3's OS, so developers really only have access to 6 of them. If you were doing something that required a lot of branch prediction, any semi-current desktop CPU would beat the Cell. Where the Cell shows its power is in scientific applications, due to its massive floating-point performance, where it would outperform even a QX9770, because that's what it was designed for. If you take a look at the folding@home statistics, there are roughly 6 times as many active donors on Windows as there are PS3 donors, yet the PS3 group has a total performance rating a little over 5x higher than the Windows group's.
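Running those folding@home figures through as quoted (a rough sketch; real per-donor numbers vary with uptime and client mix, so treat the ratio as an order-of-magnitude estimate):

    #include <cstdio>

    int main() {
        // Rough figures from the post: Windows has ~6x the active donors,
        // but the PS3 group's total output is ~5x higher.
        const double donorRatio  = 6.0;  // Windows donors / PS3 donors
        const double outputRatio = 5.0;  // PS3 total output / Windows total output
        // Implied per-donor throughput advantage of a PS3 client:
        printf("Per-donor ratio: %.0fx\n", donorRatio * outputRatio);  // ~30x
        return 0;
    }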