You totally missed my point, on a console you don't need to worry about performance and graphics settings. You buy the game and you know it works full stop.
I know *what* you were getting at, but I was getting at something else... Let me go a bit further and explain.
What you describe above is the advantage of a console -- especially if a game is only being developed for ONE console. So you can expect a game like GoW, built for ONE system, the X360, to really take advantage of that one set of hardware.
Things can get quite flaky when a game's being multi-platformed at the same time for multiple consoles and/or the PC. This is what I was getting at. I should've explained what I meant earlier, but I'll do it now.
If a console game is "botched" performance-wise, then it's just a poorly designed/ported/coded game. This is not the fault of the hardware, be it a console or a PC, and scalable settings don't fix this problem.
Regardless of that, I shouldn't have to be stuck w/ what the designers give me if the game isn't going to be up to snuff on the owner's set of hardware -- yes, especially on a "botched" port from PC to console. And/or if the game is poorly coded. And/or if the designers force the game out the door WAY too early -- see Quake 4 on the X360 and also Driver 2 on the PSX for my point.
Okay, so let's look at Driver 2 on the PSX, which was made ONLY for that system, period. This'll be an interesting example b/c this game performed badly despite being made for only one system.

Driver 1 looked great and ran great for its time. Did the devs really need to upgrade the graphics that damn much for Driver 2, enough to mess up performance that majorly??? Hell no. Graphics ain't everything. Driver 2 looked great upon its release, but the framerate is ALWAYS all over the place. It runs fine, then suddenly slows down -- repeat the process over and over; rinse, lather, repeat. The designers should've bitten the bullet and dropped the graphics quality to get it running better. But they didn't -- they fell in love w/ how great it looked. Now, if I could've scaled the graphics quality for performance so it looked more like Driver 1, or not too much better than Driver 1, maybe Driver 2 would've been much more playable than what the hell it was.
On the other hand, I've never heard of a console game that suffered from poor performance -- Q4 for the 360 is the first, and I know nothing about such issues beyond this thread -- but just because one game was "botched" doesn't mean all console games should be scalable.
I think scaling graphics quality on a console game b/c it's being developed for multiple systems would make a lot of sense.
When a port looks like it's going to be "botched" during development -- b/c the designers are forcing it out the door, and/or the publisher is forcing it out due to time constraints and contract agreements -- then making the game a lil' more scalable, graphically and performance-wise, should at least be in consideration.
Put it this way - why on God's green earth would I want to scale back the graphics in Gears of War when the console can run it the way it was designed to? (By the way, I never had any slowdown or performance issues -- none, zip, zero.)
As far as I know, that game was not riddled w/ performance issues. And if they knew it was going well during development, would it be a big deal??? No. They're only making the game for that *one* system, anyways. So that'd be kind of hard to screw up. Though it has been screwed up before; usually, that happens when making a sequel. Driver 1 to Driver 2 is a prime example. For the X-Box versions only, KOTOR 1 to KOTOR 2 would be another, as from what I know, there were more framerate issues in the sequel than in the original. (That's Obsidian for you....)
Scalable graphics usually wouldn't be needed if they're developing a game for ONE particular set of hardware -- namely, they were building this for the X360 ONLY when they were putting out the X360 version of that game.
The eventual PC port will have heaps of considerations and req's, like DX10, graphics card memory, whether the game requires Vista or not, and whether you need more RAM.
Agreed.
None of this was a factor on the 360.
B/c that game was exclusive to ONE system: X360. Nothing else.
GoW was not being multi-platformed to other hardware -- such as the PS2, PS3, Wii, PC, or anything else.
If a game is being multi-developed for a PC and a console, this is often where I believe the "botching" can take place, performance-wise. Or, say, a game is being ported from a more powerful set of hardware (usually a PC, since PCs are on the bleeding edge of tech) to a weaker set of hardware (usually a console) -- like porting a PC game built for the newest hardware to the PS2, GC, or (original) X-Box.
The point is, D, all things on a console are equal; there is no variance.
B/c the hardware is fixed. Though issues can still arise from developing a game for multiple systems.
On a PC there's tons of variance, and the scalable graphics and settings etc. accommodate that variance, so there's no need for scalable settings on a console.
That point's definitely taken.
An interesting and great thread we got goin' here, Jedi, as far as D is concerned.
*high 5*