It really is a subjective matter, coming down to preference, price range, and prior experience, rather than either format being objectively superior.
Personally speaking, my preference varies from game to game, heavily dependent on the number of buttons the game uses and how they're associated with one another. Every time I pick up a console version of Street Fighter, I'm frustrated by Weak and Medium strikes being on the controller's face while Fierce strikes are on the shoulders--it just feels wrong. On the other hand, a six-button game with four attack buttons plus, say, a "run" button and a "block" button works fine for me on a controller.
In the past, I actually preferred the PS2 controller for Tekken IV-V. I can't imagine I'd be able to put up with a PS3 or 360 controller, though--the analog triggers reduce the number of viable button inputs by 2, and for Tekken, I'm quite fond of using the spare buttons to map multi-button inputs that I have trouble hitting simultaneously. (I feel kinda like I'm cheating when I do that, but I fight better, so I get over it.)
I'd still love to get a fight stick, though. Not because I play a lot of fighting games, but because fight sticks have been getting really nice lately, and I'd like to have something high-end and well-crafted for my gaming systems. (I'd probably use it mostly to play Pac-Man: Championship Edition, but still.)
"
Every time I see people complain about the platforming mechanics of Psychonauts, I'm astonished--every time I've played it, it's been on PC with keyboard/mouse controls, and I think it's actually one of the tighter, more responsive games I've played, control-wise.
I wonder what platform the people who complain about the controls are playing it on. Xbox? PS2?
"
I call foul. False clues.
Batman: Arkham Asylum.
It follows through pretty well on 3 of your 4 criteria--no guns, the stealth sections are designed to be played as stealth sections, and it's not overly punishing when you do get caught. It even does a good job of relieving the tension of stealth sequences by alternating them with unarmed guard brawl sequences.
If you haven't played it before, give it a try--I think it'd be right up your alley. If you do, I'd love to hear your impressions.
That said, I think games becoming "acceptable" art will benefit not only games, but art critique in general. Compartmentalization isn't a bad thing, and games make it much easier to draw the distinction between "enjoyment" and "quality."
It's perfectly acceptable to say you sunk 15 hours into "Babiez 4" because you enjoy the mechanics of a virtual pet game, or to say that you really don't like God of War because you just couldn't get anything out of the combat mechanics. Hopefully, instead of that personal-response compartmentalization going away, it will spread into criticism of other mediums--being able to say you don't like The Office because you have trouble watching awkward humor, or that you really enjoyed Transformers: Revenge of the Fallen because the CG was pretty and the robot fights were fun.
There will always be assholes who think their opinion is the only valid opinion. Hopefully, games will help everyone else figure out that "I enjoyed it" and "it was good" can be two completely different things.
A movie, in most cases, is conveying a narrative. There's a story to tell, focusing on various characters and how they relate to each other--and in good movies, how they grow. The viewer's emotional response is generated via empathy for the characters. Movie sequels fail when they mistake a chain of events (thing A happens, so thing B happens, which makes thing C happen) for a story (Hero's Journey, or Falling In Love, or Fall From Grace) or leave the characters as static caricatures because their character arc was completed in the first movie.
A game, on the other hand, is about conveying experiences. The player's emotional response is generated from their own actions. When the main character in a game does something smart, it's the player who feels clever for having done it, rather than feeling impressed with someone external to them. When the main character is in danger, the player feels the adrenaline directly, rather than the stress of wondering what's going to happen. When the main character is exploring, the player gets to examine a gorgeous world rather than see only the bits of it a filmmaker thought were important. Unless a character is part of what makes the experience work, like solving puzzles or fighting alongside Alyx Vance in Half-Life 2, characters in games are a world-building technique rather than a defining characteristic.
Game sequels work for the same reason that games are replayable--even if the story is the same, the experience is still different. When replaying a game, it's subtle differences, like enemies cropping up in slightly different places, or missing a jump you made the previous time. It's enough to make sure the emotions are still drawn out by the experience. In a sequel, there's even more variety--new environments, new encounter designs, and new surprises to elicit similar emotions in new ways.
Of course, that's not to say that games and movies don't have any overlap. The climax of Bioshock is identical to the climax of The Usual Suspects--both consist of the viewer/player realizing they've been led along the entire time, forcing them to immediately re-evaluate their assessment of what they've seen/done over the course of the narrative/experience. It's a highly effective technique and, as far as I'm concerned, the key reason both titles are regarded so highly.
1) Game developers are busy as it is. Every branch point requires twice as much content for everything that follows it--more, if there are more than two options to choose from. As it is, most branching pathways result in either an immediate reward or an immediate punishment and no further acknowledgment. Occasionally, there will be a small sidequest that gets locked off, or a character that will react to you differently, but nothing related to the main plot is affected. That makes perfect sense--developers spend years working on games, and it's stupid to prevent gamers from seeing the content they've worked so hard to create.
Ideally, a game would allow every action to have a consequence--for example, if you could protect Uriel Septim VII from his assassination at the beginning of Oblivion instead of him being automatically killed off--but that ultimately means the designers have to design *two* main quests. And then two more for each of *those* main quests for every game-altering decision made at the next step. And so on. Pretty soon, the designers have thousands of different questlines to design--just ten binary, game-altering decisions already means 2^10 = 1,024 possible paths--and any one player will only see a small fraction of them. Plus, with so many scenarios, it winds up being impossible to test, iterate upon, improve, and polish any individual experience. The decision regarding whether or not to emphasize consequences can easily turn into "Is the gamer's experience the same as their friend's but really good, or different from their friend's but not particularly fun?"
2) Gamers are systematically trained to solve puzzles. Many view the game itself as a puzzle, using the term "solve" to describe completing a game. Even combat is a puzzle--it's a combination of a strategy problem, a resource-management problem, and (in many games) a reflex test. It's no surprise that, when presented with an ethical dilemma, gamers will view it as a puzzle--"How can I get the most out of this situation?" After hours of seeing failure as something that results in death, and success as something that results in a reward, the ethics of the situation are ignored in favor of the inevitable reward the game developers will offer for a "right" answer. Karma systems are an attempt to solve this issue, but they fail miserably, turning "good" and "evil" into merely two strategies for min-maxing.
3) Games are long. As a gamer, if I know I'm going to be playing something for 20 hours, I want the experience to be a good one. I don't want to feel like I'm missing out on what could have been a *better* experience. If I don't choose the "right" option, or I do something that causes the game to cut me off from part of the experience, I feel like I've screwed up. I feel like I'm playing the game wrong. Many dedicated, hardcore gamers go back and play it again the "right" way. Slightly-less-dedicated ones save and retry until they get a result they can live with. I'm even less dedicated than that. When I sit down with a Bioware game or a Bethesda game, I feel compelled to pull out a FAQ and map out what I'm supposed to do, so I can play it the "right" way the first time and not miss anything. That process makes the game so un-fun that I usually get too stressed out to continue past the halfway point.
Ultimately, I see a two-part solution.
First, games that want to explore consequences need to be short enough for players to finish quickly and replay, both for the sake of the developers' workloads and for the sake of the players' mental health. Make a game that's meant to be played several times, with a completion time that's reasonable for a single sitting. At that length, it's reasonable to present someone with a completely different experience depending on whether they go left or right, and a gamer won't feel compelled to min-max it so they have the "right" experience--at least, not the first time.
Second, the choices and the consequences they carry require nuance. Instead of making the "right" choice rewarding, or offering "good" points for the "good" choice and "evil" points for the "evil" choice, make me experience conflict. Have "good" require sacrifice on my part rather than reward me. Make "evil" hugely rewarding but guilt-inducing. Put me in a situation that makes me understand why good people do bad things. Don't necessarily make choices backfire on me, but make them not work, or turn out differently than I expect. If the choices stop feeling like a dialog puzzle to be "won" or "lost," perhaps gamers won't treat them so much like one.

@awa64
http://twitter.com/#!/awa64/status/45589360984330240
"