Well at least more so than I thought.
Today I went to the Mountain Dew & Xbox 360 Game Live college campus tour at my school, and it felt like a miniature E3 – games were buggy, there was a DJ, hot girls asking players if they wanted to enter a contest, etc.
I went around asking students what they thought of the games they were playing – Splinter Cell 4, Eragon, Madden 07, Need for Speed Carbon, FlatOut 2, and DOA 4 are the ones I remember being there. Players genuinely weren't impressed with buggy games. Buggy as in buildings flickering between textured and non-textured. Crap like that won't sell a game, so why in the world would any developer show it to the public?
It's different when you show the media a buggy game. They understand that it might only need a few tweaks to fix the broken parts, but do consumers understand this? Somehow I doubt it. Seriously, does any consumer other than geeks understand how games are developed?
What's really sad is that consumers are forced to put up with mediocrity. Have you been in a game store recently? Thousands of games on the shelves – how are they supposed to know which one is better than another?
Do they know the hundreds of developers and publishers behind those games? Obviously not! They may know the top eight names – Microsoft, Sony, Nintendo, UbiSoft, EA, Activision, Midway, Atari – but after that…
There's a reason so many game companies fold: consumers are confused. No matter the company or the franchise, consumers aren't safe from buggy, low-quality games. So I ask: what will make them buy $60 games whose quality hasn't been rigorously assured?