Relative graphics, immersion and gameplay
Shiny and realistic-looking graphics are great and can really add to the experience and immersion of a game. However, one shouldn’t assume that good graphics always means shiny and realistic-looking. In fact, I would say that graphics need to be good at what they do, i.e. designed for the task at hand, rather than optimized for looks by default.
This is Minecraft:
This is Wurm Online:
They serve as a good reference point for this post for several reasons: both are considered somewhat indie, and Minecraft is solely developed by Markus “Notch” Persson, who was also a founder of and long-time coder on Wurm Online. Both are written in Java, aimed at an online audience, and share similarities in world-shaping abilities, construction, resource gathering and so on.
There are two things in which they are vastly different, however: level of success and style of graphics. Minecraft’s world is built from square blocks of bright pixels. Wurm Online has that staple “tried and failed” 3D look, even when running on the highest settings possible. Minecraft is also hugely successful, netting Markus Persson not only millions upon millions but also the chance to found a game development studio where his now quite evident talent can hopefully be put to great use. Wurm Online, on the other hand, has been out for years upon years but has never really made it past its niche crowd of sandbox gamers.
I will not argue that the success of either game is based solely on its style of graphics, but I will argue that despite Minecraft’s simplistic approach to graphics, it still looks better than Wurm Online’s attempt at decent 3D, and probably took a whole lot less time to create (not to mention far less processing power to run on a modern-day machine). This, I like to think, is because simple graphics that get the job done are better for immersion and gameplay than advanced graphics that aren’t quite there. Lack of anti-aliasing isn’t a problem in a world of cubes, so to speak.
Using the right style of graphics for a certain job doesn’t only relate to the player experience. It’s also something a developer should be interested in for their own gain. Let’s say Markus Persson had an idea for a game that he really wanted to create, where the focus was the gathering of resources and the use of those resources to construct things. While my experience with game development is fairly limited, I know enough about games to know that creating 3D models, textures, physics and so on for a game is not something to be done over a weekend. So what if the parts of the trees that aren’t cut down float in the air? It will only be a big issue if you attempt to give the player the experience of being in a completely realistic world to begin with. I would say that the breaking of immersion isn’t mainly caused by the “this isn’t something that could happen in real life” experience, but rather by the “this isn’t something that should happen in this game” realisation. Indeed: the problem isn’t reality, but rather the failure to deliver the experience you have implicitly told the player they would have while playing.
When you read a book, the story is key. This story is crafted with words. When you play a game, the key is gameplay. The term is rather vague, and is more or less the equivalent of the term “good” in the field of ethics, i.e. that one word that rests at the base of all other values, yet is itself heavily contested territory. I think the problem with the term gameplay is that people have tried to define it as one thing, e.g. “that feeling you get when you overcome a challenge” or “the joy of playing”. However, if we look at the book analogy once more, we can see that its key to success, the concept of a story, is just as vague and ever-changing. Try defining what a good story is and you quickly realise that it all depends on the writer, the reader and the experience that the book tries to deliver. A quirky epic is probably bad, while a jovial rendition of a tragic story can be hard to accept as a reader. Mere examples, of course, but the point stands: different stories want to accomplish different things, just as different games want to deliver different gameplay. Trying to find a common denominator between the action-packed and dynamic gameplay of a game like Crysis and the time-bending, heavily story-based gameplay of the well-deserved indie hit Braid is pointless at best, and at worst runs the risk of creating a heavily reductionist view of games in general.
So let’s get back to our case in point, the graphically challenged lovechild of indie gaming and Farmville: Minecraft. It doesn’t strike me as the type of game that aims to deliver tightly packed action. Nor does it want you to gape at the amazing sunrise over a field of freshly cut wheat while you practice your spellcasting in the crisp morning air (to be honest, some of the creations in Minecraft are so amazing, I’d be surprised if this very setup doesn’t actually exist… but I hope you can see my point despite this). Minecraft aims to sate the creative (and megalomaniacal) streak that most humans seem to harbor. It’s more akin to Populous and Theme Park than World of Warcraft. Given enough time, I’m pretty convinced some Minecraft players will raise the level of abstraction and construct advanced games within the game itself.
But this all raises the question: would Minecraft be an even better game if it had all of the things above and stunning visuals, without any of the added demands on the client computer? To be honest, I’m really not sure, but the question is also somewhat moot: it’s a fact that the higher you set your lowest visual bar, the more people you exclude from playing your game. Until voxel graphics make it big, I really can’t see Minecraft needing to change graphical direction for any good reason. Wurm Online, on the other hand, will probably truck on, in essence hampered by its too highly set bar in the graphics department. A sprite-based game like (classic) Ultima Online delivers infinitely more immersion in my eyes.