Thursday, August 25, 2011

Taking Action and Playing Roles: The Skill/Luck Divide

RPGs are fickle beasts. On the one hand, RPGs purport to be the most character-driven genre of games, whether tabletop or electronic. On the other hand, RPGs offer one of the lowest levels of character control in gaming. The fact that most RPG systems are turn-based and/or tactical means that, rather than the player doing something as the character, the player is usually issuing an order to the character and expecting the character to carry it out. In an action game, the player is on some level directly doing something, even if it's highly processed through the game's controls. Reflexes and skill are involved more than planning and percentages.

"Skill" means something different with regard to the player character than it does with regard to the player. In a traditional RPG (which is to say, one built on rolling dice), a character's skill affects the likelihood of passing a check, but the dice still make the final decision. A character may have a better or worse chance depending on their skill level, but ultimately it's down to the roll. This ought to create an attitude of acceptance - the dice decide whether or not someone lives or dies, and that's that. Yet I find that it often does not, and this is largely because, again, RPGs purport to be primarily about character-driven narratives. How can a narrative be character-driven if said characters can die at any moment for reasons outside the player's control or influence? Hence, the divide.
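The split can be made concrete with a quick simulation. This is a minimal sketch, assuming a hypothetical d20-plus-modifier system (the function names and numbers are mine, not from any particular ruleset): skill shifts the odds, but the die still makes the call.

```python
import random

def skill_check(skill: int, difficulty: int, rng: random.Random) -> bool:
    """Roll 1d20, add the character's skill, compare to a difficulty.
    Skill shifts the odds; the die makes the final decision."""
    return rng.randint(1, 20) + skill >= difficulty

def success_rate(skill: int, difficulty: int, trials: int = 100_000) -> float:
    """Estimate the check's success probability over many trials."""
    rng = random.Random(0)  # seeded for repeatability
    return sum(skill_check(skill, difficulty, rng) for _ in range(trials)) / trials

# A novice (+1) and a veteran (+8) against the same difficulty of 15:
# the veteran succeeds far more often, but neither outcome is certain.
novice = success_rate(1, 15)   # exact odds: 7/20  = 0.35
veteran = success_rate(8, 15)  # exact odds: 14/20 = 0.70
```

The paragraph's point falls out of the numbers: no amount of skill (short of a hard floor or ceiling on the roll) ever removes the dice from the equation.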

In contrast, action-RPGs can include skill systems that naturally reflect a character's abilities. A character with more experience using guns might reload faster and aim more steadily; a character who's better at a mechanical or electrical skill might simply complete the job in a more timely fashion. However, these are a blend of "the character" and "the player". The limitations of most games mean that the things that the character influences are subtle things that the player does not directly control. The player hits R to reload, they don't actually go through the motions of removing the magazine and putting in a new one. The player holds down a button to hack a computer, they aren't expected to know the coding. Hence, the game becomes divided between "the player's job" and "the character's job". A theoretical game that was wholly player-based would have no room for RPG skills, because there would be nothing left for the player to influence.

I'll use an example scenario. Three characters are attempting to swim across a river. The first character is from a luck-based system and has a low skill level. The second character is from a luck-based system and has a high skill level. The third character is from an action game and has a variable skill level. The first two characters, despite their differences, are dependent entirely on luck; unless there are provisions or ceilings in place that say either the high-level character can't fail or the low-level character can't succeed, they're both equally vulnerable to a naturally high or naturally low roll. The third character may have an easy or hard time of it depending on their skill level, but it's the player's skill that carries the attempt - the character's natural ability just makes things easier or harder for the player. Because the player is contributing part of the skill, the outcome is that much less "character-based".
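To illustrate the "provisions or ceilings" caveat for the two luck-based swimmers, here's a sketch assuming a hypothetical house rule where a natural 20 always succeeds and a natural 1 always fails (the names, difficulty, and rule are my own illustration, not from any specific system):

```python
import random

def swim_check(skill: int, difficulty: int, rng: random.Random) -> bool:
    """1d20 swim check with natural extremes: a natural 20 always
    succeeds and a natural 1 always fails, no matter the skill."""
    roll = rng.randint(1, 20)
    if roll == 20:
        return True
    if roll == 1:
        return False
    return roll + skill >= difficulty

def crossing_rate(skill: int, difficulty: int = 15, trials: int = 100_000) -> float:
    """Estimate how often a character makes it across."""
    rng = random.Random(1)  # seeded for repeatability
    return sum(swim_check(skill, difficulty, rng) for _ in range(trials)) / trials

# Even an absurdly skilled swimmer drowns on a natural 1 (~5% of attempts),
# and even a hopeless one gets lucky on a natural 20. Both answer to the die.
low = crossing_rate(0)     # exact odds: 6/20  = 0.30
high = crossing_rate(100)  # exact odds: 19/20 = 0.95
```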

Now, naturally, I'm making it sound like the player doesn't do anything in a turn-based RPG, and that of course isn't true. The player makes tactical and social decisions; it's simply a smaller set of responsibilities, and generally players dislike it when those things are taken from them. For example, many RPGs have a charisma stat and social skills of one kind or another, but the player generally expects to pick what is said. While a lot of the "charisma" process can be chalked up to minor but important details (body language, tone, visible confidence and self-esteem), the player expects to be in charge of the major decisions regardless of the difference between their own charisma and their character's. This becomes even clearer when talking about intelligence or wisdom, since a player can't simply will themselves into being smarter or wiser than they actually are.

In essence, skill tests are divided between the player and the character. If the action requires manual intervention, it's the player's job. If not, the character takes care of it. The more control is given to the player, the less important the character is. A true "character", if such a thing were possible, would be an autonomous individual with their own skills and abilities. Certainly a wise sage, a veteran soldier, or an experienced thief should handle their own jobs better than some fumbling player, and freed of the constraints of the player they ought to make better decisions. Yet the player must play a role, and this is a conundrum I've discussed before: where should the player end and the character begin?

Now I'm going to try to bring this back to one of my earlier points, to wit, the nature of failure and death in an RPG. RPGs are designed around the idea that one player plays one character, which contrasts with wargames and tactical games where the player has many expendable or semi-expendable subordinates. Loss in such games can be handled acceptably, because the unit can continue while the individual does not. Players may not be happy about a character's death, but the game goes on regardless. This applies to TV shows as well; characters died or were wounded in Band of Brothers, but the show was about the unit and thus the story continued. RPGs are stories about individuals (albeit multiple individuals grouped into loose affiliations), and if an individual dies permanently, that individual's story is over.

I guess what I'm trying to get at is that I'm surprised how many "role playing games" still center on the assumed survival of the individual and the related focus on long, winding, relatively linear story paths. Death and failure are part of a story, and yet in order to reliably complete the stories laid out by the developer or the GM, those things must be ignored or marginalized. This isn't to say that action games don't have failures as well, but a failure in an action game is usually the fault of the player, and not just bad luck. No matter how you skew the odds - larger dice, bell curves, dice pools - failure is going to be inevitable, and there's nothing the player can do about it other than hope it doesn't happen in a critical scenario.
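That claim about skewed odds can be checked exactly. Here's a sketch (helper names are mine; the mechanics are the generic ones named above: a flat d20, a 3d6 bell curve, and a count-successes d6 dice pool) showing that each still leaves a nonzero chance of failure:

```python
import itertools

def flat_d20_fail(target: int) -> float:
    """Chance of rolling below `target` on a single d20 (uniform odds)."""
    return sum(1 for roll in range(1, 21) if roll < target) / 20

def bell_3d6_fail(target: int) -> float:
    """Chance of totaling below `target` on 3d6 - a bell curve, so
    middling results are common, but the tails never vanish."""
    rolls = list(itertools.product(range(1, 7), repeat=3))
    return sum(1 for r in rolls if sum(r) < target) / len(rolls)

def dice_pool_fail(pool: int, needed: int = 1, success_on: int = 5) -> float:
    """Chance of getting fewer than `needed` dice showing `success_on`
    or higher out of a pool of d6s."""
    total = 6 ** pool
    fails = sum(1 for r in itertools.product(range(1, 7), repeat=pool)
                if sum(die >= success_on for die in r) < needed)
    return fails / total

# All three mechanics leave a real chance of failure on a mid-range target:
# flat_d20_fail(11) and bell_3d6_fail(11) are both exactly 0.5, and a
# three-die pool needing one success still comes up empty about 30% of the time.
```

The shape of the distribution changes how often failure happens, never whether it can happen.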

When we're talking "believability" in this scenario, the idea that characters simply can't die isn't going to enter into it. Once predestination and the assumption of safety are brought into a game's story, the game is guaranteed to either be rigged or to be derailed. Rather than building the story as it progresses based on pre-existing tools, characters, and events, the "do all these things so you can unlock the next part" approach means that the risk of un-removable failure (which is a major part of life, to be frank) simply can't exist. I'm not going to try to suggest what should be done about it; I'm just going to note that having one character in a combat-intensive scenario is basically putting all your eggs in one basket, with no way to stop the basket from breaking other than hoping really hard that it doesn't.

Saturday, August 6, 2011

The World-Building Process

It's not easy to make a new setting or series. There's a lot riding on a combination of familiarity and ingenuity, and the author or designer is tasked with creating a scenario that's recognizable enough to be easily comprehensible or tangible while also being distinct enough to be remembered as its own concept. The process of designing a world starts with an idea, and then that idea solidifies based on the components and processes that will end up creating the setting.

When it comes to settings I've liked and disliked, though, I've been able to draw a clear difference in terms of cause-and-effect. The settings I've liked have built up their aesthetic and their plots through things that made sense; the settings I've disliked have needed to find a place for their main plot, and thus everything else was just sort of thrown together to make way for it. A proper setting should be able to exist logically on its own, without requiring the main character to do everything and change every aspect of the world.

Making A Theme
There are two kinds of themes that are important when talking about world-building, and I've talked about them both before. The first is conceptual or authorial, and the second is the key component of the universe. The former refers to the way the setting is treated narratively - high or low fantasy, hard or soft sci-fi, and so on. It determines the nature of internal consistency in the setting. Deciding on this theme is a vital step when moving forward; for purposes of believability, though, the most important aspect of it is deciding how certain things in the setting work (physics, for example) and then sticking with it through the rest of the world-building process. The latter refers to a central theme or concept of the work, or something around which the setting revolves. I've discussed it before at length in the previously linked article.

The reason that this is the first step, apart from the natural concepts of establishing themes, is that if you're building a setting believably (based on logical properties and processes), then it's reasonably helpful to know what the basic properties are before you start acting upon them. Rules make up a dynamic, and dynamics are part of the collection of traits that define a setting. Star Wars treats space combat in a very specific way (dogfighting small ships, large capital ships). If that was changed, and it was treated more realistically, the setting would be different. It's not "realistic", but it's "internally consistent". If you're going to have a break from reality, and there's certainly nothing wrong with that, make it a mechanic that influences the development of things in the setting. The same is true for things like magic; codify the way it works before you start building the setting, so you can have a logical path of development within the setting itself.

In essence, the point of this step is both to be creative about coming up with a new setting and to lay the foundation for the rest of the concept. If you go about it the other way, making a bunch of stuff and then trying to fit the concept in, it's going to end up a bit more arbitrary. If you want to include certain features or aspects as being central to the campaign, put them in first and build everything else around them, because otherwise it's going to feel totally awkward whenever they get mentioned.

Building It Up
So you've got your ideas. You've got your main concepts. You've decided what you want as the primary features in this campaign, and you've basically taken your first steps into making something. What's next? Well, if we're talking verisimilitude, you just do what makes sense. You take your base setting, or your base world, or your base concept, and you add in people and animals and whatever else you need, and you see where it takes you. If the things you've introduced as being "different from reality" are consistent and regulated, then the effects they have on the world should be equally regulated.

Say you want to make a fantasy setting, centered on the artificial introduction of magic (a la Chrono Trigger and Lavos, I guess). If we're assuming that this is a world that was previously Earth-like (or at least played host to humanity), then the time before the introduction of magic ought to be sensible. Use Maslow's Hierarchy for guidance, and just figure out where people got their food, shelter, tools, and so on from. Then, once magic comes into the world, apply whatever rules and regulations you made for it in the last phase and apply it to the world. How does magic work? Can anyone use it? What can it do? If it only comes to a select few, what's stopping those few from becoming powerful rulers and sorcerers? If everyone can do it, how does it affect the world at large, and the development of technology? If magic is commonplace, would the conventional methods of war that we understand in real life be effective - or, to phrase it more directly, does it really make sense for people to use swords and shields in a world of magic? And if it doesn't make sense, is it at least thematic to your concept?

In the past I've talked about the evolution and development of warfare in games. The binding thread of that and a lot of other articles is that, bar the interference of personal tastes, people generally do what makes sense based on the systems available to them. What's thought of as "min-maxing" in a game makes perfect sense in-universe, and is generally only objectionable when the system being exploited doesn't make sense in-universe. If you're going to include a system or a concept in the game, then have people treat the concept logically. If they don't treat it logically, you can justify it with cultural or religious values, but don't forget to, you know, still have things make sense. If a character chooses not to wear armor in a setting where armor protects you, they don't get protected. If a character tries to use a basic spear in a magic-heavy setting, that character is most likely getting fried. Make things consistent.

The benefit of doing all this is that the setting is going to become "nested", or whatever you'd like to call it. What I mean is that the setting is going to be layered in such a way that all its components are intrinsically linked and identifiable. No part of it will just sort of "be there"; if it's assembled properly, all the pieces will be connected to both the major concepts or components of the setting, and will also be connected to the other pieces of the world. The more airtight the world is, the less it's going to feel like a grab-bag of random concepts. If you interlock everything, you can't pull a piece out without dragging the rest of the setting with it. It provides explanations and justifications for things that happen, and that gives depth to the world.

A lot of gamers and developers and writers seem concerned about the idea of not just doing the "same old stuff", which is to say standard Tolkien-derived fantasy, or standard Star Wars-derived sci-fi. This is usually because they're talking about things in reference to other series or settings or works, and not in reference to things that make sense in the environment. They're talking about re-using things that people have already done, but doing it in a different order: elves do x, dwarves do y, halflings do z, but there's no reason for them to exist other than "I wanted them in my setting". And yeah, eventually you might have to do that - you can't really be expected to build everything up from a cellular level - but justifying things and having them make sense feels more grounded and acceptable than just saying "that's the way it is", and it helps people connect with the concepts you're trying to use.

Including The Players
If you're making a game - whatever genre, whatever medium - at some point the players are going to have to take a role in the world you've built. Since players are generally not inclined to fill "safe" roles, jobs, or careers, you've got to find some way to give them something exciting to do. In games like D&D, the players are usually outside the system - the NPCs live over here and do boring things, while the PCs have their own distinct classes that are objectively better than anything NPCs get, and they don't have to worry about things like economic inflation. The "NPC" world exists as a vague backdrop to the "PC" world of hacking and slaying and looting. However, it's totally possible to make a world where dangerous pursuits can be believably included in the setting as a whole.

For example, in real life, dangerous pursuits are often motivated by specific prizes that cannot be easily gathered through simple labor processes. This is to say, there has to be a reason a spunky group of ne'er-do-wells is out in the wilderness looking for the resource, rather than a group of workers with sound financial backing. The classic image of adventurers being motivated by gold is based on the rarity of gold in real life, and the equal rarity of finding it in the wilderness or in some old ruins. In most games, the simplicity of delving into a dungeon and finding more stuff ought to water down the value of what's being found, but for the players' sake this topic is avoided.

Conversely, there are some fictional settings that have something that is relatively common and necessary, but is always dangerous to acquire. These include the artifacts from STALKER, the titular souls of Demon's Souls, and thermal energy from Lost Planet. In these settings, there's a combat-based career for PCs to pursue that justifies the nature of the adventuring party. Even in more traditional or generic settings, including things like enemy-based resources (such as harvesting body parts from slain monsters) gives a reason for the players' role in the ecosystem. It gives the player a role besides just "doing gameplay" and it adds to immersion when you're part of a logical chain of supply and demand.

To me, this is a vitally important part, but it's not that different from making the setting as a whole logical. The world is a system. Everything feeds off of something else, and everything provides food for something else. Farmers and miners and smiths and scientists and politicians all need the fruits of each other's labors, because what an individual can do by themselves is limited. The world needs to work, and to show that there needs to be some kind of chain in place. If the player never got involved, would the world still work? Conversely, if the player was involved, what sort of role would they play? What's a logical way for them to interact with the setting without just being thrown into it and treated differently from everyone else?

The world works. The world works the way it does because there's a trillion little systems and subsystems that also work. Plants work, animals work, people work, and they all work in relation to each other. Technology is the utilization of the rules of reality to benefit humanity; "this produces a consistent result, let's keep doing that". All the different parts of the world and all the different aspects of science - chemistry, biology, meteorology, whatever - are just different systems that form the larger system commonly perceived as "reality". That's development. Things happen, and they affect other things. The other things affected also affect other things, and onwards and onwards.

When you're making a setting, do that. Sure, you've got to have a starting point that's kind of abstract (and it's not like we're exactly 100% clear on where reality came from or why it works the way it does), but once the rules and the world are in place, work with it. Things happen that make sense. It might take some research, but the questions you have about what happens should come naturally. "How does this work?" "How do these things form?" "Why did this happen?" When you figure out why it happens in reality, you can figure out how it would work in a world that had magic, or a world that had monsters, or a world where a certain technology exists. Figure out the rules, and then figure out how people would play the game.