Friday, May 31, 2013

Taming Attack Routines

Just saw this post over at Talysman's blog, which reminded me of the problem: do you really want to roll eight attacks for a carrion crawler's eight tentacles? Simple, abstract combat is something of an "ideal" in old-school circles after all, and that's a lot of rolling.

I think Talysman's rule is too fiddly (no offense!), and it occurred to me that there's already a common "house rule" that could be extended instead: lots of people seem to give characters who attack with two weapons (aka "dual wielding") a +1 on their to-hit roll. (I don't want to debate whether that's a "realistic" approach; let's just take the mechanic at face value.)

A direct application of this to the carrion crawler may not be such a good idea: they'd get +7 on their to-hit roll and that seems way too much of an advantage. (Actually it may not be, but I didn't want to run the probabilities and just went with "gut feeling" instead.) But we can go with an "exponential" system to "soften the blow" as it were:

# attacks | to-hit bonus
1         | +0
2-3       | +1
4-7       | +2
8-15      | +3

Seems fair to me, and the math is nice too because the bonus is just the logarithm (base 2) of the number of attacks. (I should get extra credit for that!) But some may complain that this way ghouls with their 3 attacks are not scary enough anymore: a measly +1? Tweaking the cutoffs a bit differently we get this:

# attacks | to-hit bonus
1         | +0
2         | +1
3-4       | +2
5-8       | +3
9+        | +4

The math is now off, but hey, your beloved ghouls stay relatively scary while at the same time carrion crawlers are not completely "off-the-charts" either. And I don't even know of a monster with 9+ physical attacks, so the +4 probably never applies at all? Anyway, just my $0.02!
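For the curious, the "exponential" rule is a one-liner in Python. This is my own illustrative sketch of the first table's rule (the tweaked cutoffs in the second table are hand-picked and don't follow a formula):

```python
def log2_bonus(attacks: int) -> int:
    """'Exponential' to-hit bonus: floor(log2(number of attacks))."""
    # bit_length() - 1 is an exact integer floor(log2) for attacks >= 1,
    # avoiding any floating-point wobble from math.log2().
    return attacks.bit_length() - 1

for n in (1, 2, 3, 4, 8):
    print(f"{n} attacks -> +{log2_bonus(n)}")
```

Note that a ghoul's 3 attacks land in the 2-3 bracket, hence the "measly +1."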

Edit (aka Actually Doing the Depressing Math): Carrion crawlers have 3+1 hit-dice and 8 attacks in B/X. Let's say one of these beasties is fighting a guy with AC 3 (plate mail) and no other bonuses, so it needs a 13+ to hit. Rolling 13+ on a d20 means a 40% chance to hit. Using (hopefully correctly) the binomial distribution we find that the chance for no hit in 8 attacks is only 1.7%, meaning the chance for at least one hit is 98.3%! (The most likely outcome is actually 3 hits, which has a 28% chance.)
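For those who want to check the binomial figures themselves, here is a small Python sketch (my own illustration, not the original calculation):

```python
from math import comb

def hit_distribution(p: float, attacks: int) -> list[float]:
    """Binomial distribution: chance of exactly k hits in a full attack routine."""
    return [comb(attacks, k) * p**k * (1 - p) ** (attacks - k)
            for k in range(attacks + 1)]

# Carrion crawler vs AC 3: needs 13+ on a d20, so p = 8/20 = 0.4, with 8 attacks.
dist = hit_distribution(0.4, 8)
print(f"no hit: {dist[0]:.1%}")            # 0.6**8, about 1.7%
print(f"at least one hit: {1 - dist[0]:.1%}")
print("most likely outcome:", max(range(len(dist)), key=dist.__getitem__), "hits")
```

The same function with p = 0.3 and 3 attacks reproduces the ghoul numbers below.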

That's rather sobering and sort of what I feared when I said "it may not be" in parentheses above: using the +7 suggestion the carrion crawler would make a single attack roll for which it needs 6+ on a d20, a 75% chance, a far cry from the 98.3% of actually rolling 8 attacks. It would really need a +12 bonus; then it would have at least a 95% chance to hit.

Now let's do this again for a ghoul, just to see what's what. Ghouls have 2 hit-dice and 3 attacks in B/X. Again assuming an opponent with AC 3, a ghoul needs a 15+ to hit, a 30% chance. (This is actually small enough to run a simulation to double-check our math.) The formula comes out to 34.3% for no hit, meaning 65.7% for at least one hit. Translating that back into an attack bonus, we should give a ghoul +7 or +8 for its attack routine.

And what about a character using two weapons? Assuming we use the same reasoning, that there are really two attacks but we abstract them into a bonus for a single die roll, what should that bonus be? Let's say a level 3 fighter attacks a carrion crawler with two weapons. The carrion crawler has AC 7, so our fighter needs a 12+ to hit, a 45% chance. (Here's the simulation.) That's a 30.2% chance for no hit and a 69.8% chance for at least one hit. So instead of a +1, the character should get a +4 or +5 attack bonus for attacking with two weapons.
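The simulation linked above is external; a minimal Monte Carlo stand-in (my own sketch, not the original script) reproduces the two-weapon numbers:

```python
import random

def chance_of_at_least_one_hit(p: float, attacks: int, trials: int = 100_000) -> float:
    """Monte Carlo estimate of landing at least one hit in a multi-attack routine."""
    rng = random.Random(0)  # fixed seed keeps the estimate reproducible
    successes = sum(
        any(rng.random() < p for _ in range(attacks)) for _ in range(trials)
    )
    return successes / trials

# Level 3 fighter with two weapons vs AC 7: needs 12+, so p = 9/20 = 0.45.
estimate = chance_of_at_least_one_hit(0.45, 2)
print(f"{estimate:.1%}")  # should land close to the exact 1 - 0.55**2 = 69.75%
```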

In summary, all these bonuses are seriously flawed approximations. I'd contend that you can still use them, but you should be aware that you're paying for the convenience of rolling fewer dice with (overall) much lower chances for a successful attack. Or the other way around (for the min-maxers out there): If you can get an actual second attack, you should almost always take it instead of taking a constant bonus.

Wednesday, May 29, 2013

The Joys of Proprietary Software

So Johns Hopkins has a new identity, meaning new unified logos for all of the different parts of the university. If you're expecting a rant about how I don't like the new look, fear not: I like it just fine! Personally I am not sure that this was money well spent (not that I actually know what exactly was spent, but it certainly wasn't cheap, probably several assistant professor salaries' worth), but since the end result looks pretty great: Yay!

Now what does any of this have to do with proprietary software? Well, I immediately grabbed the new logos to update the "virtual letter head" for recommendation letters I write, and looking at the "large" versions of the new logos I noticed that they are all 300+ kB for not much more than a stylized seal and a string of text saying "Johns Hopkins" and whatever else it needs to say. Does that seem too large to you? Certainly did to me.

A little digging showed that the PDF versions of our new logos had been created with Adobe Illustrator CS6 (Macintosh) using the Adobe PDF library 10.01 of all things. I have no idea what any of this even means, but I am stating it for reference.

The logos were surrounded by a lot of white space that I had no use for, so I loaded them into inkscape (using the "very fine" approximation level, something that probably doesn't matter) and saved them again. I did this because the "Save as..." dialog of inkscape gives me the option to say "only drawing" instead of "entire page" and hence gets rid of the white space I didn't want. Took all of 3 minutes to figure out, and when I compared the new inkscape PDFs to the original Illustrator PDFs at high magnification, I couldn't tell any difference whatsoever.

Reassured by the quality of the conversion process I hacked away at the new letter head and eventually I committed it all into my old Subversion repository. Whenever I do this I check the size of all the files I am about to commit (probably a habit left over from the good old days of CVS) and I was a little surprised to find that the new PDFs were 12+ kB. Yep, that's right:

Illustrator creates a file that's 25 times larger than a visually identical file created by inkscape.

Admittedly this difference probably doesn't matter very much in today's world of 1+ GB networks and 2+ TB disk drives. But I conclude from this that Adobe simply doesn't care. And that's sad because people pay them a lot of money for their products. Inkscape on the other hand anyone can use for free. Even better, anyone can improve it too. For me this is just another example of why free software beats proprietary software every time.

Sorry that I ranted for so long just to make such a minor point.

In any case, I can now create PDF recommendation letters that are 100+ kB instead of being 500+ kB for just a sexy new logo. And although I may be the last person on the planet who cares about the size of the files they send out, that still feels good.

Monday, May 27, 2013

Level Progressions: History and Experiments

Alright, I am on a history kick for some reason. This time I am looking at level progressions for the four "core" classes in the various "official" editions of D&D. I am mostly trying to get a feel for what the "traditional" progression should be for the first 10-or-so levels, but I'll make some comments on higher levels as well. We start, of course, with OD&D from 1974 (including the thief from Supplement I: Greyhawk; I'll use "wizard" instead of "magic-user"):


There are a few things open to interpretation here; for example, it is unclear whether the progression for clerics is really supposed to go up by 100,000/level at the high end. The most obvious "problem" with OD&D is that wizards, while starting out slower, eventually progress faster than fighters and clerics, something that doesn't seem in line with their supposed power curve. There's also the weirdness that thieves are described as requiring 125,000/level after 10, which is higher than the implied (sadly not documented) requirements for any of the other classes. Let's look at AD&D from 1978 next (ignoring the useless +1 point issue):


Although all classes get slowed down, there's still the noticeable "reversal" where things get easier for wizards at level 8. But we're also told that the "high-level advancement" is 250,000/level for fighters, 375,000/level for wizards, 225,000/level for clerics, and 220,000/level for thieves, so wizards slow down again quickly after that. Overall this doesn't seem less "broken" than OD&D to me. Let's look at B/X D&D from 1981 next:


Finally we get a progression that makes some sense. First, there's no "reversal" for wizards anymore; they now consistently require more experience points (XP) than other classes. Second, we're told that "high-level advancement" is 120,000/level for fighters and thieves, 100,000/level for clerics, and 150,000/level for wizards, so there's consistency here as well. (The fact that thieves advance like fighters at high level is a small wrinkle at first sight; however, notice that in B/X fighters and thieves both receive 2 hit points per level after the hit dice stop, while clerics and wizards get 1 hit point per level instead.) Later developments (BECMI, RC) keep the same progressions. Let's look at 2nd edition AD&D from 1989 next:


What I find amazing here is that despite knowing of a "better" (in my book anyway) progression from B/X, one that could even be made more like the original AD&D progression, the people designing the 2nd edition chose to stick almost completely with the original.

You might say at this point that comparing level progressions from widely different versions of D&D is not something that should be done; after all, the abilities of the various classes differ a lot as well. But if you look carefully, those differences seem to have had very little impact on what the designers did! The AD&D thief, for example, has almost exactly the same abilities that a B/X thief has, but at a much lower XP cost; the AD&D thief even has a larger hit die! But it's even weirder than that: the person designing the "sane" B/X progression was the same person keeping the "insane" AD&D progression for 2nd edition AD&D: Dave Cook!

In any case, just to be "complete" let me mention the unified progression that came with 3rd edition "D&D" as well:


This is obviously extremely different from all the previous progressions. For one thing the XP requirements are a lot smaller starting at about level 6. But also this progression makes it a lot harder for new characters to catch up to characters that avoid death and keep collecting XP. Interestingly this is the same exact progression that a German roleplaying game called "Das Schwarze Auge" used in 1984 (except that they divided by 10).

I am sure you can tell that if I had to pick one of these progressions as the basis for a new D&D variant, my choice would be B/X. However, in the end, I would probably do away with the "different tables for different classes" approach entirely: if you average across the B/X table above, you'll see that the required XP/level across all classes is always slightly below that for the fighter, and the fighter's table works out to roughly doubling the required XP each level, starting from 2,000. That's probably close enough for all of them if you tweak the abilities of each class appropriately.

As a final note, check out what happens if we start with the AD&D requirements and then strictly double them each level:


In my book, that's a mighty fine progression. If we care to extend it to higher levels in the old-fashioned way (switching to a linear progression), we can simply go with 256,000/level for fighters, 320,000/level for wizards, 192,000/level for clerics, and 160,000/level for thieves. That's not too different from the original AD&D progression either, but it follows out of a very simple formula with no arbitrary patching. A winner if you really care to maintain the feel of different XP progressions for different classes.

I simply use the fighter column for all classes but keep doubling forever, which means that for all intents and purposes, the game is capped at level 12. Individuals with higher levels will be extremely rare, and anyone who really wants to be an arch-wizard (able to cast a level 9 spell) essentially must consider the path of the lich (and keep adventuring or build a huge empire) or be an elf: level 17 requires a breath-taking 65,536,000 XP!
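The strict-doubling scheme reduces to a one-line formula: a class's requirement at level L is its level-2 cost times 2^(L-2). A quick sketch, assuming the (rounded, +1s dropped) AD&D level-2 costs of 2,000/2,500/1,500/1,250:

```python
# Rounded AD&D level-2 XP requirements, with the "useless +1 point" dropped.
BASE_XP = {"fighter": 2_000, "wizard": 2_500, "cleric": 1_500, "thief": 1_250}

def doubled_xp(klass: str, level: int) -> int:
    """XP required for a level when the level-2 cost strictly doubles each level."""
    return 0 if level <= 1 else BASE_XP[klass] * 2 ** (level - 2)

print(doubled_xp("fighter", 12))  # 2,048,000 -- the practical cap
print(doubled_xp("fighter", 17))  # 65,536,000 -- the arch-wizard's total
```

Note that doubling the fighter's 2,000 seven times gives 256,000 at level 9, which is exactly where the linear 256,000/level extension above picks up (and likewise for the other classes).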

Works just fine for me. :-)

Friday, May 24, 2013

Saving Throws: The Clones

In a previous post I looked at saving throws across the ages as far as the official D&D releases are concerned. This time I'll look at a number of more recent clones instead. (The selection of clones below simply reflects what I am familiar with. If you think that I am missing an important innovation from some other system, please let me know!)

Let's start with Labyrinth Lord (LL), arguably the most popular clone out there. LL is billed as a clone of B/X, so the following saving throws should not come as a surprise:
Breath Attacks
Poison or Death
Petrify or Paralyze
Spells or Spell-like Devices
Wands
Aside from expressing and ordering things a bit differently, this list is a perfect match of B/X. The only "addition" is the generalization from "rods and staves" to "spell-like devices," something that sounds a little awkward but indeed clarifies what the saving throw is used for.

Another extremely popular clone is Swords & Wizardry (SW) which bills itself as a clone of the original 1974 rules. Somewhat surprisingly, however, SW uses a single saving throw regardless of the kind of effect a character has to defend against (although some classes get bonuses against some effects). SW also includes an optional system that matches the original game:
Death Rays and Poison
Wands (all)
Turned to Stone
Dragon's Breath
Spells and Staffs
Except for minor spelling changes, this alternate system is indeed identical to the 1974 rules for saving throws.

Next we look at the Old School Reference and Index Compilation (OSRIC), a popular clone of the first edition AD&D game that uses the following saving throws:
Aimed Magic Items (rod, staff, wand)
Breath Weapons
Death, Paralysis, Poison
Petrifaction, Polymorph
Spells for unlisted categories
As is to be expected, the categories are identical (aside from variant phraseology and the clarification about unlisted categories) to AD&D. I may be the only one, but the weird renaming of "petrification" to "petrifaction" seems exceedingly wrong to my ears (despite the fact that it may be absolutely correct from a geological perspective).

Not really a clone of anything in particular, although roughly inspired by B/X as far as I can tell, Lamentations of the Flame Princess (LotFP) uses the following saving throws:
Paralyze
Poison
Breath Weapon
Magical Device
Magic
On the one hand this is clearly more B/X than AD&D: Paralyze is different from poison. On the other hand this is clearly more AD&D than B/X: Magical devices (such as wands and staves) are different from magic itself (spells and innate magical abilities). In other words, this is actually the first "cross-over" saving throw system so far, so LotFP gets extra points for innovation here. Also notable is the absence of an explicit "petrification" category.

The LotFP saving throws also come with a decent rationalization: paralyze is for effects that somehow limit movement, poison is for effects where hit points are irrelevant, breath is for area effects, device is for effects caused by magic items, and magic is for all other spells or spell-like effects. This presumably means that "petrification" could fall under "paralyze" since it restricts movement, "poison" since it's not affected by hit points, or "magic (device)" if it's a spell effect?

The Adventurer Conqueror King System (ACKS) is not really a clone either, but it's certainly inspired by the BECMI approach to D&D. ACKS uses the following saving throws:
Petrification & Paralysis
Poison & Death
Blasts & Breath
Staffs & Wands
Spells
Compare this to LotFP above and you'll realize that the saving throw categories are pretty much identical. Whether ACKS actually copied from LotFP or not doesn't really matter, what does matter is that they came to the same conclusions (and their rationalization is pretty much identical too).

Much like SW (see above), Delving Deeper (DD) also bills itself as a clone of the original 1974 rules. DD uses the following saving throws:
Poison
Rays or Wands
Petrification or Paralysis
Breath Weapon
Spells
Despite the goal of cloning the 1974 rules rather closely, DD nevertheless modifies the saving throw categories: First "rays" move from being paired with "poison" to being paired with "wands" instead; second "paralysis" shows up as an effect again, but paired with "petrification" instead of "wands" this time. Most importantly, however, DD interprets all saving throws more based on effects rather than on sources.

The DD rationalization for saving throws goes as follows: poison is used against biological attacks, including things like cloudkill; wands is used against targeted attacks, including things like finger of death; paralysis is used against physiological attacks, including things like flesh-to-stone, polymorph, and slow; breath is used against area attacks, including things like fireball; and spells is used against mind-affecting attacks, including things like charm person and geas. In other words, DD deviates significantly from the original rules when it comes to saving throws.

BLUEHOLME (BH) bills itself as a clone of the Holmes Basic set. BH uses the following saving throws:
Breath Weapon
Gaze
Magic Wand
Ray or Poison
Spell or Staff
We do have quite a surprise here: a "gaze" category, which replaces "turned to stone" from the actual 1977 rules. As far as I can tell, there is no precedent for a "gaze" category in any of the other D&D variants I've studied, so BLUEHOLME should get extra points for innovation. Whether the category is actually useful I am less sure about. For example, would it apply to all attacks a beholder can make?

Two more quickies to finish off: Both Basic Fantasy and Dark(er) Dungeons pretty much follow the B/X saving throws with very minor variations in spelling and application.

What can be learned from this?

  1. Some of the newer reincarnations of D&D stick rather closely to the original saving throw categories for the system they are based on.
  2. Others try to innovate, either by reshuffling the categories in a way that had not been tried before, or by offering different rationalizations.
  3. Only two systems, LotFP and ACKS, actually agree on a "new" interpretation of saving throws; several systems agree (pretty much) with B/X, which probably makes those categories most widely known.
If I had to choose from among all these interpretations, I'd go with LotFP and ACKS first, followed by DD, followed by all the B/X interpretations. One particular reason for my preference is that I like my elves to get a bonus versus any effect that restricts their movement, something that nicely dovetails with the traditional "immunity" elves get from ghoul paralyzation.

Saving Throws: The Originals

Saving throws have been around for almost 40 years now. I've been fighting with them quite a bit while trying to design my own D&D-variant and eventually decided that a "historical survey" would be helpful. Here's what I learned, starting with the original game from 1974:
Death Ray or Poison
All Wands - Including Polymorph or Paralization
Dragon Breath
Staves and Spells
For people who grew up with B/X or BECMI or AD&D there are quite a few surprises here:
  1. The "polymorph or paralyzation" category was originally the "wands" category instead of being a separate "effect" category (the effects just being examples of wands).
  2. Whereas a successful "death ray" saving throw means "no effect," a successful "poison" saving throw means "half of the total possible hit damage" instead; whether this means that the character could still die (half of maximum hit points) or not (half of current hit points) is not really clear.
  3. The description for "spells" seems to imply that a successful saving throw means "no effect" at all, which is at odds with the description of "wands (and staves) of cold, fire ball, lightning" causing one-half damage if the saving throw is successful; presumably "spells" was meant as "all spells that don't otherwise cause damage"?
Next we have Holmes Basic from 1977:
Spell or Magic Staff
Magic Wand
Death Ray or Poison
Turned to Stone
Dragon Breath
Aside from the different labels, the descriptions are pretty much the same as in the 1974 game, including the "save against spells for no effect" approach. Note the complete absence of "polymorph or paralyzation" from this set of saving throws. Let's skip AD&D from 1979 for a moment and continue along the "basic D&D" line first, with the B/X sets from 1981:
Death Ray or Poison
Magic Wands
Paralysis or Turn to Stone
Dragon Breath
Rods, Staves, or Spells
First note that we're now back to the order in which saving throws were given in the 1974 game. Next note that "paralysis" has moved from the "wand" category into the "effect" category of "turn to stone" but "polymorph" is nowhere to be found (still covered by "wands" I would presume). Lastly note that "rods" make their first appearance (Holmes included the Rod of Cancellation as a magic item but there was no corresponding saving throw). Neither the BECMI sets from 1983 to 1985 nor the Rules Cyclopedia from 1991 change the saving throw categories further in this line of D&D development.

Let's look at AD&D, the first edition from 1979, next:
Paralyzation, Poison or Death Magic
Petrification or Polymorph
Rod, Staff or Wand
Breath Weapon
Here we have a number of significant departures from the original 1974 game and its "basic set" offshoots:
  1. Paralyzation moves from the "wands" category into an "effect" category just like in B/X, however instead of moving to "turn to stone" it ends up moving to "death magic" which is the more inclusive revision of "death ray" from 1974.
  2. Polymorph is back from 1974 and also moves from the "wands" category into an "effect" category, but this time into "petrification" which is the more inclusive revision of "turned to stone" from earlier editions of the game.
  3. Wands are no longer separate from rods and staves, implicitly giving rise to something like a "magical device" category (even though it's unclear whether this was actually the intention).
  4. Spells are now their own category, further implying that there's a distinction between effects from devices and effects from actual casters.
There's some advice that "petrification and polymorph" should not be used for such effects from wands, that "breath weapon" should not be used if the breath causes "petrification or polymorph," and that "spells" does not apply when another saving throw has been specified (presumably in the spell description). I have to say that I cannot recall any monster with a breath weapon causing polymorph, so maybe I should design one? AD&D also goes into a lot of detail about saving throws for items, something that does not concern me here (except to point out that B/X and BECMI handle those with just the normal saving throws).

Ten years later we get to the second edition of AD&D from 1989:
Paralyzation, Poison, or Death Magic
Rod, Staff, or Wand
Petrification or Polymorph
Breath Weapon
Interestingly nothing really changed from 1979, they are even using just about identical footnotes to the saving throw tables. Not much innovation there! (I guess it all went into removing devils and demons?)

In 2000 with the third edition of (A)D&D we get the infamous "streamlined" saving throws: Fortitude, Will, and Reflex. In 2008 with the fourth edition we get something I don't even understand, but I know I don't want it. So we'll just ignore those strange developments here.

What can be learned from this?
  1. Only the "breath weapon" category remains the same across the various revisions, although it does start out as "dragon breath" and thus got generalized a little with AD&D.
  2. AD&D lowered the power level of staves and rods by lumping them together with wands instead of regular spells. This could be seen as a "magical device" category.
  3. The "3P" effects of "paralyzation, petrification, polymorph" jump around a lot over the years. If we prefer them as distinct effects (rather than basing the saving throw on the source of the effect) then the AD&D version arguably makes the most sense: both petrification and polymorph are "radical changes to the body" and lumping them together based on that seems appropriate; paralyzation, on the other hand, simply takes the victim out of the game much like "death rays" or "poison" might.
Personally I still prefer the B/X revision when it comes to saving throws, mostly because I like the idea that wands are easier to resist than staves. However, the AD&D revision is a close second.

(You may also be interested in this followup article covering various clones of the original rules discussed above.)

Saturday, May 18, 2013

Vancian Swords

The origins of these accursed swords are shrouded in mystery. Both long swords and short swords appear in historical accounts, all of them very cleverly made to give the appearance of being powerful elven weapons.

Clerics or magic-users who pick up a Vancian Sword suffer 1d6 of electricity damage per round until they let go. These characters will also get the distinct impression that a sweet elven voice whispers to them about sharing "their gift" with a powerful fighter or thief.

Fighters or thieves who pick up a Vancian Sword must save versus spells or be stricken by the curse. Those who make the save suffer 2d6 of ice damage per round until they let go. These characters will also hear an angry elven voice chide them about not being worthy of "the gift" the sword has to offer. Those who fail their saving throw will hear a sadistic voice, decidedly human in character, laugh and cackle diabolically. Despite feeling ashamed and insecure when they hear the laugh, these characters will be unable to let go of the sword and will prefer it over all other weapons until a Remove Curse spell is cast on them.

Those using a Vancian Sword will fight rather strangely. Find the highest spell-level that a magic-user of the wielder's level can cast. The sword will strike unerringly that many times each day, no to-hit roll necessary. However, the target gets a saving throw versus magic for half damage. A Vancian Sword can hit creatures only harmed by magic weapons and does 1d6 regular damage as well as 1d6 acid damage on each hit.

After the sword's daily allowance of attacks has been used up, the wielder has to save versus spells for each additional attack. Success indicates that a normal to-hit roll can be made for 1d6 regular damage. Failure indicates that the wielder's morale fails: they will leave combat and try to hide behind fighters and clerics, begging for healing and support, quivering for their lives. The diabolic cackling continues in their heads until they can somehow leave combat and rest for at least one turn to calm down. Until the wielder gets 8 hours of sleep and another hour in the morning to "appreciate" their sword, they will have to keep making saves against spells for each attack with the same consequences for each failure.

(Optionally, while a Vancian Sword is being used, the characters around the wielder will hear a faint murmur of people arguing about magic systems.)

Wednesday, May 15, 2013

Turn Undead = Favored Enemy

I don't know exactly who started it, for me it began with Dyson's updated Turn Undead table, then came Delta's somewhat unrelated ruminations, and finally we got to Talysman's interpretation of turning as morale, so there was literally a whole "train" of cleric posts in the last few days. All of this made me think that I better share my take before someone else does. :-)

I've been fooling around for a while with a revised cleric class for my own "sorta B/X and sorta not" retroclone, and one of its key components is that clerics get something akin to the ranger's "giant-class enemy" or rather what became known as "favored enemy" in more recent iterations of the game.

Let's face it, while most religions like to think that they are "for something" they are almost always "against something" as well, and frequently they are against many more things than they are for. In a swords and sorcery setting in particular, there shouldn't be any shortage of religions, gods, whatever that are against a certain kind of "monster" or "creature" as it were. So gods of life and sun and all that should be against undead. Gods of nature and wilderness and all that should be against civilization (and anyone who stands for it: settlers, cavaliers, and rangers for instance). Gods of law and civilization should be against barbarians. There could be a god who's against wizards, a god who's against dwarves, heck even a god who's against other gods. And so on, and so forth.

With that in mind, I give clerics (who I choose to interpret much closer to the fighter/paladin end than the priest/wizard end) a favored enemy at first level. Depending on the background the player cooks up, there may be more favored enemies later. Some gods might be against several things after all. And everybody who can think knows that clerics are fanatics, maybe even bloodthirsty lunatics, who always bring down the hammer or sword or whatnot where they feel they must: on the heads of those their religions declare enemies.

Let's say the players encounter a bunch of wererats. Among the party is a cleric of some god known for hating lycanthropes. The wererats might even know some other clan that's recently been wiped out by these "genocidal" clerics. What do you think will happen? First the wererats get a negative modifier to their reaction roll, making them more hostile than if the cleric wasn't there. Then, if they lose a few friends, they get a negative modifier on their morale check because they don't want to be the next clan to be wiped out.

The modifier will depend to some degree on the cleric's level of course, which is why this is similar to turning undead in the original game. What is very different, however, is that it works for all kinds of "monsters" as long as there's a suitable religion in the campaign. And as a fun twist, creatures with no intelligence are immune because they cannot understand the danger they are in. So no "turning" skeletons or zombies, not ever. Of course this assumes that the cleric isn't hiding what religion he or she follows, but most gods will probably expect that kind of boldness anyway.

Other combat-related changes for my cleric include a "no missile weapons" rule: your god expects you to fight bravely against those evil worms, no hiding in a tree with a sling or a bow. Also compared to "standard" clerics I slow down the spell progression, somewhat inspired by the Sohei class found in Oriental Adventures but not quite that extreme.

I really need to write that cleric class up in full sometime soon...

Monday, May 13, 2013

Garrik's Landing (Level 1)

Inspired by both Dyson Logos and Matt Jackson I decided to give mapping a try again. I had not done any in 15+ years, which left me with no style to speak of. So I am simply cloning their style for now, badly I might add. But I guess with some practice I could get back into semi-decent shape? Anyway, I just grabbed a stack of four-by-six note cards from Staples and slapped together the first level of "Garrik's Landing" which you see here.

Map of Garrik's Landing (Level 1)
Garrik's Landing (Level 1)

You can also grab a larger version if you feel like it. I have totally overdone it with the tracks/footsteps, I know that. Also several things are severely off-center, however that's actually quite alright for this location: It wasn't built by "perfectionist dwarves" after all. Constructive feedback would be appreciated! And I'll try to do one of these on a semi-regular basis to get back into the swing of things. We'll see for how long I can keep it up!

Saturday, May 11, 2013

Thieves Evolved

I've put the "Thieves Evolved" mechanic I outlined in an earlier post into a single-page PDF now. I also "evolved" it a bit more and there's even a worked example, although maybe not the best possible one. I'd like to thank Harald Wagener (for inspiring me to write my thieves up in the first place and for giving me feedback on the PDF) and Dyson Logos (for allowing me to use one of his excellent maps as a watermark). And now back to work...

Monday, May 6, 2013

Nice Example of World Design

Jason Lutes has two wonderful posts about world design over on his blog. He uses commonly available tools to put together a basic sandbox-style setting that he then slots existing modules into. Very nice work, and very inspirational.

Wednesday, May 1, 2013

Thief Skills vs. Combat Skills

If you look at the combat system of most D&D-variants, you'll quickly realize that it is designed to give two average, untrained, unarmored, human combatants about a 50% chance to hit each other. Given the amount of damage most weapons do, this also results in about a 50% chance to kill each other. Overall this seems mighty reasonable as a starting point and trained, armored, higher-level characters differentiate themselves nicely.
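As a quick sanity check, here is that baseline in code. This is only a sketch, and the specific numbers are assumptions on my part rather than from any particular edition: a roll of 11+ on a d20 to hit, and d6 damage against d6 hit points.

```python
# Two average 0-level humans swinging at each other. Assumed numbers (not
# from the post): a roll of 11+ on a d20 hits, d6 damage vs. d6 hit points.
hit = sum(1 for r in range(1, 21) if r >= 11) / 20
drop = sum(1 for d in range(1, 7) for h in range(1, 7) if d >= h) / 36
print(f"hit: {hit:.0%}, drop on a hit: {drop:.0%}")  # hit: 50%, drop on a hit: 58%
```

Close enough to "about 50%" either way.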

Thief skills, however, don't fare as well. First note that there is (usually) no such thing as an "average, untrained" thief: You have to be 1st level to be a thief whereas those untrained combatants above are 0-level humans. But if we look at the average chance for a thief skill to succeed, this is what we find:

Again, these are the chances for a trained character to succeed, and they are much lower than the chances for an untrained character to kill another. How can this possibly be right? (Also note that these averages are only "saved" by the fact that "climb wall" is amazingly high.)

Of course we can make up all kinds of explanations. A first one might be that it's simply harder to pick a lock than it is to kill someone. I lack the personal experience regarding both, so I can't really argue with this line of reasoning, but it seems iffy anyway. For one thing, locks don't fight back. Also, how come climbing a wall is so much easier than killing someone if picking a lock is so much harder?

Another popular explanation says that thief skills are "extraordinary" in the same way spells are. So all characters can try to be stealthy, but only thieves can move completely quietly without making any noise whatsoever (silence spell?). Similarly all characters can climb, but only thieves can climb smooth walls (spider climb spell?). Things break down a bit for "open locks" I guess since I don't think too many referees would give an average fighter a chance to open a lock short of bashing it to pieces, but hey, nothing's perfect.

But wait, why is this even a problem? A magic-user starts out with a single spell and has to make it through a few levels before becoming useful, so why shouldn't the same be true for a thief? I cannot argue with that take either, except to say that this attitude makes pretty much all classes nothing more than "bad fighters" for the first few levels. If that works for you, great. Personally I find it lacking.

Clerics are actually pretty decent fighter-substitutes, but thieves and magic-users are not. Having shoddy skills or only a single spell can be frustrating for the player who feels like they are not contributing enough in the early game. And while the power curve of the magic-user seems to justify their initial lack of skill, the power curve for the thief is a lot flatter: they don't become "cool" at level 5, they just finally become usable for their original purpose at around level 7. Not the most satisfying progression.

I feel somewhat vindicated in my "thief skills should be 50% right away because that's how the combat system works too" attitude by the way Delving Deeper handles thief skills: All of them are fixed at 50%. The only "problem" is that there is no improvement as the thief gains levels. So the character is useful right away, but aside from hitting better and lasting longer in combat, there is no class-specific advancement (except for backstab damage which does increase, but that's not really a "thief skill" in my mind).

So what am I doing in my own D&D-variant to address this?

First I want to give thieves useful skills right away, so they must start at around 50%. But unlike Delving Deeper, I also want to give them advancement. I would guess that Delving Deeper doesn't allow advancement because it's unclear how far skills should be improved. If they went for the same kind of advancement they use for backstabbing, a 12th-level thief would have 100% in all thief skills. And since they use "3-6 on d6 for success" as their mechanic, they can neither advance more gradually, as a d20 or d100 mechanic would allow, nor preserve a persistent chance of failure to keep thieves from becoming "infallible" at what they do.

But I believe there is a way around all this: simply don't give thieves access to all thief skills from the start. So in my game, a new thief character picks from a list of possible thief skills the four they want to start out with. They also pick them in order, from the most important skill to the least important one. The system assigns chances of success in such a way that their favorite skill will be 50% likely to succeed at 1st level, with the other skills lagging behind in 10% steps. This represents where they chose to focus during their "apprenticeship," as it were. When they gain a level, they can either improve one skill that is not yet 50% likely to succeed by 5%, or they can pick up a new skill at 10%.
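A minimal sketch of that rule in code, for the curious. The skill names are just placeholder examples, and the function names are mine, not part of any published system:

```python
def starting_skills(picks):
    """Four skills chosen in order of importance: 50%, 40%, 30%, 20%."""
    assert len(picks) == 4
    return {skill: 50 - 10 * i for i, skill in enumerate(picks)}

def gain_level(skills, improve=None, new_skill=None):
    """On gaining a level: either improve one skill below 50% by 5%,
    or pick up a new skill at 10% (but not both)."""
    if new_skill is not None:
        skills[new_skill] = 10
    elif improve is not None and skills[improve] < 50:
        skills[improve] += 5
    return skills

thief = starting_skills(["move silently", "open locks", "hide", "pick pockets"])
gain_level(thief, improve="open locks")
print(thief)  # {'move silently': 50, 'open locks': 45, 'hide': 30, 'pick pockets': 20}
```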

Since I am using Delta's excellent Target 20 mechanic, this works fine for games where characters retire at around level 10 or so (though it stretches a bit beyond that as well). But let's look at the details.

The four initial skills are 50%, 40%, 30%, 20% for an average of 35%. This is not too different from the traditional averages, but note that this thief is good at what they want to be good at, not just good at climbing walls like all the other thieves in the world. Advancing 9 levels allows for a total of 9*5% = 45% of improvement. So if the character never picks up another skill, the final skill profile would be 50%, 50%, 50%, 35%. Since level is added to all checks, this character would have three skills that fail only on a roll of 1. Fair enough, such single-minded dedication is (a) unlikely and (b) should indeed lead to formidable skill.
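That arithmetic is easy to verify with a few lines of code. A sketch, of course; "always improve the best skill still below 50%" is just one possible order of spending the improvements, but any order spending the full 45% ends at the same profile:

```python
# Single-minded progression: start at 50/40/30/20 and spend all nine
# level-ups (levels 2 through 10) on +5% improvements, never past 50%.
skills = [50, 40, 30, 20]
for _ in range(9):
    i = min(i for i, s in enumerate(skills) if s < 50)
    skills[i] += 5
print(skills)  # [50, 50, 50, 35]
```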

The more probable case in which the character picks up three more skills on the way would lead to a skill profile of 50%, 40%, 30%, 20%, 10%, 10%, 10% before advancement, or 50%, 50%, 40%, 25%, 20%, 20%, 15% after advancement, so we'd get two "perfect" skills and the lowest skill would have an overall 65% chance of success at level 10. Seems reasonable. And a welcome side-effect is that not all thieves are the same, one of the very few things AD&D 2nd edition actually got somewhat right.
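Under the assumption that each level translates into +1 on the d20 (i.e. +5%) via Target 20, and that a natural 1 always fails (capping everything at 95%), the overall chances at level 10 for that profile work out like this:

```python
# Level-10 success chances for the seven-skill profile above, assuming
# +5% per level and a cap at 95% (a natural 1 always fails).
profile = [50, 50, 40, 25, 20, 20, 15]
overall = [min(95, base + 5 * 10) for base in profile]
print(overall)  # [95, 95, 90, 75, 70, 70, 65]
```

The two 95% entries are the "fail only on a roll of 1" skills, and the lowest skill comes out at the 65% mentioned above.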

I think I'll call this the "Thieves Evolved" mechanic.

Yes, it's a little more fiddly for the referee than plain Target 20, but it's a lot more enjoyable for the player. And if there's really a need to keep the stat block for NPC thieves down, a table of "typical progressions" should solve that well enough.