Reasons behind THAC0/AC in 1E/2E

Philotomy Jurament said:
It's also worth pointing out that the older approach to AC wasn't as closely tied to the concept of a mathematical formula as it became, later. AC started out as an indicator of the type of armor you wore.

A glance at the most ignored table in 1e, the per-weapon to-hit modifiers by armor class table in the PHB, strongly suggests you are correct.

If you have few or no AC mods, this table works. A dragon with AC 2 is "just like" a knight in platemail and shield when it comes to trying to hit it with a longsword. Better pick up a mace instead.

But this becomes bizarre when mods start getting added in. Is an elf with a high Dex wearing chainmail "just like" a knight wearing platemail and shield? Just doesn't work anymore.
 


Philotomy Jurament said:
I've heard that a lot, so it must be a commonly-held opinion, but I don't see it that way. Typically, players had their "row" from the to hit charts written on their sheets; obtaining the number they need to hit doesn't get much simpler than glancing at the character sheet. No calculations required. THAC0 isn't a complicated calculation, but I don't see it as a leap forward in reducing complexity.

The moment you have more than one modifier to your to-hit roll or your AC, it becomes a mathematical calculation anyway, and for that THAC0 is a big step forward. In the days of Chainmail or OD&D a chart may have been good enough, but that approach was obviously silly by the time of AD&D.

A monster with a +5 to hit and AC 16 is vastly easier to handle than a 4 HD monster with an additional +1 to hit that has AC 5 and a -1 modifier to be hit.

A chart is never easier. In some simple cases, it is not noticeably harder.
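To make that concrete, here is a minimal sketch (mine, not anything from the books) of how THAC0 folds every modifier on both sides into a single comparison. The THAC0 of 17, the +1 attack modifier, and the function names are all assumptions for illustration only:

```python
# Minimal sketch of THAC0 attack resolution with stacked modifiers.
# All names and numbers here are illustrative, not taken from any rulebook.

def attack_hits(roll, thac0, attack_mods=0, target_ac=10, to_be_hit_mods=0):
    """True if the attack lands: roll + mods must reach THAC0 minus the effective AC."""
    effective_ac = target_ac + to_be_hit_mods   # e.g. a -1 "to be hit" mod lowers the AC
    return roll + attack_mods >= thac0 - effective_ac

# Attacking the 4 HD monster from the example above: it is AC 5 with a further
# -1 modifier to be hit.  The attacker's THAC0 of 17 and +1 in attack modifiers
# are assumed purely for illustration.
print(attack_hits(roll=12, thac0=17, attack_mods=1, target_ac=5, to_be_hit_mods=-1))  # True
```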
 

Pretty much echoing what everyone else has already said -- in the earliest days (OD&D, 1974), AC was merely a table-heading and not intended to have any intrinsic mathematical value (why those specific table headings (2-9 with lower=better) were chosen remains something of a mystery, and open to speculation -- Dave Arneson is the person who would know, and he seems to either not remember or just not want to say). If you were wearing chain-mail and carrying a shield your AC was always 4, and if you had defensive adjustments (from, say, carrying a +2 magic shield) your AC didn't change, but people attacking you subtracted 2 from their attack rolls.

It so happened that the way the tables were set up, adjustments to the "to hit" roll equaled adjustments to the armor class (i.e. if your AC is 6 and attacks against you are adjusted -2, then the attacker effectively has the same chance to hit you as he does to hit a target with AC 4) which led the designers in OD&D Supplement I (1975) to simplify (?) the process by saying that instead of defense adjustments affecting the attacker's "to hit" roll they instead affected the target's AC -- so if you are wearing leather armor and shield (AC 6) and have a -2 defense adjustment (from magic or high Dex or whatever) your AC becomes AC 4. This also led to extrapolation of ACs below 2 (so you've got in Supplement I new monsters like the silver dragon with "AC -1," which would've been meaningless before -- they also had to provide a table explaining that to hit AC -1 you add 3 to the number needed on the table to hit AC 2).

From this basis, mathematically inclined fanzine contributors realized that you don't really need the whole table; you just need to know the attacker's chance to hit a single AC and can then adjust the number up or down for ACs higher or lower than the baseline. Different baseline numbers were experimented with (as noted above, the TSR Monster & Treasure Assortments (1977-78) provide the creatures' # needed to hit AC 9) and eventually 0 was settled on as the de facto standard (as seen in the 1E DMG (1979) appendix listing monsters' shorthand # needed to hit AC 0 -- which actually doesn't work in 1E the way it did in OD&D because the 1E "to hit" tables include repeated 20s; so while, for instance, a 1st level cleric and 1st level thief both require a 20 to hit AC 0, the thief's "to hit" chart is actually 1 step worse than the cleric's).

In the later 1E days the inclusion of "THAC0" (an acronym for "To Hit AC 0") as a shorthand reference in module stat-blocks became more and more common, at first without explanation, but later (as in, for instance, REF3: The Book of Lairs (1986)) with instructions to use that value and a formula in place of the DMG "to hit" tables -- subtract your roll from the THAC0 value to determine what AC you hit (i.e. a monster with THAC0 19 rolls a 17; 19-17=2 meaning the monster hit AC 2, just like the table). How to handle the repeated 20s on the table was, AFAICT, glossed over entirely.
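As a quick sketch of that stat-block shorthand (ignoring, as the modules did, the repeated-20s quirk of the 1E tables; the function name is mine):

```python
# THAC0 as a declaration: roll, subtract from THAC0, announce the AC hit.
# Sketch of the formula described above; the 1E repeated-20s quirk is ignored.

def ac_hit(thac0, roll, mods=0):
    """Return the best (lowest) AC the attack connects against."""
    return thac0 - (roll + mods)

# The example from the text: a monster with THAC0 19 rolls a 17.
print(ac_hit(thac0=19, roll=17))   # 2 -> "I hit AC 2", just like the table
```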

By the time 2E AD&D was released (1989) this practice had become so widespread that it was included in the rules as the default -- the THAC0 value and the formula replaced the "to hit" tables of OD&D and 1E (and the 1E repeated 20s anomaly was removed, replaced IIRC by a rule (not present in the 1E RAW but commonly house-ruled) that a "natural 20" always hits). For people who hadn't witnessed the entire evolutionary process, this system seems bizarrely arcane and counter-intuitive, but the fact is that by this time the designers were operating on the basis of 15 years of inertia and tradition -- "everybody" knows that ACs start at 10 and go down from there, that lower is better (and negative numbers are really good), and that as you gain levels your required number to hit those ACs also goes down; not because it's elegant or intuitive or makes any particular sense, but simply because that's the way it had always been.

From a modern, analytical perspective this seems absolutely crazy -- why would you stick with such a clunky and counter-intuitive system for 10 years (and, by the end of 2E's run, 20 years) instead of fixing it? There's no real answer to that question, except to say that people just didn't care that much about elegant or intuitive mechanics in those days (or, rather, people who were playing AD&D didn't -- those who did care about such things were generally off playing RuneQuest or GURPS or whatever). There was a long tradition behind the lower AC = better system affecting both backwards compatibility and subjective "feel" (the AC 0 breakpoint feels significant in old-D&D; an at-a-glance indication that the monster you're fighting is really tough, or that your character has finally "hit the big time" (usually by virtue of a powerful magic item)) that made people reluctant to change. And really, especially for people who'd been playing since the 1E days, the math isn't that hard or intrusive, and the calculation is pretty much automatic second nature.

(Lastly: if you're using THAC0 and descending ACs in your game (presumably because you've got other reasons for wanting to play a pre-3E version of D&D) but want a quicker and more intuitive formula than the standard one, use the following: THAC0 is a fixed/static target number, and target AC is a modifier to the "to hit" roll -- so you roll your d20, add your normal modifiers (for strength, magic weapons, etc.), add the target's AC (i.e. if the target has a negative AC this is a negative add), and if the total equals or exceeds your THAC0 value you hit. For maximum speed and efficiency, the DM tells the players their opponents' AC; to keep things more mysterious, the player tells the DM their subtotal (roll + mods) and THAC0 and the DM adds in the AC. This is, IME, not any more complex or difficult than the d20 method (and perhaps even a bit simpler, as the modifiers tend to stay small (as the characters' mods get larger the AC mod gets smaller (and eventually becomes a negative), keeping the overall adjustment fairly small)) and is also, ironically, almost a return to the earliest (1974) way of determining hits in D&D -- where the target's defense is a modifier to the attacker's "to hit" roll. If you don't like using THAC0 and have never tried playing this way before, give it a try and it might make the process seem easier and less wack.)
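In code form, that variant is just the standard comparison rearranged (a sketch only; the names and numbers below are mine, not from any rulebook):

```python
# The variant described above: THAC0 is a fixed target number and the
# defender's AC is treated as a modifier to the attack roll.
# Algebraically identical to "roll + mods >= THAC0 - AC", just rearranged.

def variant_hits(roll, mods, target_ac, thac0):
    return roll + mods + target_ac >= thac0

# Illustrative numbers: THAC0 16, +2 in modifiers, an AC 3 opponent.
print(variant_hits(roll=11, mods=2, target_ac=3, thac0=16))   # 11 + 2 + 3 = 16 -> hit
```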
 

Ridley's Cohort said:
A chart is never easier. In some simple cases, it is not noticeably harder.
I don't see how that holds. All modifiers in AD&D can be applied to the die roll. The difficulty of reading the chart never varies; it's always a comparison of the final value of the die roll with the number in the correct X/Y axis position on the table.

Personally, I find that the d20 system and THAC0 are easier than tables for simple situations, and the difference in difficulty between the formula systems and the table systems actually decreases as the complexity of the modifiers increases.
 

Cadriel said:
I've always sort of liked the idea of Armor Class going down - it creates the idea of "AC 0" as a gold standard that just isn't there in 3.x.

I'm glad they got rid of that idea. AC should scale with level just like attack bonus. AC scales in 3.x with level; not with skill, unfortunately, but it does scale.
 

In a simpler era, I can see some merit to the idea of AC 1 (or AC 0) as some sort of top-of-the-food-chain armor type. But that concept was thoroughly demolished early in AD&D 1e.
 

T. Foster, that has to be one of the best posts I have ever seen concerning Armor Class and its historical significance and implementation throughout the entire history of Dungeons & Dragons. Very nice work.
 

(Psi)SeveredHead said:
I'm glad they got rid of that idea. AC should scale with level just like attack bonus. AC scales in 3.x with level; not with skill, unfortunately, but it does scale.
I find it...odd that you say it "should" scale with level. What does it add to the game if AC is just another part of an arms race with BAB? If AC increases at a similar rate to attack rating, the higher attack rating doesn't really mean anything. You get a bigger bonus, but it's against a similarly bigger target number. If you're rolling +1 vs. AC 15 at level 1, +5 vs. AC 19 at level 5, and +10 vs. AC 24 at level 10, it means exactly the same thing: you need a 14 or higher to hit. If AC is kept at a much slower (but nevertheless real) progression, the bigger to-hit bonus means that you actually hit enemies more often. If you're rolling +1 vs. AC 5 at level 1, +5 vs. AC 2 at level 5, and +10 vs. AC -1 at level 10, you're hitting on a 14, 13, and 11 respectively: your actual average results increase with level. 3.x seems to take this aspect (which I think is really a good thing) out of the game, which is part of why it's not the game I run.
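A quick sketch of the arithmetic behind that comparison (the numbers are the ones used above; treating the descending-AC attacker's bonus as a THAC0 of 20 minus the bonus is my assumption, not a rule citation):

```python
# Required d20 roll under the two progressions described above.
# Ascending AC (3.x-style): hit when roll + bonus >= AC.
# Descending AC: hit when roll >= THAC0 - AC, here treating THAC0 as 20 - bonus.

def needed_ascending(bonus, ac):
    return ac - bonus

def needed_descending(bonus, ac):
    return (20 - bonus) - ac

# AC keeps pace with the attack bonus: the required roll never moves.
print([needed_ascending(b, ac) for b, ac in [(1, 15), (5, 19), (10, 24)]])   # [14, 14, 14]

# AC improves more slowly than the bonus grows: hits become steadily more likely.
print([needed_descending(b, ac) for b, ac in [(1, 5), (5, 2), (10, -1)]])    # [14, 13, 11]
```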
 

One thing I miss about 2nd Edition is how THAC0 made it easier to tell when you would hit or miss during a fight. Once you had your THAC0 table written out for the session, all you basically did was roll and declare what AC you hit. Simple. The only real time it sucked was fighting something with a LOW AC when you were without armor.


(What, that kobold assassin hit? But I'm a 20th-level fighter, he shouldn't have hit me! I don't care if my AC is 10, my elf fighter would not have been surprised by him: HE DOESN'T SLEEP!... FINE, I'll take the 1 damage, but now that I'm up I'm going to kill the bastard... wait, what do you mean I have to Save vs. Death!?!?!)
 

Cadriel said:
I find it...odd that you say it "should" scale with level. What does it add to the game if AC is just another part of an arms race with BAB? If AC increases at a similar rate to attack rating, the higher attack rating doesn't really mean anything. You get a bigger bonus, but it's against a similarly bigger target number. If you're rolling +1 vs. AC 15 at level 1, +5 vs. AC 19 at level 5, and +10 vs. AC 24 at level 10, it means exactly the same thing: you need a 14 or higher to hit. If AC is kept at a much slower (but nevertheless real) progression, the bigger to-hit bonus means that you actually hit enemies more often. If you're rolling +1 vs. AC 5 at level 1, +5 vs. AC 2 at level 5, and +10 vs. AC -1 at level 10, you're hitting on a 14, 13, and 11 respectively: your actual average results increase with level. 3.x seems to take this aspect (which I think is really a good thing) out of the game, which is part of why it's not the game I run.

Well, for one thing, you can hit lower-level enemies more easily. Secondly, there are those who think it's boring if at high levels you only have a 1-in-20 chance of missing.
 
