Maxperson
Morkus from Orkus
I disagree profoundly. As a mathematician and computer scientist I can tell you that this is exactly the sort of area I currently work in.
Back when I was a kid, I believed (as outlined in another post) in the idea of verisimilitude and 'true to life' play arising out of accurate simulation, and I thought, "Gosh, it's impossible to do this with paper and pencil. How about a computer?" So I, and many others, thought we could take D&D's simplistic combat system, simply add 500 more variables to it, code it all up as some complicated computer algorithm, and produce realistic results. Ha! What fools we were!
You see, that can't be done. Not by the simple process of a human being sitting down, creating a model, and adding more elaboration to it until they think they've accounted for every variable and coupled them all correctly. If that were true, children would have programmed self-driving cars in about 1987, and Marvin Minsky would have produced general-purpose AI in plenty of time for HAL 9000 to be booted up in the late 90's, as the movie so optimistically imagined.
Instead what we discovered is that reality is exceedingly intractable, and you can't even get close to analyzing it with any sort of 'mechanics'. Only in the last 10 years have we made progress, via massive application of brute-force pattern matching with reinforcement and high-order multivariate analyses.
So, nowadays, I can generate actionable predictions about complex business processes by running clusters of thousands of servers for weeks at a time, performing 500-dimensional correlation analyses against petabyte data sets, to produce models which can predict things like who's likely to win a given basketball game on a given day, or which stock to buy, etc.
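In miniature, the flavor of that kind of analysis looks something like the sketch below; the data, sizes, and synthetic signal are invented toys, nothing like the production scale described above:

```python
# Toy illustration of ranking features by correlation with an outcome.
# Everything here (sizes, data, the planted signal) is made up for
# demonstration; real pipelines run at vastly larger scale.
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_features = 10_000, 500

X = rng.normal(size=(n_samples, n_features))  # one row per observation
y = 0.3 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(size=n_samples)  # outcome

# Correlate each of the 500 features with the outcome and rank them.
corr = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)])
for j in np.argsort(-np.abs(corr))[:5]:
    print(f"feature {j:3d}: r = {corr[j]:+.3f}")
```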
Cool. I'm not trying to mirror reality, so this does not apply to me. To improve realism you don't have to match reality exactly; that's a false dichotomy. Realism is a scale, not all or nothing. You may not have mirrored reality with those 500 variables, but the result was closer to reality to some degree than no program at all, even if it was only closer by 0.00001%.
Imagining that we can make a realistic model of wear and tear on weapons, and of the likelihood of one failing at any given moment, using a few charts is simply not realistic at all.
It doesn't have to be. It only has to be more realistic than no system at all, which it is.
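To make that concrete, a "few charts" system can be as simple as this sketch; the wear levels and break ranges are numbers I made up for illustration, not from any published game:

```python
import random

# Hypothetical wear chart: more accumulated wear widens the break range
# on the d20. These values are invented for illustration.
WEAR_CHART = {
    0: 0,  # pristine: never breaks
    1: 1,  # worn: breaks on a natural 1
    2: 2,  # battered: breaks on 1-2
    3: 4,  # failing: breaks on 1-4
}

def attack_with_wear(wear_level: int) -> tuple[int, bool]:
    """Roll a d20 attack; return the roll and whether the weapon breaks."""
    roll = random.randint(1, 20)
    breaks = roll <= WEAR_CHART[min(wear_level, 3)]
    return roll, breaks

roll, broke = attack_with_wear(wear_level=2)
print(f"rolled {roll}; weapon {'breaks!' if broke else 'holds'}")
```

Crude, sure, but a blade that has taken three levels of wear now actually breaks more often than a fresh one, which is all "more realistic than nothing" needs to mean.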
I can tell you that we made a model that predicts when a specific tire on a specific wheel is going to fail, and it's pretty darn good, but it has to rely on an analysis of hundreds of thousands of full tire lifetimes of other tires, and its inputs include thousands of data points related to each and every usage of said tire. It is still only accurate to maybe plus or minus 10%. That's good enough to make a business use case. It might well be good enough for an RPG too, but clearly games are far too abstract to support this.
That's awesome, but I don't need things to be that accurate for my game.