This is assuming D&D as of 3rd edition or later... and a success rate for the "average" character without any special bonuses or penalties of 50%.
I'd go lower than most and say a 5-6, and here's why: if I'm putting in enough effort to make my special trick as reliable or as effective as possible, I'm likely to have close to a +5 from stats, plus another +2 or more from other bonuses. If the task is supposed to succeed 50% of the time on average without any special bonuses or penalties, that +7 means I'm failing only on a natural 1-3. For something I'm merely good at from stats, or have invested some resources in, my bonus is more likely anywhere from +3 to +5. So my general success rate, for something I want to be good at but not totally specialized in, is going to be around 75%. For something I don't care about specifically but am somewhat good at, I'll be around 60-65%.
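To make the arithmetic above concrete, here's a minimal sketch that counts d20 outcomes. It assumes a baseline where an unmodified roll succeeds 50% of the time (i.e., an effective DC of 11) and that a natural 1 always fails; the function name and parameters are my own, not rules text.

```python
def success_chance(modifier, dc=11, autofail_on_1=True):
    """Fraction of d20 rolls that succeed, given a flat modifier and a DC.

    Assumes the post's baseline: DC 11 gives an unmodified roll a 50% chance,
    and a natural 1 is an automatic failure regardless of modifiers.
    """
    successes = 0
    for roll in range(1, 21):
        if autofail_on_1 and roll == 1:
            continue  # natural 1 always misses
        if roll + modifier >= dc:
            successes += 1
    return successes / 20

# +7 total (roughly +5 stats, +2 other bonuses) fails only on a natural 1-3:
print(success_chance(7))  # 0.85
# A +3 to +5 bonus lands in the 65-75% band described above:
print(success_chance(3), success_chance(5))  # 0.65 0.75
```

This matches the post's numbers: "fail only on a 2 or a 3" (plus the automatic 1) is an 85% success rate, and the +3 to +5 band spans roughly 65-75%.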
This is fairly noticeable if you look at 4th ed and 3rd ed stat allocations for point buys...
Main stats tend to have a +4 or +5 bonus after modifiers intrinsic to the character, before level increases or gear. Secondary stats tend to be around +1 or +2, with the occasional +3. Tertiary stats tend to be neutral or positive, but are left to whatever points remain, so they're usually 10-12 (+0 or +1), and dump stats are bare minimum.
However, given that D&D favors specialists, throughout 3rd ed. and into 4th ed., level-based gear and stat increases change the landscape of success over the 1-20 (or 1-30) level range: characters end up with lower chances of success at their non-specialties at high levels than they had at low levels. While 4th ed. may have reduced this effect compared to 3rd ed., the presence of player-chosen stat increases means that, without specific counterbalances, it does not go away. Thankfully, spreading out the stat gains does help close the difference between good and great, but it also spreads out the difference between average and good. In 3rd ed., the normal difference between a specialty and an "average" level of ability was a +8 or +9 (the two were often indistinguishable, assuming that at that level normal gear and buffs would give an "average" character a 50% chance of success, and that the general rule of automatic failure on a natural 1 applied).
Now, what should we consider "reasonable expectations of success" for a given task, assuming the 50% average success rate applies? In general, given my tastes, an 80% success rate is good for things a character is supposed to be good at. A 95% success rate is not inappropriate if the task is vital to the character concept (and not overpowering). A 60-65% chance of success is fine for areas where a character is above average but not particularly focused.
Once combat or contested rolls come into play, though, I'm thinking those numbers should shift down one grade against average enemies of similar level (where each grade is a 15-percentage-point step in success chance), stay roughly the same against mooks, and shift down 1 1/3 grades against superior enemies. That way, the characters most specialized for hitting connect about 80% of the time against a "boss"-type enemy, and are likely wasting some of their hit bonus when swinging at a single mook.
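The grade arithmetic above can be checked with a quick sketch. The `GRADE` constant and `shift` helper are my own names for the post's rule of thumb (one grade = 15 percentage points; superior enemies cost 1 1/3 grades); working in exact fractions of a percentage point avoids floating-point noise.

```python
from fractions import Fraction

GRADE = 15  # one "grade" = 15 percentage points of success chance

def shift(base_pct, grades_down):
    """Drop a success percentage by some number of grades, floored at 0."""
    return max(Fraction(0), Fraction(base_pct) - grades_down * Fraction(GRADE))

# Starting from a 95% specialist success rate:
print(shift(95, 1))                      # one grade down vs. an average enemy: 80
print(float(shift(95, Fraction(4, 3))))  # 1 1/3 grades down vs. a superior enemy: 75.0
```

So a 95% specialist lands at 80% against an average same-level enemy and around 75% against a superior one, which is in the neighborhood of the "about 80% on a boss" figure the post describes.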