Just because you played that way does not mean that's how everyone did. Trying to base your estimate of magic saturation in 1E D&D on the amount of treasure in the TSR modules is laughable.
Then enlighten me: how would you estimate magic saturation in 1e? I would think that OFFICIAL TSR modules are a far better yardstick than Joe's homebrew. Note that these modules were put forward as standards of play. They were tournament adventures and certainly saw far more play in far more groups than anyone's house rules. By any objective standard, modules are likely the only way to estimate magic saturation.
Never mind that most creatures with a treasure type had at least a 10% chance of carrying magic items. Given the XP tables of the time, you needed to fight far more encounters to gain a level than in 3e. Simple math shows that magic should have been very common.
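To put rough numbers on that "simple math," here's a back-of-the-envelope sketch in Python. The 10% figure is the one from the treasure-type tables mentioned above; the encounter count per level is my own rough assumption, not an official number:

    P_MAGIC = 0.10              # minimum chance a treasure-bearing creature has a magic item
    ENCOUNTERS_PER_LEVEL = 40   # assumed treasure-bearing encounters needed to gain a level

    expected_items = ENCOUNTERS_PER_LEVEL * P_MAGIC
    print(f"Expected magic items per level gained: {expected_items:.1f}")
    # Even at a conservative 40 encounters per level, that's ~4 magic
    # items per level, i.e. dozens by name level.

However you tune the assumed encounter count, the expected item count scales linearly with it, so a party grinding through 1e's XP tables should be swimming in magic by mid levels.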
If magic wasn't common in a given group, that means the DM was artificially reducing treasure. But this isn't a discussion about someone's homebrew; it's a discussion about the rules as they actually exist. Or at least that's what I thought.
Anyway, back to the point for a sec. Try this as an experiment:
Create a party of four 9th-level 3.5 characters using 25-point buy. Make four copies of that party and equip them at 150% of standard wealth by level, 100%, 75%, and 50%. Now run four EL 9 encounters against each group. Obviously, reset each group after each encounter so that it starts fresh with its full wealth.
Keep track of the number of rounds each fight lasts and the amount of damage each party suffers. Compare.
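If you want the bookkeeping to stay honest across the four groups, here's a minimal Python harness for tallying the results. Everything in it is hypothetical and mine, not from any rulebook; the run_encounter() stub is a deliberately crude placeholder so the script executes, and you'd replace it with the rounds and damage you actually record at the table:

    import random
    import statistics

    WEALTH_TIERS = [1.50, 1.00, 0.75, 0.50]   # fraction of standard wealth by level
    ENCOUNTERS_PER_TIER = 4                   # EL 9 fights per group

    def run_encounter(wealth_fraction, rng):
        """Placeholder model: richer parties end fights a bit faster.
        Swap in (rounds, damage_taken) recorded from actual play."""
        rounds = max(1, round(rng.gauss(6 / wealth_fraction ** 0.5, 1)))
        damage = max(0, round(rng.gauss(10 * rounds, 8)))
        return rounds, damage

    rng = random.Random(0)
    for tier in WEALTH_TIERS:
        results = [run_encounter(tier, rng) for _ in range(ENCOUNTERS_PER_TIER)]
        avg_rounds = statistics.mean(r for r, _ in results)
        avg_damage = statistics.mean(d for _, d in results)
        print(f"{tier:.0%} wealth: {avg_rounds:.1f} rounds avg, {avg_damage:.0f} damage avg")

The point of averaging over four fights per tier is just to smooth out dice luck; what matters is how small the gap between the 150% and 50% columns turns out to be.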
Come back when you've done this and you'll see that the "edge of a dime" balance in 3.5 is a myth. You can vary wealth by level across a WIDE range without the game breaking down.
In other words, if you simply want a lowER magic level in the game, Core supports that quite nicely.