Unfortunately, I can't find the Legends & Lore articles from the D&D Next playtest, but it would likely have been in one of those.
@Scribe, With the tip to check Legends & Lore, I did some digging, and while I have yet to find the quote in question, I did find something very interesting. The original article introducing the concept of bounded accuracy (and coining the term for it) explained it thusly:
The basic premise behind the bounded accuracy system is simple: we make no assumptions on the DM's side of the game that the player's attack and spell accuracy, or their defenses, increase as a result of gaining levels.
Furthermore, the article goes on to say very much the opposite of what I'm asserting (namely, that the primary ability bonus is expected to start at +3, increase to +4 at 4th level, and increase again to +5 at 8th level):
We also make the same assumptions about character ability modifiers and skill bonuses. Thus, our expected DCs do not scale automatically with level, and instead a DC is left to represent the fixed value of the difficulty of some task, not the difficulty of the task relative to level.
However, this should be read in the context of where 5e's development stood when the article was written. At that time, 5e's proficiency bonus mechanic hadn't been developed yet: training in a skill or tool gave a flat +3 bonus that never increased, and different classes' attack bonuses increased at different rates, as the article also alludes to:
Now, note that I said that we make no assumptions on the DM's side of the game about increased accuracy and defenses. This does not mean that the players do not gain bonuses to accuracy and defenses. It does mean, however, that we do not need to make sure that characters advance on a set schedule, and we can let each class advance at its own appropriate pace.
If bounded accuracy means the designers assume no increase in accuracy as level increases, and everyone's to-hit bonus increases at exactly the same rate, then target numbers must necessarily increase commensurately with those bonuses, or else accuracy would increase. And that is exactly what we see under the hood: as long as a player's primary ability modifier increases at 4th and 8th level, they maintain the same degree of accuracy against level-appropriate monsters (which, depending on their starting primary ability score, ranges between 45% and 65%).
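As a quick sanity check on that flat-math claim, here's a minimal sketch of d20 hit chance. The specific attack bonuses and AC values below are illustrative assumptions, not figures from the article; the point is that when bonus and AC climb in lockstep, accuracy stays flat.

```python
def hit_chance(attack_bonus, target_ac):
    # On a d20, you hit if roll + bonus >= AC; a natural 20 always
    # hits and a natural 1 always misses.
    successes = sum(
        1 for roll in range(1, 21)
        if roll == 20 or (roll != 1 and roll + attack_bonus >= target_ac)
    )
    return successes / 20

# Illustrative band: bonuses of +3 to +7 against an assumed AC 15
# span the 45%-65% range mentioned above.
print(hit_chance(3, 15))  # 0.45
print(hit_chance(7, 15))  # 0.65

# If both the attack bonus and the target AC rise by the same amount,
# accuracy is unchanged:
assert hit_chance(5, 13) == hit_chance(7, 15)
```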
This seems to contradict the initial intent of bounded accuracy, but my hypothesis is that it was done to make possible another change introduced in the same packet in which the proficiency bonus progression was unified: treating feats as equivalent to ability score increases.
A link to the full article on the Wayback Machine, should anyone care to read it:
web.archive.org