Basically, the reason can be summed up in this graph. This is a graph of the estimated* PC power curve by edition:
See the way the 3E curve soars to insane values in the latter half? For casters, that's the result of getting boatloads of high-level spells. (3E spells also scale by caster level, of course, but most lower-level spells are capped, meaning once you're 10th level or so, those spells stop getting better. Save DCs do continue to rise as your casting stat increases, but high-level spells are the main culprit here.)
4E averted this by throwing out the entire system and starting over. 5E is working within the general framework of the 3E rules, so it's much easier to compare the two and see how things have changed. To bring casters down to earth, 5E had to do one of two things: Make high-level spells weaker, or give out fewer of them. They played around with this some during the playtest, but in the end they seem to have settled on "give out fewer of them" almost entirely. This makes sense to me. It's more fun to have a handful of super-awesome spells than a larger number of spells that are just mild improvements on what you had before.
[SIZE=-2]*Using monster XP values as a rough proxy for PC power. In other words, if the edition's encounter guidelines say you can fight monsters worth twice as much XP, you're estimated to be twice as powerful. Yeah, I know 3E's CR system was shaky at best, but it's what we got. The chart is "1st Level" to "Max Level" rather than "1st Level" to "20th Level," because 4E worked on a 30-level scale rather than a 20-level one.[/SIZE]
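For the curious, the footnote's proxy method boils down to one normalization step: divide each level's encounter XP budget by the 1st-level budget, and read the quotient as "times more powerful than a 1st-level party." Here's a minimal sketch of that arithmetic; the sample budget numbers are made up for illustration, not taken from any edition's actual tables.

```python
# Sketch of the XP-as-power-proxy estimate described in the footnote.
# NOTE: the sample budgets below are hypothetical placeholder values,
# not real numbers from any edition's encounter guidelines.

def power_curve(xp_budgets):
    """Normalize per-level encounter XP budgets so 1st level = 1.0.

    If the guidelines say a level-N party can fight monsters worth
    twice as much XP as a level-1 party, level N scores 2.0.
    """
    baseline = xp_budgets[0]
    return [round(budget / baseline, 1) for budget in xp_budgets]

# Hypothetical budgets at levels 1, 5, 10, 15, 20:
sample_budgets = [100, 500, 2000, 10000, 50000]
print(power_curve(sample_budgets))  # prints [1.0, 5.0, 20.0, 100.0, 500.0]
```

Plot each edition's normalized list on the same axes and you get the chart above: the steeper the tail, the faster that edition's PCs outgrow their starting selves.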