Not really. Since damage is more likely to be reduced with save proficiency it seems to even out. I haven't done extensive math on it, but anecdotally that seems to be the case.
Did I hear you say "could you please do math on it"? I think I did.
Suppose you are attacking a save with an X% chance to fail (aka hit). With save-for-half and D damage if they fail, you do
D*X% + (D/2)*(1 - X%) = D*(1/2 + X%/2) = D*(0.5 + 0.5*X%)
Monsters have H hp, so on average your spell removes (D/H)*(0.5 + 0.5*X%) of the monster's HP.
Now suppose you have a save-or-suck so debilitating the target might as well be dead. (Lesser save-or-sucks can be viewed as fractions of that.) It has an X% chance of landing, and when it lands it nullifies all H HP of a foe, so its effectiveness is H/H * X% = X%.
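For anyone who wants to poke at the numbers themselves, the two measures above translate into a couple of throwaway Python functions (the names are mine, not from any ruleset):

```python
# D = damage on a failed save, H = monster HP,
# x = chance the target fails its save (as a fraction, e.g. 0.5 for 50%)

def damage_effectiveness(D, H, x):
    """Fraction of the monster's HP a save-for-half spell removes on average."""
    expected_damage = D * x + (D / 2) * (1 - x)  # full damage on fail, half on success
    return expected_damage / H                   # = (D/H) * (0.5 + 0.5*x)

def suck_effectiveness(x):
    """A full save-or-suck nullifies all H HP with probability x, so H/H * x = x."""
    return x
```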
Now apply the proposed change: halve H and lower X% by 0.2 (20 percentage points).
Before damage: (D/H)*(0.5 + 0.5*X%)
Before suck: X%
After damage: (D/(H/2))*(0.5 + 0.5*(X% - 0.2)) = (2D/H)*(0.4 + 0.5*X%) = (D/H)*(0.8 + X%)
After suck: X% - 0.2
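The before/after lines can be checked mechanically. A small Python sketch (helper names my own) that applies the change, halving H and knocking 0.2 off the fail chance:

```python
def damage_effectiveness(D, H, x):
    # Expected fraction of H removed by a save-for-half spell:
    # full damage D with probability x, half damage with probability 1 - x.
    return (D * x + (D / 2) * (1 - x)) / H  # = (D/H) * (0.5 + 0.5*x)

def compare(D, H, x):
    """Effectiveness before and after halving H and lowering x by 0.2."""
    before = damage_effectiveness(D, H, x)
    after = damage_effectiveness(D, H / 2, x - 0.2)  # = (D/H) * (0.8 + x)
    return before, after
```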
Save-for-half spells are net improved by this change. A fireball can take out more "total CR" in foes after your change, on average, than it could before, in every reasonable case.
Meanwhile, save-or-suck got strictly worse.
For fixed numbers, assume a 50% chance to fail.
Before, 50% suck, and 0.75 D/H damage.
After, 30% suck, and 1.3 D/H damage.
Your fireball gets about 73% more effective at killing stuff, while your hypnotic pattern gets 40% worse at shutting them down.
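And plugging the 50% case into Python to double-check the arithmetic (D and H cancel out of the ratios, so set D = H = 1):

```python
# At a 50% fail chance, before the change; 30% after.
before_damage = 0.5 + 0.5 * 0.5        # 0.75 D/H
after_damage  = 2 * (0.5 + 0.5 * 0.3)  # 1.3 D/H (half HP, 30% fail chance)
before_suck, after_suck = 0.5, 0.3

print(after_damage / before_damage - 1)  # ~ +0.733: fireball roughly 73% better
print(after_suck / before_suck - 1)      # -0.4: hypnotic pattern 40% worse
```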