In a world in which principles and beliefs have tangible meaning, perhaps not. Moral issues do not always have straightforwardly utilitarian solutions, and often the surest way to corrupt morality is to offer one. An example: suppose you *know* that the only way to prevent the deaths of one million innocent people is to kill one innocent person. Now say it's 100 innocents (precision bombing). Now say it's 1,000 (carpet bombing). Now say it's one million minus one (a preemptive nuclear strike). Which is the morally correct solution?
Okay, super-contrived example, but it's late.

I don't doubt that there are situations in which compromising one's principles and beliefs is truly necessary, but that's a matter of wisdom, not alignment. This, of course, is where morality gets complicated. I don't know that anyone in the real world has figured out the answers to those questions, and that, IMHO, is the point.