Although usually well-meaning, health insurance mandates ultimately harm consumers: they make coverage more expensive and force individuals to buy benefits they would not otherwise choose.