Health insurance mandate
A health insurance mandate is a requirement that employers or individuals obtain private health insurance, either instead of or in addition to a national health insurance plan.