1 Answer

A health insurance mandate is a requirement, placed on either employers or individuals, to obtain private health insurance rather than relying on a national health insurance plan. An employer mandate requires employers to offer coverage to their workers, while an individual mandate requires people to carry coverage themselves.

In short: a requirement that people have health insurance.
