Is health insurance mandatory in US 2021?

By Anna Porretta, updated on January 21, 2022

As of 2019, the Obamacare individual mandate, which required you to have health insurance or pay a tax penalty, no longer applies at the federal level.
