Are US citizens required to have health insurance?

As of Jan. 1, 2019, health insurance coverage is no longer mandatory at the federal level: the Affordable Care Act's individual mandate penalty was reduced to zero. However, some states (for example, California, Massachusetts, and New Jersey) still require residents to carry coverage and impose a state tax penalty on those who go without it.
