Is medical insurance mandatory in Florida?
A few states have enacted their own individual health insurance mandates, but heading into open enrollment for 2022 health plans, Florida is not one of them.
