Is health insurance mandatory in WI?

Wisconsin Healthcare Insurance: What you need to know

There is no state law requiring employers to offer group healthcare insurance to their employees, but most employers do offer this benefit.
