Can I travel to USA without insurance?

Travel insurance is not compulsory for visiting the USA; you can travel without it if you choose. However, doing so means risking a potentially life-changing medical bill if something goes wrong during your trip. (Nov 8, 2021)
