Is it mandatory to have travel insurance for USA?

Do I Need to Get US Travel Insurance? Technically, no. If you are traveling to the US for a short period, you are not required by law to have health insurance. However, healthcare costs in the US are very high, so while medical insurance for visitors to the US is not mandatory, it is highly advisable.
