Is healthcare a right in the USA?

Universal access to health care, without discrimination, is a human right enshrined in the Universal Declaration of Human Rights and the International Covenant on Economic, Social and Cultural Rights.
