
Health care is a basic human right


By Greg H. Jones and Hagop M. Kantarjian, M.D.

In this commentary, the authors examine the roots of the United States' reluctance to embrace universal health care, concluding that denying coverage "is neither sound nor ethical in a nation that promulgates fairness and equal opportunity."

Read "Health care in the U.S. should be a basic human right, not an entitlement," published Aug. 18, 2015, on the Baker Institute Blog.