Is Health Care a Right?

President Joseph Biden declared on March 23 that health care is a right. “We have a duty not just to protect it, but to make it better and keep becoming a nation where health care is a right for all, not a privilege for a few,” he said in remarks at the James Cancer...