I'm sorry, Senator Obama, health care is not a right in America.
It is, however, one of the key indicators of how a country values its citizens and, as such, should be a universally provided benefit.
1 comment:
I'd suggest that since "life" is one of the inalienable rights enumerated in the Declaration of Independence, and health care is certainly a factor in extending life, health care could be deemed a right.
That aside, I would like to think that, as a compassionate society, we should ensure access to health care simply because it is the right thing to do.
Spice