There's always at least a little mumble in the news about the cost of healthcare and the idea of nationalizing it so everyone can afford it or whatever (I don't really care enough to look into the details of what they're talking about).  A thought came to me while listening to some talk radio drivel one day -- the idea that many people feel entitled to be healthy, like it's their right as an American and as a human.

I would agree that it's good for certain basic medical services to be provided to everyone, even those who can't afford them -- things like vaccinations and physicals.  But I think that in general, we're ignoring the fact that health is a privilege, not a right, and that just as it's been since the beginning of the universe, the people with the money get the best stuff -- or in this case, the best medicine and healthcare.  At some point we need to make the decision not to give that poor person a liver transplant, even if it means they'll surely die.

This is a sensitive topic, and I'm speaking from a very limited, if not ignorant, perspective as a young, healthy, relatively rich (i.e. job-holding, debt-reducing) person.  I would love to hear other points of view on this.

#health