Ok, I'm curious as to where this health care responsibility actually comes from.
Now, as a business manager, I have a question for workers and management.
When does it become My responsibility to cover You medically?
Now, think about a few things here. You get Paid, and can use your money how you wish. Buy car insurance, health insurance, a big screen TV, I don't care, as long as you perform the work you are paid for. One might argue "It isn't enough." Ok, great, get a job somewhere else. But when, exactly, does it become my responsibility to pay for Your medical? Am I your Father? Mother? Grandparents? Do I have a Say in how you ACT off of work hours? Do I get to tell you Not to smoke? Not to Have unprotected sex? Not to Drink alcohol? Not to eat at McDonalds? No, I don't, but the States keep coming up with reasons why companies Owe you some form of coverage.
As someone who has Built a business from the ground up, and has sacrificed My own income for years and years to finally become successful, Please explain to me why I "owe" anyone Anything at all other than Pay for their Work.
How on Earth has people's thinking become so convoluted that they believe, in any way, shape or form, that they are Owed anything other than Pay for their work? I, for the life of me, am at a loss here. Granted, I see the Value of offering it to recruit good solid employees; that is a very good benefit. But tell me, prove to me somehow, that this is something Owed simply by virtue of employment.