The important thing to remember is that the company provides them with a livable wage, benefits, and a decent work environment. Not the government. And that is the problem with so many Democrats. Ask a Democrat "who creates jobs?" and the answer will invariably be "the government." With all their contempt for profits and their love of regulations, taxes, and public-sector jobs, America's business environment is becoming more and more hostile.
What kind of Democrats are you talking to? My friends & family who are liberals don't say that at all. In fact, it's patently ridiculous. Maybe you're just being facetious, but you're mischaracterizing and over-generalizing. Liberals, at least the ones I know, want government to curb the excesses of business -- keep businesses from deceiving or preying on consumers, and prevent the business sector from doing reckless things like crashing the economy and taking everyone's retirement accounts & investments with it. Nobody with half a brain wants to keep businesses from making money, but they do want businesses to make money in an ethical fashion. The liberals I know want to actually get something of value from regulations and taxes, not simply tax and regulate because we think people can't tie their own shoelaces. At least a few of us think that government can, in theory, still be a positive influence. Nobody I know expects the government to owe them a living.
I know conservatives/Republicans must get sick of being characterized as a bunch of gun-toting, Bible-thumping, Fox News-watching, abortion-clinic-bombing hillbilly yokels. And liberals/Democrats get tired of being characterized as lazy, pot-smoking, college-dropout, business-hating, tax-and-spend, government-loving, welfare-dependent Communist Socialist hippies. Sure, there are people who fall into one or more of these stereotypes, but how about you do everyone a favor and stop repeating the stereotypes as if they're the only reality that exists.