The Lie: Healthcare Is a Human Right

In America, the Left tells us that healthcare is a human right, then trots out some failing European system to bolster the point.

As with most things the Left says, this is a disingenuous lie. Healthcare is a privilege; it always has been, and it always will be.

Ever wonder what primitive man did for healthcare?


Here’s a hint: he died! And it’s still that way in much of the world. The fact is that healthcare is “earned” as a society becomes more civilized.

The idea that somebody can distribute healthcare is as ridiculous as saying somebody can distribute humor.

Here is an infographic on healthcare by the team at Work the World. It’s eye-opening, to say the least!

[Infographic by Work the World]
