
Tag Archives: American rights

When Did Everything Become A Right In This Country?

I was watching TV the other day when a story came up about Obamacare, and on the screen was a liberal saying, "At least health care is now a right." As I sat there, I said to myself, just what we need, another right in this country. I mean, really, it seems everything is a right these days. This society is ...
