When Did Being A Liberal Become A Bad Thing?

Throughout history, liberals have done many things for the good of this country. They freed the slaves, fought for civil rights, and protected the environment. Today, though, "liberal" has become a dirty word. Why is this? Come share your thoughts.
