Monday, March 23, 2015

The Dirty Word that is 'Progress'

I've lived in the United States of America all my life. I've never hated being from here, never felt embarrassed by this country. But honestly? That's starting to change.

It's 2015, and it seems that the term 'progress' has become such a dirty word. You see states proposing legislation to legalize discrimination. To legalize hate crimes.

And the majority of people are all for it.

A country that used to be a melting pot of ideas, customs, and people is now a country that welcomes hate, discrimination, and ignorance. We're OK with defunding schools and nutrition programs. It's OK if poor children starve, because they're poor.

It's OK to have men with no medical education voting on women's health issues, because it won't affect them, since they have a penis.

If you're gay, or even think about anything remotely gay, you're disgusting and deserve a bullet to the head.

It's great that you've chosen to join the military - to be a proud patriot for your country - but fuck you once you come home broken and no longer serve a purpose.

And the list goes on.

How can people be alright with this? How can it not weigh heavily on their conscience at night? How can another person's lifestyle, one that doesn't affect your own in any way, be the center of your attention?

We really need to start focusing on real issues here. Human rights are being trampled. The poor are getting poorer. For such a "great" country, proper healthcare is a luxury. Veterans are yesterday's trash. Where does it end?

It's like civil rights and true feminism have been perverted over time by people with ulterior motives. Everything that was once a golden word of progress now seems tainted beyond repair.

This country used to lead the world in equality, rights, and progress. Now we're just regressing, and 'regress' seems to be the new golden word.