Did anyone tell you what happiness is? That you had a right to feel it? I’m pretty sure the knowledge/awareness of “happy” and the expectation that I should or should be able to feel it are fucking me up.
Have people always expected to feel happy? What if feeling awful is the default? If I were okay with feeling awful, my life would be much better. Is this an American thing? I do NOT feel happy.
© Michelle Routhieaux 2012