Jessica,
Taunton, MA.
Being Black in America means that you constantly have to change and work on yourself in order to please the dominant group in society (Whites). It's about making them feel comfortable and at ease, even if that means rejecting your culture, traditions, ancestry, and basically everything that makes you who you are as a Black person. Society makes us feel that we should not be proud of our Blackness but instead work toward assimilating ourselves to be more White. Because, after all, that's what being an American is, right?