Black Women Don’t Owe Anyone Anything

Black women, especially, don’t owe America anything.

Rosalyn Morris
Dec 5, 2024



The recent election has completely changed how many Americans view this country. Unless, by some miracle, Trump's presidency turns out to be less disastrous than many of us anticipate, America as we once knew it is gone. The rose-colored glasses are off. The belief that this country will one day live up to what it has claimed to stand for (liberty, justice, and equality for all) is dead.

As such, many people have decided to change their relationship with America. For many Black women, this means detachment. I can only speak for myself, so I will make this a personal essay.

Many of us claim that we are American, and that is okay, but this country does not claim us back. It never has. It never will. I was born here, and maybe I will die here, but this country has never felt like home. I don’t love the dirt here, the very soil. If I were to spend my last days somewhere else, I don’t think I would have a longing in my soul to see this place once again before I took my last breath. Not to smell sweet magnolias in the air or drive by acres and acres of trees (which would probably be gone). I don’t have a connection to the land like that. That’s why I’ve always wanted to visit Africa, but now, I don’t think I would feel that sort of…
