What is an event that changed how you view the world?
Seeing racism brought back to life in 2008…
I'm going to guess you're American. Is it that the election of Obama to the presidency caused all the closeted racists to lose all sense? Or was it something else?