When Obama was elected president six months ago, we watched people at McCain’s headquarters react with devastating disappointment. From that point on we could say that America would never be the same, but the ultimate lingering question still remained – would racism end?
I feel like racism has decreased since Obama’s arrival in office. I mean, you have to respect a man who gets on his own race’s case about education. As Obama was growing up, he witnessed a lot of blacks using racism as an excuse for not being successful. Obama encouraged parents to get stricter about homework, make sure their children go to school every day, and enforce earlier bedtimes. We just don’t realize that our destiny is literally in our hands. So I feel like racism has gone down slightly, because some blacks are going to start taking education seriously. Therefore, whites won’t have a choice but to get along with us, because we took the same route as they did. Am I right?
By Markwan Wiley