I've always wondered about this. I've met white people who seemed nice and respectable, but I always wonder if they talk negatively about African Americans when they're around people of their own race. For instance, your boss or co-workers. Do they as a whole have that racist bone in their bodies, with some just hiding it better than others? What do you think?