We call white guys "white boys" but only ever really refer to black guys as men. Why do you think that is? I have my own theories of course. White guys are seen as less masculine, especially in comparison to black guys. Whether they are or not is a matter of debate. (I think they are!) But it's inarguable that the view is widespread. (cue someone taking the time to find a pic showing the opposite of this) What's your opinion on this?