Please forgive me for asking a seemingly ignorant question, but I have to get the opinion of the majority. Is it true that black men get jobs, buy nice cars, and do such things for attention and acceptance from women??? If that is really the case, then I need to change my perspective on things a little bit. And if this is truly the case, I can understand why black men sometimes point the finger at women for being in an undesirable position in the community. Please give a young sista some enlightenment!